1991 Broadway Street, Suite 200, Redwood City, California 94063
The Tech We Want is a four-year, $8 million portfolio of work focused on connecting and empowering a new wave of leaders, companies, and technologies that are built on inclusivity, mutualism, sustainability, accountability, and responsible innovation.
United Kingdom of Great Britain and Northern Ireland
"In only a decade, the labour market has changed beyond all recognition - from zero-hour contracts to platform monopolies - meanwhile union memberships are at historic lows."
Policy decisions on and resource allocations for IEM tend to be made with inadequate data. The focus of the IDS programme is to coordinate a transboundary effort to develop indigenous data with indigenous stewardship.
The Civics of Technology Project shares research and curriculum and offers professional development that encourages teachers and students to critically inquire into the effects of technology on our individual and collective lives.
What the Future Wants is an interactive, youth-focused exhibition that presents different perspectives on technology, from the personal to the political to the planetary.
Prior to the 2019 elections, the Election Commission of India convened representatives from top social media companies for a two-day brainstorming session on approaches to problematic social media content in elections.
The Santa Clara Principles on Transparency and Accountability in Content Moderation cover various aspects of content moderation; they were developed by legal scholars and technologists based mostly in the United States and target social media companies with large user bases.
Mnemonic is an NGO dedicated to archiving, investigating and memorializing digital information documenting human rights violations and international crimes.
The Facebook Digital Literacy Library is hosted by Facebook and currently includes learning resources made available by Youth and Media at the Berkman Klein Center for Internet & Society at Harvard University under a Creative Commons Attribution-ShareAlike 4.0 International license.
"Pledge signed by political parties and technology/social network companies committing to avoiding fake news and mechanisms of disinformation that may affect upcoming elections." - Countering Disinfo
Institute for the Future, Hamilton Avenue, Palo Alto, CA, USA
Most tech is designed with the best intentions. But once a product is released and reaches scale, all bets are off. The Risk Mitigation Manual presents eight risk zones where we believe hard-to-anticipate and unwelcome consequences are most likely to emerge.
Judgment Call is an award-winning game and team-based activity that puts Microsoft’s AI principles of fairness, privacy and security, reliability and safety, transparency, inclusion, and accountability into action.
We explore societal perspectives surrounding the development and application of digital technology, focusing on ethics, policy, politics, and quality of life.
2017
Resident Fellows seek to advance society's understanding of surveillance capitalism and to shape how Reset works, embedding within the organization to produce creative research and technology outputs.
In this report, we dive into the history of public investment in technologies at the foundation of Big Tech, and the imbalances between these investments and the returns to the public sector.
Coded Bias follows M.I.T. Media Lab computer scientist Joy Buolamwini, along with data scientists, mathematicians, and watchdog groups from all over the world, as they fight to expose the discrimination within algorithms now prevalent across all spheres of daily life.
Report a dark pattern today. It will help us fight back against companies using manipulative dark patterns to take our private information, money, and time. You deserve respect, online and off.
Hosted by Caterina Fake, Should This Exist? is a show that takes a single technology and asks: What is its greatest potential? And what could possibly go wrong?
The New School, East 13th Street, New York, NY, USA
The Digital Equity Laboratory uncovers and addresses structural inequities that persist and evolve as technology transforms our cultural, social, and political systems.
The unregulated attention economy driving social media is fraying our democracy, threatening our mental and physical health, and exposing our children to violent and disturbing content.
Everyone is talking about AI, but how and where is it actually being used? We've mapped out interesting examples where AI has been harmful and where it's been helpful.
The Myanmar Tech Accountability Network (MTAN) is a network of Myanmar civil society organizations coordinating efforts to mitigate the risk of social media-induced violence and political instability in Myanmar.
United Kingdom of Great Britain and Northern Ireland
UnBias aims to provide policy recommendations, ethical guidelines, and a ‘fairness toolkit’ co-produced with young people and other stakeholders. The toolkit will include educational materials and resources that support young people's understanding of online environments and raise awareness among online providers about the concerns and rights of young internet users.
A global forum to discuss and debate digital transformation within the humanitarian sector, with a focus on data protection laws and humanitarian protection, policy, ethics and action.
Humanity Hub, Fluwelen Burgwal, The Hague, Netherlands
How can humanitarian organizations, states, civil society, academia and the private sector join forces to maximize the benefits of technology and humanitarian data while minimizing the risks of doing harm?
This project attempts to document all collective action from workers in the tech industry. Contribute to our archive. Currently, there are 506 collective actions documented.
The Technology and Public Purpose (TAPP) Fellowship provides a unique opportunity for practitioners in responsible technology development to explore multidisciplinary approaches to maximizing the societal benefits of emerging technologies while minimizing the harms.
This class uses the idea of values-driven design to help creators consider the politics of the technologies we use and bring into the world, and teaches methods of research, design, and deployment intended to help technologies meet the needs of real-world communities.
Access Now defends and extends the digital rights of people and communities at risk. By combining direct technical support, strategic advocacy, grassroots grantmaking, and convenings such as RightsCon, we fight for human rights in the digital age.
Stanford University, Serra Mall, Stanford, CA, USA
The PIT Lab is building a thoughtful community around public interest technology at Stanford. Themes include systemic inequities, democratic values, bridging the divide, and career pathways.
Mechanism Design for Social Good (MD4SG) is a multi-institutional initiative using techniques from algorithms, optimization, and mechanism design, along with insights from other disciplines, to improve access to opportunity for historically underserved and disadvantaged communities.