While the first generations of tech-for-good work took a solutionist approach to addressing existing problems with new technology, scholars and activists are driving growing awareness of the problems with technology itself. By exposing the negative consequences of tech, intended or otherwise, these communities draw attention to the issues with tech-centric approaches. Not all of the projects listed here adopt an ethics lens in their work, but we use the term for simplicity's sake.
Suggested reading: Technology Ethics in Action: Critical and Interdisciplinary Perspectives, edited by Ben Green.
We set ourselves the task of creating a short, simple, and practical resource to help teams designing projects identify potential risks and harms.
Coda Story reports on major currents shaping our world, from disinformation to authoritarian technologies to the war on science. Coda stays on these stories to reveal why they matter, how they are connected, and where they are heading next.
The Tech We Want is a four-year, $8 million portfolio of work focused on connecting and empowering a new wave of leaders, companies, and technologies built on inclusivity, mutualism, sustainability, accountability, and responsible innovation.
An interactive scorecard where people can see whether they would be flagged for social welfare fraud under the Netherlands' broken system.
Policy decisions on and resource allocations for IEM tend to be made with inadequate data. The focus of the IDS programme is to coordinate a transboundary effort to develop Indigenous data with Indigenous stewardship.
A library of design interventions to encourage prosocial behaviors online.
The Civics of Technology Project shares research and curriculum, and offers professional development that encourages teachers and students to critically inquire into the effects of technology on our individual and collective lives.
What the Future Wants is an interactive, youth-focused exhibition that presents different perspectives on technology, from the personal to the political to the planetary.
The Santa Clara Principles on Transparency and Accountability in Content Moderation, developed by legal scholars and technologists based mostly in the United States, cover various aspects of content moderation and target social media companies with large user bases.