While the first generations of tech-for-good work took a solutionist approach to addressing existing problems with new technology, scholars and activists are driving growing awareness of the problems with technology itself. By exposing the negative consequences, intended or otherwise, of tech, these communities draw attention to issues with tech-centric approaches. Not all of the projects here adopt an ethics lens in their work, but we use it here for simplicity's sake.
Suggested reading:
The Santa Clara Principles on Transparency and Accountability in Content Moderation, developed mostly by United States-based legal scholars and technologists, cover various aspects of content moderation and target social media companies with large user bases.
Take Back The Tech! is a call to everyone, especially women and girls, to take control of technology to end violence against women.
The Facebook Digital Literacy Library is hosted by Facebook and currently includes learning resources made available by Youth and Media at the Berkman Klein Center for Internet & Society at Harvard University under a Creative Commons Attribution-ShareAlike 4.0 International license.
"Pledge signed by political parties and technology/social network companies committing to avoiding fake news and mechanisms of disinformation that may affect upcoming elections." - Countering Disinfo
Mnemonic is an NGO dedicated to archiving, investigating and memorializing digital information documenting human rights violations and international crimes.
Most tech is designed with the best intentions. But once a product is released and reaches scale, all bets are off. The Risk Mitigation Manual presents eight risk zones where we believe hard-to-anticipate and unwelcome consequences are most likely to emerge.
UNI Global Union's Top Ten Principles for Workers' Data Rights fills an enormous gap with regard to workers' rights in the new world of work.
Judgment Call is an award-winning game and team-based activity that puts Microsoft’s AI principles of fairness, privacy and security, reliability and safety, transparency, inclusion, and accountability into action.
We explore societal perspectives surrounding the development and application of digital technology, focusing on ethics, policy, politics, and quality of life. (2017)
The Digital Lab Fellowship is a paid, non-resident opportunity to uncover and address emerging consumer harms in the digital world. The Fellowship may be of interest to engineers, computer scientists, information security professionals, independent researchers, academics, social scientists, and others.
S.T.O.P. at NYU fights to end discriminatory surveillance.
Coded Bias follows M.I.T. Media Lab computer scientist Joy Buolamwini, along with data scientists, mathematicians, and watchdog groups from all over the world, as they fight to expose the discrimination within algorithms now prevalent across all spheres of daily life.
Report a dark pattern today. It will help us fight back against companies using manipulative dark patterns to take our private information, money, and time. You deserve respect, online and off.
The unregulated attention economy driving social media is fraying our democracy, threatening our mental and physical health, and exposing our children to violent and disturbing content. Our limited attention has become the most valuable resource on the internet, and it is captured and manipulated via the rampant and unregulated collection of our personal data. This is surveillance capitalism at work: a relentless assault on our private data that yields intimate insights, which are sold to the highest bidder with next to no awareness or control on our part. We are told that having our data taken is the price we pay for the "free" use of digital services. But this system places corporate priorities ahead of the social good while it manipulates our social perspective, drives division, and isolates us from each other. There are few practical ways to opt out of an attention economy that depends on pervasive surveillance. And even if you manage to opt out on an individual level, the real-world impact of this data-driven social manipulation is impossible to avoid. Big Tech controls a global audience of billions with a market power unprecedented in the history of media, yet it has almost no oversight and rejects liability for the harms its products cause.
Everyone is talking about AI, but how and where is it actually being used? We've mapped out interesting examples where AI has been harmful and where it's been helpful.
We want to change the way the internet enables the spread of news and information so it serves the public good over corporate and political interests.
GovLab NYU's Collection of Lectures on the Ethical implications of Data and Artificial Intelligence from Different Perspectives
Access Now defends and extends the digital rights of people and communities at risk. By combining direct technical support, strategic advocacy, grassroots grantmaking, and convenings such as RightsCon, we fight for human rights in the digital age.
A feminist review of AI, privacy and data protection to enhance digital rights
The PIT Lab is building a thoughtful community around public interest technology at Stanford. Themes include systemic inequities, democratic values, bridging the divide, career pathways.
DEDA helps data analysts, project managers and policy makers to recognize ethical issues in data projects, data management and data policies.
Mechanism Design for Social Good (MD4SG) is a multi-institutional initiative using techniques from algorithms, optimization, and mechanism design, along with insights from other disciplines, to improve access to opportunity for historically underserved and disadvantaged communities.
One app to take back control of your data
A guide to the news company's data usage
Feminist.AI works to put technology into the hands of makers, researchers, thinkers and learners to amplify unheard voices and create more accessible AI for all.
A project by AlgorithmWatch that maps frameworks that seek to set out principles of how systems for automated decision-making (ADM) can be developed and implemented ethically
Through public reports and briefings with policymakers, business leaders, and civil society organizations, the Initiative will highlight digital platform reform options, as well as offer expertise on legislative reform proposals and platform self-governance options.
Mozilla Foundation's annual naughty and nice list of consumer tech's privacy policies
The Clinic to End Tech Abuse (CETA) helps survivors of intimate partner violence end tech-related abuse.
The Distributed AI Research Institute is a space for independent, community-rooted AI research free from Big Tech’s pervasive influence.
A set of guidelines that help development practitioners integrate established best practices into technology-enabled programmes.
Consumer tools for online safety & anti-harassment. Automatically hide unwanted tweets & mute the trolls (review later, but only if you want!).
The Social Science Research Council's Just Tech Fellowship supports and mobilizes diverse and cross-sector cohorts of researchers and practitioners to imagine and create more just, equitable, and representative technological futures.
The Privacy Principles for Mobility Data are a set of values and priorities intended to guide the mobility ecosystem in the responsible use of data and the protection of individual privacy.
The Journal of Online Trust and Safety is a cross-disciplinary, open access, fast peer-review journal that publishes research on how consumer internet services are abused to cause harm and how to prevent those harms.
The Tech Worker Handbook is a collection of resources for tech workers who are looking to make more informed decisions about whether to speak out on issues that are in the public interest. Aiming to improve working conditions, direct attention to consumer harms, or otherwise address wrongdoing and abuse should not be a solo or poorly resourced endeavor.
Covering data for social good, data governance, and societal-level data considerations
The Co-op Foundation helps disadvantaged communities overcome their challenges by putting co-operative values into practice. It aims to stimulate and strengthen community action that connects and empowers people so they can work together to make things better.
Researching and advocating for critical digital education and youth participation from Latin America and the Majority World
Based in marginalized neighborhoods in Charlotte, North Carolina; Detroit, Michigan; and Los Angeles, California, we examine digital data collection and its implications for human rights. We work with local communities, community organizations, and social support networks, and show how different data systems impact re-entry, fair housing, public assistance, and community development.
The free transparency certification