The CDEI is a UK government expert body enabling the trustworthy use of data and AI.
The European Centre for Algorithmic Transparency (ECAT) will contribute to a safer, more predictable and trusted online environment for people and businesses.
What is a more ambitious vision for data use and regulation that can deliver a positive shift in the digital ecosystem towards people and society?
Superbloom leverages design as a transformative practice to shift power in the tech ecosystem, because everyone deserves technology they can trust.
While each organization is responsible for its own data, humanitarians under the Inter-Agency Standing Committee (IASC) – which brings together United Nations (UN) entities, Non-Governmental Organization (NGO) consortia and the International Red Cross and Red Crescent Movement – need common normative, system-wide guidance to inform individual and collective action and to uphold a high standard for data responsibility in different operating environments.
The Data Responsibility Working Group (DRWG) is a global coordination body working to advance data responsibility across the humanitarian system.
From biometrics to surveillance — when people in power abuse technology, the rest of us suffer. Written by Ellery Biddle.
Short for “human experience,” HX is an approach to talking about, engaging with, and designing technology in a way that is aligned with our needs as humans — not users.
All Tech Is Human has developed this free program to help build the Responsible Tech pipeline by facilitating connections and career development among talented students, career changers, and practitioners.
The All Tech Is Human Library Podcast is a special 16-part series of rapid-fire, intimate conversations with academics, AI ethicists, activists, entrepreneurs, public interest technologists, and integrity workers, who help us answer: How do we build a responsible tech future?
ParityBOT is a Twitter bot that spins the abuse and toxicity directed at women in politics into positive, uplifting and encouraging messages. The artificial intelligence technology that powers ParityBOT detects and classifies hateful, harmful and toxic tweets directed at women in leadership or public office. For every toxic tweet that passes a certain threshold, the bot posts a “positivitweet.” ParityBOT was deployed in Canada during the 2019 federal election and the 2019 Alberta election. During this time, more than 245,000 tweets were processed, 393 candidates were tracked, and more than 20,000 positive tweets were sent.
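The classify-and-threshold pattern ParityBOT describes can be sketched in a few lines. This is a hypothetical illustration, not the project's code: the threshold value, the `score_toxicity` heuristic, and the sample positivitweets are all assumptions standing in for the real ML classifier and curated message pool.

```python
import random

# Assumed cutoff for posting a response; the project's real threshold
# and classifier are not specified in this description.
TOXICITY_THRESHOLD = 0.8

# Illustrative stand-ins for the bot's pool of uplifting messages.
POSITIVITWEETS = [
    "Women in politics make our democracy stronger.",
    "Thank you to every candidate who steps up to serve.",
]

def score_toxicity(tweet: str) -> float:
    """Placeholder for an ML toxicity classifier.

    A trivial keyword heuristic is used here purely so the sketch runs;
    a real bot would call a trained model or a hosted scoring service.
    """
    toxic_words = {"hate", "stupid", "quit"}
    return 1.0 if set(tweet.lower().split()) & toxic_words else 0.0

def respond(tweet: str) -> "str | None":
    """Return a positivitweet if the tweet crosses the toxicity threshold."""
    if score_toxicity(tweet) >= TOXICITY_THRESHOLD:
        return random.choice(POSITIVITWEETS)
    return None
```

The key design point the description implies is the threshold gate: the bot does not reply to the toxic tweet or its author, it simply counts the detection and emits an unrelated positive message, decoupling the response from the abuse.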
Knowing without Seeing is a research project by Amber Sinha which explores meaningful transparency solutions for opaque algorithms, and privileges comprehension over mere access to information.
The mission of the Coalition for Independent Tech Research is to advance, defend, and sustain the right to ethically study the impact of technology on society.
Berkman Klein Center research effort: a global cohort of early-career scholars and practitioners explores the ethical and human rights considerations of digital identity in times of crisis.
The Digital Freedom Fund and its partner European Digital Rights (EDRi) are in the second phase of an initiative that emerged to decolonise the digital rights field.
We set ourselves the task of creating a short, simple, and practical resource to help teams designing projects identify potential risks and harms.
Leveraging digital data in ways that advance your mission and respect the rights of the people you serve is a core capacity of foundations and nonprofits in the 21st century. That’s why the Digital Civil Society Lab at the Stanford Center on Philanthropy and Civil Society created the Digital Impact Toolkit—to support civil society organizations in using digital data ethically, safely, and effectively.
Coda Story reports on major currents shaping our world from disinformation to authoritarian technologies to the war on science. Coda stays on these stories to reveal why they matter, how they are connected and where they are heading next.
The Tech We Want is a four-year, $8 million portfolio of work focused on connecting and empowering a new wave of leaders, companies, and technologies that are built on inclusivity, mutualism, sustainability, accountability, and responsible innovation.
Community-led guides for data science and open research
An interactive scorecard where people can see if they would be flagged for social welfare fraud under the Netherlands' broken system
Policy decisions on and resource allocations for IEM tend to be made with inadequate data. The focus of the IDS programme is to coordinate a transboundary effort to develop Indigenous data with Indigenous stewardship.
A library of design interventions to encourage prosocial behaviors online
The Civics of Technology Project shares research and curriculum and offers professional development that encourages teachers and students to critically inquire into the effects of technology on our individual and collective lives.
What the Future Wants is an interactive, youth-focused exhibition that presents different perspectives on technology, from the personal to the political to the planetary.
Faced with choices between privacy and safety on the Internet, and between freely expressing themselves and the ethical use of information, media and technology, women, men, girls and boys need new types of competencies.
The Santa Clara Principles on Transparency and Accountability in Content Moderation, developed by legal scholars and technologists based mostly in the United States, cover various aspects of content moderation and target social media companies with large user bases.
Take Back The Tech! is a call to everyone, especially women and girls, to take control of technology to end violence against women.
Mnemonic is an NGO dedicated to archiving, investigating and memorializing digital information documenting human rights violations and international crimes.
The Trust Project develops transparency standards that help people assess the quality and credibility of journalism.
The Magnifier is the first news agency in Brazil to specialize in the journalistic technique known worldwide as fact-checking; it was founded on November 1, 2015.
The Facebook Digital Literacy Library is hosted by Facebook and currently includes learning resources made available by Youth and Media at the Berkman Klein Center for Internet & Society at Harvard University under a Creative Commons Attribution-ShareAlike 4.0 International license.
Did you know that many problems in public administration can be solved with Artificial Intelligence?
The Myanmar Tech Accountability Network (MTAN) is a network of Myanmar civil society organizations coordinating efforts to mitigate the risk of social media-induced violence and political instability in Myanmar.
The Unseen Teen: The Challenges of Building Healthy Tech for Young People, a report by Data & Society.
Mozilla Rally is aimed at rebuilding your equity in your data, allowing you to choose how to contribute your data and for what purpose.
The unregulated attention economy driving social media is fraying our democracy, threatening our mental and physical health, and exposing our children to violent and disturbing content. Our limited attention has become the most valuable resource on the internet, captured and manipulated through the rampant, unregulated collection of our personal data. This is surveillance capitalism at work: a relentless assault on our private data, yielding intimate insights that are sold to the highest bidder while we have next to no awareness or control. We are told that having our data taken is the price we pay for the “free” use of digital services. But this system places corporate priorities ahead of the social good while it manipulates our social perspective, drives division and isolates us from each other. There are few practical ways to opt out of an attention economy that depends on pervasive surveillance, and even if you manage to opt out on an individual level, the real-world impact of this data-driven social manipulation is impossible to avoid. Big Tech controls a global audience of billions with a market power unprecedented in the history of media, yet these companies face almost no oversight and reject liability for the harms their products cause.
This seven-point framework will help government departments ensure the safe, sustainable and ethical use of automated or algorithmic decision-making systems.