"The Hypothesis Project is a new effort to implement an old idea: A conversation layer over the entire web that works everywhere, without needing implementation by any underlying site."
The Content Blockchain Project was initiated in 2016 by a consortium of publishing, law and IT companies to research the possibilities of using blockchain technologies to advance the content and media ecosystem.
The Technology & Social Change Group (TASCHA) at the University of Washington's Information School explores the role of digital technologies in building more open, inclusive, and equitable societies.
Disinformation, misinformation, and “fake news” are longstanding phenomena that, in the wake of the digital revolution, have become newly politicized and consequential. The SSRC conducts research on the topic.
The Centre for Policy Alternatives (CPA) was formed in the firm belief that there is an urgent need to strengthen institution- and capacity-building for good governance and conflict transformation in Sri Lanka, and that non-partisan civil society groups have an important and constructive contribution to make to this process.
Supports original research into disinformation vulnerability and resilience with public opinion research, analytical pieces, narrative monitoring, and mainstream and social media monitoring, both in house and in collaboration with local partners in Europe.
AFRICMIL is a non-governmental organisation that focuses on media, information, research, advocacy, and training. It aims to promote media and information literacy as a key component in the enhancement of democracy and good governance and the promotion of accountability and an orderly society.
The Hamilton 2.0 dashboard, a project of the Alliance for Securing Democracy at the German Marshall Fund of the United States, provides a summary analysis of the narratives and topics promoted by Russian, Chinese, and Iranian government officials and state-funded media on Twitter, YouTube, state-sponsored news websites, and via official diplomatic statements at the United Nations.
We are a joint team of engineers and investigators from CERTH-ITI and Deutsche Welle, trying to build a comprehensive tool for media verification on the Web.
The Global Internet Forum to Counter Terrorism (GIFCT) brings together the technology industry, government, civil society, and academia to foster collaboration and information-sharing to counter terrorist and violent extremist activity online.
EU DisinfoLab is an independent non-profit organisation focused on tackling sophisticated disinformation campaigns targeting the EU, its member states, core institutions, and core values.
DUBAWA is Nigeria’s independent verification and fact-checking platform, initiated by the Premium Times Centre for Investigative Journalism (PTCIJ) and supported by the most influential newsrooms and civic organisations in the country to help amplify the culture of truth in public discourse, public policy, and media practice.
The Digital Democracy Room is an initiative of FGV DAPP to monitor public debate on the internet and fight disinformation strategies that threaten the integrity of political and electoral processes, seeking to strengthen democratic institutions.
DEBUNK (demaskuok in Lithuanian) is a unique Lithuania-born initiative uniting competing media outlets, journalists, and volunteers for a single purpose: to make society more resilient to orchestrated disinformation campaigns.
This project is an experiment in experiential learning through a cyber verification lab at the University of Hong Kong’s Journalism and Media Studies Centre.
The Media Manipulation Casebook is a digital research platform of case studies linking together theory, methods, and practice for mapping media manipulation and disinformation campaigns.
Since 2012, the Programme on Democracy & Technology has been investigating the use of algorithms, automation, and computational propaganda in public life.
Tattle is a group of technologists, researchers, journalists and artists. We build tools and datasets to better understand and respond to (mis)information trends on chat apps and regional language social media in India.
Social Science One implements a new type of partnership between academic researchers and the private sector to advance the goals of social science in understanding and solving society’s greatest challenges.
The Civic Online Reasoning (COR) curriculum was developed by the Stanford History Education Group as part of MediaWise—a partnership of SHEG, the Poynter Institute, and the Local Media Association.
South Asia Check is an independent, non-partisan, non-profit initiative by Panos South Asia which aims to promote accuracy and accountability in public debate.
A programme aimed at circumventing our geographic limitations and strengthening regional connection and participation within the African civic tech community.
Focuses on bringing together practicing technologists and researchers to facilitate understanding of the complex interaction between technology, science, and society; its impact on individuals and society in general; professional and social responsibility in the practice of engineering, science, and technology; and open discussion of the resulting issues.
Vote Lab is the innovation and research arm of When We All Vote. It was built to ensure that our voter engagement programs are informed by the best available research and evidence, and it creates opportunities to drive new experimentation and learning.
The XR Collaboratory (XRC) at Cornell Tech works with faculty, researchers and students from computer vision, computer graphics, and human-computer interaction, as well as practitioners in application areas such as healthcare, education, and architecture.
Discover how deepfakes work and the visual clues you can use to identify them. We are a group of communication designers who created this project to demonstrate our research into making our own deepfake, and to communicate the signs you can spot to identify one.