Online abuse and harassment silence important voices in conversation, forcing already marginalized people offline.

Harassment Manager is a web application that aims to empower users to document and take action on abuse targeted at them on online platforms. It is designed for anyone who experiences significant online harassment, whether episodic or ongoing. The tool was built and tested through a community-based research and design process with active Twitter users who experience significant and/or frequent harassment (more details about our research process and published results are available on arXiv). Target users include people who are disproportionately impacted by online harassment, such as women and other marginalized groups, journalists, activists, and politicians.

This web app was built by Jigsaw, a unit within Google that explores threats to open societies and builds technology that inspires scalable solutions, in collaboration with Twitter. To detect potentially harmful comments, it relies on Jigsaw’s Perspective API, which uses machine learning to identify “toxic” language. We define toxicity as language that is rude, disrespectful, or likely to make someone leave a conversation. You can read our model cards to learn more about how our machine learning models are trained.
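
As an illustration of how a client can request a toxicity score from the Perspective API, here is a minimal TypeScript sketch. It is not taken from the Harassment Manager codebase; the API key value, the score threshold, and the scoreToxicity helper name are placeholders chosen for this example, and the request follows the publicly documented comments:analyze endpoint with the TOXICITY attribute.

    // Sketch: score a single comment with the Perspective API (TOXICITY attribute).
    // PERSPECTIVE_API_KEY is a placeholder for your own API key.
    interface AnalyzeResponse {
      attributeScores: {
        TOXICITY: { summaryScore: { value: number } };
      };
    }

    const PERSPECTIVE_API_KEY = 'YOUR_API_KEY';
    const ANALYZE_URL =
      `https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=${PERSPECTIVE_API_KEY}`;

    async function scoreToxicity(text: string): Promise<number> {
      const response = await fetch(ANALYZE_URL, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          comment: { text },
          languages: ['en'],
          requestedAttributes: { TOXICITY: {} },
        }),
      });
      if (!response.ok) {
        throw new Error(`Perspective API error: ${response.status}`);
      }
      const data = (await response.json()) as AnalyzeResponse;
      // summaryScore.value is a probability-like score between 0 and 1.
      return data.attributeScores.TOXICITY.summaryScore.value;
    }

    // Example usage: flag comments whose score crosses an illustrative threshold.
    scoreToxicity('You are an idiot.').then((score) => {
      console.log(`TOXICITY: ${score.toFixed(2)}`, score > 0.8 ? '(flagged)' : '');
    });

In a tool like this, the threshold is a product decision rather than a fixed value: a lower cutoff surfaces more potentially harmful content for the user to review, while a higher one reduces false positives.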

Status: Active
Parent Organization: Code for Africa
Open Source: Yes
Last Modified: 4/18/2023
Added on: 7/6/2022
