SAFElab

Ethics

Ethics and values are integral to everything we do. Learn more about our policies around transparency and how we collect, use, and share data.

Photo Credit (top image): ben o'bro / Unsplash

We acknowledge that identifiable social media data from marginalized populations can be used to criminalize and incarcerate communities of color. We also acknowledge that the data we collect and the methods we use have the potential to cause harm if they are used improperly or fall into the wrong hands. As such, the SAFElab has developed a set of ethical guidelines describing our research process, our collaborations with data scientists, and our dissemination efforts. The nature of our work requires proactive, iterative, and ever-changing ethical considerations in order to prevent any potential harm to the communities with which we work and from which our data originates. The following ethical guidelines are a first step in confronting the challenges that arise from our work on social media, including the use of artificial intelligence as a tool for violence prevention in marginalized communities.

Transparency

  • Prioritizing a list of community needs around violence prevention, updating this list as new needs become apparent

  • Describing our data collection and analysis process and the ways the data is used and applied

  • Convening a monthly advisory board of experts in the field, violence prevention workers, and community members (including formerly gang-involved youth)

  • Seeking community validation and evaluation of the decisions around our data analysis and labeling

Data Collection

  • Obtaining Institutional Review Board (IRB) approval for all of our research studies. However, because the IRB considers public social media data exempt, we must find other forms of accountability

  • The social media data we work with comes from hard-to-reach populations, which makes consent not only difficult to obtain but unreasonable to expect. We are working to find other ways to protect the young people involved in our studies:

    • Community consent

    • Family member consent

Data Analysis

  • Password-protected annotation system

  • No one outside of our research team has access to the data

  • We are considering having all of our data annotators sign a Memorandum of Understanding (MOU) affirming that they will not share the data with anyone

  • Weekly conversations on the ethics of our work. We iteratively revisit our work, create space for anyone on the team to raise ethical issues, and address issues raised by people outside of our research team, including organizational partners and community members

Sharing Data

  • We currently do not share any of our datasets with law enforcement agencies or anyone using punitive and criminalizing methodologies. We will continue to review these practices with our community partners

Research Presentations and Publishable Work

Text Social Media Data

  • No longer using usernames

  • Altering the text of the social media post to render it unsearchable (see the sketch after this list)

  • Proactively removing from our dataset any posts and users that have gone private or have been removed/suspended from a platform
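
The bullets above describe a de-identification step for text posts. Below is a minimal sketch, in Python, of what part of that step could look like. The placeholder tokens ([USER], [LINK], [TAG]), the regular expressions, and the example post are assumptions made for illustration, not SAFElab's actual pipeline; in practice the wording of a post would likely also be paraphrased so that a quoted search cannot recover the original.

```python
import re

# Hypothetical illustration only: strip identifying handles and break exact-match
# searchability of a post's text. This is not the SAFElab pipeline.

HANDLE = re.compile(r"@\w+")           # platform usernames
URL = re.compile(r"https?://\S+")      # links back to the original post
HASHTAG = re.compile(r"#(\w+)")        # hashtags can also be searched verbatim

def deidentify_post(text: str) -> str:
    """Return a de-identified version of a post with searchable tokens masked."""
    text = HANDLE.sub("[USER]", text)        # drop usernames entirely
    text = URL.sub("[LINK]", text)           # remove direct links
    text = HASHTAG.sub(r"[TAG: \1]", text)   # break exact hashtag matches
    # The remaining wording would still be paraphrased (an editorial step,
    # not automated here) so the post cannot be found via quoted search.
    return text

if __name__ == "__main__":
    example = "so proud of @someuser today #blessed https://example.com/post/123"
    print(deidentify_post(example))
    # -> so proud of [USER] today [TAG: blessed] [LINK]
```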

Image Social Media Data

  • No usernames

  • No pictures of faces

  • Images from our dataset in publishable work

    • We do not include any images from our dataset

    • We use similar Creative Commons images as examples instead

  • Providing reviewers with a password-protected URL to anonymized examples of social media posts, including posts with images, from our dataset
