Machine Learning Tool Helps Human Rights Workers Seek Justice
By Scottie Barsotti
Interdisciplinary researchers at CMU created a tool that can scan thousands of hours of multimedia in a matter of minutes. It can help human rights practitioners build cases against war criminals.
A human rights lawyer analyzes a video showing clear evidence of a war crime committed by a soldier against a group of civilians. She wants to know: was this an isolated incident, the actions of a single rogue soldier or military unit? Or was it part of something bigger, part of a broader campaign or policy targeting a specific population? Evidence is the key to building a compelling case. For human rights workers who may be attempting to hold powerful military leaders and heads of state accountable, that evidence has to be airtight.
A tool known as E-LAMP (Event Labeling and Media Processing), created by a team of CMU computer scientists, can help human rights practitioners filter enormous amounts of media in order to find the most relevant images with which to build their case.
“E-LAMP looks at media and tries to detect objects in the videos: whether that’s people, cars, buildings, pedestrians, military vehicles, and so on,” said Alexander Hauptmann, a professor in CMU’s Language Technologies Institute and one of the creators of E-LAMP.
Social media platforms, video sharing sites, and smartphones have made it easier than ever for people to create images, videos, and other media assets in real time and then upload those assets to the internet.
However, the sheer volume of content being recorded and shared means that the number of videos human rights practitioners are able to gather is outpacing their ability to review them.
“One of the key elements of E-LAMP is that it allows you to train classifiers to look for exactly what you want them to look for. You could specify ‘helicopter in the sky’ and ‘explosion’ and the system will find videos that fit those criteria,” said Hauptmann.
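To make this concrete, here is a minimal sketch, in Python, of how a concept classifier of this kind might be trained and then used to rank clips for review. It is an illustration only, not code from E-LAMP: the frame features and labels are random placeholders, the clip names are hypothetical, and scikit-learn’s logistic regression stands in for whatever model a production system would actually use.

    # Illustrative sketch of concept-based video triage (not E-LAMP code):
    # train one binary classifier per concept on labeled frame features,
    # then rank unseen clips by their highest-scoring frame.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    DIM = 128  # dimensionality of the (placeholder) frame features

    # Labeled training frames for one concept, e.g. "explosion":
    # 1 = the frame shows the concept, 0 = it does not.
    X_train = rng.normal(size=(200, DIM))   # placeholder features
    y_train = rng.integers(0, 2, size=200)  # placeholder labels

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Each "clip" is a stack of frame features. A clip is scored by its
    # most confident frame, so one clear hit is enough to surface it.
    clips = {f"clip_{i}": rng.normal(size=(rng.integers(30, 120), DIM))
             for i in range(5)}
    scores = {name: clf.predict_proba(frames)[:, 1].max()
              for name, frames in clips.items()}

    # Reviewers see the most likely matches first.
    for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{name}: p(explosion) = {score:.2f}")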
Several years ago, Hauptmann began a partnership with Jay Aronson, a CMU professor of science, technology, and society who runs the Center for Human Rights Science—together, they sought to apply E-LAMP to the human rights context.
Aronson says that what makes the system valuable to human rights work is the ability to train E-LAMP: it allows people reviewing media to search for videos that contain very specific things, such as logos from militant groups or citizen media outlets, people crowded into small spaces, gunshots, gas, and other details that can point toward a potential abuse. In that sense, Aronson compares E-LAMP to a sieve that gets practitioners to the most relevant information as quickly as possible. He also notes that a tool like E-LAMP is not a magic bullet that can replace human analysis or judgment; rather, it frees practitioners to spend more time on meaningful human rights work instead of poring over hours upon hours of video footage.
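As a further hypothetical sketch of that sieve, and not E-LAMP’s actual interface, scores from several concept detectors, such as the one above, can be combined into a single query so that only clips matching every requested concept reach a human reviewer. The scores below are placeholders.

    # Illustrative "sieve": keep only clips whose detector score exceeds a
    # confidence threshold for every concept in the query.
    THRESHOLD = 0.8

    # Placeholder per-concept scores; a real system would produce these by
    # running trained detectors over every frame of every clip.
    clip_scores = {
        "clip_0": {"helicopter": 0.91, "explosion": 0.88},
        "clip_1": {"helicopter": 0.95, "explosion": 0.12},
        "clip_2": {"helicopter": 0.40, "explosion": 0.85},
    }

    def sieve(scores, query, threshold=THRESHOLD):
        """Keep only clips scoring at least `threshold` on every queried concept."""
        return [clip for clip, s in scores.items()
                if all(s.get(concept, 0.0) >= threshold for concept in query)]

    print(sieve(clip_scores, ["helicopter", "explosion"]))  # -> ['clip_0']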
“The human rights community is small and lacks funding, so a technology that allows you to amplify human effort is so important,” said Aronson. He says the realities of human rights work are such that organizations are often responding to a particular situation as it unfolds. But when an institution like the International Criminal Court, Human Rights Watch, Amnesty International, or a UN investigatory body is conducting a larger investigation, such as building a war crimes case against a military leader, E-LAMP can help it find relevant information in less time and build a stronger case. Essentially, it gives investigators the time to analyze pertinent media in greater depth and to find connections between incidents that may seem disparate.
“In order to achieve justice in an international context, you need to be able to say ‘this person is an intellectual author of a policy that caused these widespread harms.’ International justice is slow and imperfect, but the more information you have, the more you can build a strong case that these things are not isolated incidents,” said Aronson.
Helping Technology Gain Traction and Trust
Hauptmann says that certain technological advancements, including improvements in computing power and machine learning, make E-LAMP possible.
“We can make substantially better and more accurate systems as opposed to 15 years ago,” he said.
Still, communities of practice like the human rights community can be conservative when it comes to adopting new technologies, a conservatism that Aronson says is justified because so much is at stake.
“There’s a fear of getting something wrong and undermining an investigation,” said Aronson. He remarks that while it has taken time, trust is building as practitioners learn more about applications of technology. Aronson says that tools like E-LAMP can level the playing field in a sense by giving the human rights community access to some of the same tools that militaries and governments are using. He created the Center for Human Rights Science to be a place where the human rights community can come together with researchers like Hauptmann to learn about each other’s work.
“Just seeing what Alex and his team can do has opened up people’s ideas of how technology can be useful, so now people are coming to us with ideas that can shape what Alex’s team works on and develops. The interactions are very valuable, and they can create new research agendas,” said Aronson.
These interactions are happening at a critical time. As ethical uses of AI and efforts to use technology for public good gain greater visibility and research support, it’s essential to bring people together who can speak to the possible implications of technology.
“Technologies can be used for public safety or by human rights and civil rights people, but they can also be used by governments to suppress things or prosecute certain groups. The technology sits in the middle, and that can be problematic,” said Hauptmann.
Aronson agrees, noting that technologies are inherently biased toward benefiting those who already have power.
"So, if you’re working in this space, you have to be conscious that tools you’re creating don’t just further increase that power,” said Aronson. “They have to be accessible and useful to the people who are trying to hold the powerful to account. These tools can potentially have a positive impact.”
Aronson and Hauptmann agree that while technologists have historically operated without that explicit mindset, things are changing. Both remark that they are heartened by the sensitivity CMU students largely show toward issues of transparency, imbalances of power, and justice, and toward the role of technology in promoting equity.
“We should have been having these conversations within the scientific and technical community 20-30 years ago, but I'm glad they are happening now,” said Aronson.
“My hope is that moving forward, rather than trying to scramble after something has already been in the wild causing harm and trying to redress those issues, we try our best to stop things from happening, even if that includes not deploying systems in the first place. And I think we’re starting to see that.”