Social media could be erasing evidence of war crimes
In recent years, social media platforms have been taking down online content faster and in greater volume, frequently in response to government demands, but in ways that prevent that content from being used to investigate people suspected of involvement in serious crimes, including war crimes. While it is understandable that platforms remove content that incites or promotes violence, they are not currently archiving this material in a way that investigators and researchers can access to help hold perpetrators to account.
Social media content, particularly photographs and videos posted by perpetrators, victims, and witnesses to abuses, as well as others, has become increasingly central to some prosecutions of war crimes and other international crimes, including at the International Criminal Court (ICC) and in national proceedings in Europe. This content also helps media and civil society document atrocities and other abuses, such as attacks in Syria, a security force crackdown in Sudan, and police abuse in the United States.
Yet social media companies have ramped up efforts to permanently remove posts from their platforms that they consider to violate their rules, community guidelines, or community standards under their terms of service, including content they deem “terrorist and violent extremist content” (TVEC), hate speech, organized hate, hateful conduct, and violent threats. According to the companies, they no longer rely only on human content moderators to flag material for removal. Increasingly, they also use algorithms that identify and remove content so quickly that no user sees it before it is taken down. In addition, some platforms use filters to prevent content identified as TVEC and other relevant content from being uploaded in the first place. Governments globally have encouraged this trend, calling on companies to take down content as quickly as possible, particularly since March 2019, when a gunman livestreamed his attack on two mosques in Christchurch, New Zealand, killing 51 people and injuring 49 others.
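These upload filters are generally understood to work by matching new uploads against shared databases of digital fingerprints (“hashes”) of previously removed material, such as the hash-sharing database run by the Global Internet Forum to Counter Terrorism (GIFCT). The Python sketch below illustrates only that general matching step; the function names and data are hypothetical, and production systems typically use perceptual rather than exact hashing so that re-encoded or cropped copies still match.

    import hashlib

    def fingerprint(data: bytes) -> str:
        # Exact-match fingerprint, for illustration only; real systems
        # generally use perceptual hashes that tolerate re-encoding.
        return hashlib.sha256(data).hexdigest()

    # Hypothetical shared database of fingerprints of content already
    # classified as TVEC; in practice, industry efforts such as the
    # GIFCT hash-sharing database play this role.
    known_tvec_fingerprints = {fingerprint(b"previously removed video")}

    def should_block_upload(upload: bytes) -> bool:
        # True means the upload is stopped before any user can see it.
        return fingerprint(upload) in known_tvec_fingerprints

    assert should_block_upload(b"previously removed video")
    assert not should_block_upload(b"unrelated footage")

Notably, nothing in this flow preserves a copy of the blocked upload for investigators: a matched file simply never appears in public.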
Companies are right to promptly remove content that could incite violence, otherwise harm individuals, or jeopardize national security or public order. But social media companies have failed to set up mechanisms to ensure that the content they take down is preserved, archived, and made available to international criminal investigators. In most countries, national law enforcement officials can compel the companies to hand over the content through warrants, subpoenas, and court orders, but international investigators have limited ability to access it because they lack law enforcement powers and standing.
Law enforcement officers and others are also likely to be missing important information and evidence that would have traditionally been in the public domain because increasingly sophisticated artificial intelligence systems are taking down content before any of them have a chance to see it or even know that it exists. There is no way of knowing how much potential evidence of serious crimes is disappearing without anyone’s knowledge.
Independent civil society organizations and journalists have played a vital role in documenting atrocities in Iraq, Myanmar, Syria, Yemen, Sudan, the United States, and elsewhere – often when there were no judicial actors conducting investigations. In some cases, the documentation of organizations and the media has later triggered judicial proceedings. However, they also have no ability to access removed content. Access to this content by members of the public should be subject to careful consideration, and removal may be appropriate in some cases. But when the content is permanently removed and investigators have no way of accessing it, this could hamper important accountability efforts.
Companies have responded to some civil society requests for access to content either by reconsidering its takedown and reposting it, or by saying that it is illegal for them to share the content with anyone. Human Rights Watch is not aware of any instances where companies have agreed to provide independent civil society and journalists access to such content if it was not reposted.
It is unclear how long the social media companies retain content they remove from their platforms before deleting it from their servers, or whether the content is, in fact, ever deleted. Facebook states that, upon receipt of a valid request, it will preserve the content for 90 days following its removal, “pending our receipt of [a] formal legal process.” Human Rights Watch knows, however, of instances in which Facebook has retained taken-down content on its servers for much longer than 90 days after removal. In an email to Human Rights Watch on August 13, a Facebook representative said, “Due to legislative restrictions on data retention we are only permitted to hold content for a certain amount of time before we delete it from our servers. This time limit varies depending on the abuse type… retention of this data for any additional period can be requested via a law enforcement preservation request.”
In an email to Human Rights Watch on August 4, Twitter said it “retains different types of information for different lengths of time, and in accordance with our Terms of Service and Privacy Policy.” In at least one instance that Human Rights Watch is aware of, YouTube restored content two years after taking it down.
Holding individuals accountable for serious crimes may help deter future violations and promote respect for the rule of law. Criminal justice also assists in restoring dignity to victims by acknowledging their suffering and helping to create a historical record that protects against revisionism by those who seek to deny that atrocities occurred.
However, both nationally and internationally, victims of serious crimes often face an uphill battle in seeking accountability, especially during ongoing conflict. Criminal investigations sometimes begin years after the alleged abuses were committed. By the time they do, social media content with evidentiary value may long since have been taken down, making proper preservation of this content, in line with standards that would be accepted in court, all the more important.
International law obligates countries to prosecute genocide, crimes against humanity, and war crimes. In line with a group of civil society organizations that have been engaging with social media companies since 2017 on improving transparency and accountability around content takedowns, Human Rights Watch urges all stakeholders, including social media platforms, to engage in consultations to develop a mechanism that preserves potential evidence of serious crimes and makes it available to support national and international investigations, as well as documentation efforts by civil society organizations, journalists, and academics.
The mechanism in the US to preserve potential evidence of child sexual exploitation posted online provides important lessons for how such a mechanism could work. US-registered companies operating social media platforms are required to take down content that shows child sexual exploitation, but also to preserve it on their platforms for 90 days and to share a copy of the content, along with all relevant metadata—for example, the name of the content’s author, the date it was created, and the location—and user data, with the National Center for Missing and Exploited Children (NCMEC). The NCMEC, a private nonprofit organization, has a federally designated legal right to possess such material indefinitely and, in turn, notifies law enforcement locally and internationally about relevant content that could support prosecutions.
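To make the moving parts of that model concrete, the sketch below expresses in Python the kind of preservation record it implies: a copy of the content plus the metadata and user data that must be shared, and the 90-day window the platform itself must observe. The class and field names are hypothetical illustrations, not any platform’s or NCMEC’s actual schema.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta, timezone

    @dataclass
    class PreservationRecord:
        # Fields mirror what the US regime requires platforms to share:
        # a copy of the content, relevant metadata, and user data.
        content: bytes        # copy of the removed post
        author: str           # name of the content's author
        created: datetime     # date the content was created
        location: str         # location metadata, where available
        user_data: dict       # account-level data on the uploader
        removed_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc)
        )

        def platform_retention_deadline(self) -> datetime:
            # The platform must keep its own copy for 90 days; the
            # archiving body (NCMEC in the US model) may hold the
            # shared copy indefinitely.
            return self.removed_at + timedelta(days=90)

The key design point such a record captures is the split in responsibility: the platform’s obligation is time-limited, while the designated archive, not the platform, is what guarantees long-term availability to investigators.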
A mechanism to preserve publicly posted content that is potential evidence of serious crimes could be established in collaboration with an independent organization responsible for storing the material and sharing it with relevant actors. An upcoming report from the Human Rights Center at the University of California, Berkeley, “Digital Lockers: Options for Archiving Social Media Evidence of Atrocity Crimes,” examines possible archiving models, creating a typology of five and assessing the strengths and weaknesses of each.
In parallel with these efforts, social media platforms should be more transparent about their existing takedown mechanisms, including their increased use of algorithms; should work to ensure that takedowns are not overly broad or biased; and should provide meaningful opportunities to appeal content removals.