The Facebook logo is displayed at the Facebook Innovation Hub on February 24, 2016, in Berlin, Germany. Photo: Sean Gallup/Getty Images
Nearly two years ago, 23-year-old Black mother Korryn Gaines was executed in her own home. Police also shot her five-year-old son. I, like many folks, was horrified to read about yet another Black mother dead at the hands of the police—Korryn was the ninth Black woman killed by law enforcement officers in 2016.
I was also enraged at the role Facebook played in destroying this young family. Facebook’s opaque policies on the use of its platform had spilled offline in a violent and disturbing way—and it wasn’t the first or the last time.
Gaines was live streaming as officers busted into her home to serve a warrant on charges stemming from a traffic stop. In the middle of the standoff, police officials asked Facebook to suspend Gaines’ accounts via what they called a “law enforcement portal,” a part of the site only open to certified law enforcement agencies.
Facebook granted the emergency request and took her account offline. In doing so, Facebook removed one of the most important tools Gaines had to hold the police accountable and gave police the license to kill Gaines with no accountability.
Facebook has long been complicit in violence against Black people, both online and offline. The platform is a safe space for White supremacists to spew anti-Black vitriol, organize Klan rallies, and encourage physical violence against Black people without reprimand from Facebook.
We arrived armed with the stories of Black activists who’d been doxed, their private information and physical location shared in White nationalist Facebook groups with posts encouraging violence against them. The real-world danger became clear when these racists showed up to events and even people’s workplaces, armed with guns. Even though these pages and posts were repeatedly flagged by groups and those personally affected, they were allowed to stay up.
When we confronted Facebook about three years ago, the company’s response was that content moderators in places like India and Ireland didn’t always understand what racism in the United States looks like, but we could trust they were looking into it. I trust they were still “looking into it” when some of those same figures and groups operating back in 2015 used the platform to wreak havoc, chaos and violence in places like Portland, Berkeley and Charlottesville.
In 2016, after the murder of Korryn Gaines, we made our fourth trip to Facebook. Tired of excuses, we joined with others to demand that leadership commission an independent audit of the site’s processes for addressing extremism and other hate-based content, and make the results public along with actionable next steps and timelines. The response to the 70 groups making the request: “Trust us, we’re still looking into it.” We knew then that we had to keep the pressure on them. And that’s what we did.
We have spent countless hours in meetings with Facebook executives, written letters and emails, and employed public tactics, all in hopes of getting Facebook to protect its Black and Brown users’ safety and civil rights. We joined with even more groups late last year to repeat our urgent call for a public, transparent, independent audit of the company’s practices and policies.
Facebook’s announcement that it will undergo a civil rights audit proves that organized Black and Brown communities can hold multibillion-dollar companies accountable and pressure them to take definitive steps to end dangerous and discriminatory practices while setting new standards for the rules of engagement on their platforms.
To be sure, this is just a first step. The appointment of the Heritage Foundation—which has a history of exploiting anti-Black, anti-immigrant and anti-Muslim narratives to create discriminatory policy—to investigate unproven issues of liberal bias plays into party politics. This theater of the absurd detracts from the real issue: ensuring that all people can feel safe online without the fear of being targeted by nefarious actors. In essence, Facebook has created a false political binary between threats to the safety of people on and offline, and Diamond and Silk’s right to spread and monetize propaganda. Without strong oversight, the result of the audit could have a negative impact on safety for our communities.
Facebook must commit to working with civil rights and racial justice organizations to discuss solutions to the problems examined by the audit and to develop appropriate steps to ensure that Black voices aren’t censored or harassed on the platform. It must also release the findings coupled with real, ongoing solutions and actionable next steps.
That’s why we continue to try to break through to Facebook. Because when a bot or content moderator “gets it wrong,” people can die, whether that’s the choice to cut a live feed in the middle of a police altercation or to ignore domestic terrorists organizing violent confrontations in college towns across America. It is our hope that Facebook will do more than “look into it” and actually invest time and resources into getting it right.
Brandi Collins is the senior campaign director for media, democracy and economic justice at Color Of Change.