Facebook Live: Turn It On for Activism, But It Might Turn On You

“Stay with me,” Diamond Reynolds says at the beginning of her Facebook Live stream, filmed on the evening of Wednesday, July 6, 2016, in Falcon Heights, Minnesota, from the passenger seat of a car. Her four-year-old daughter is in the backseat crying, while her boyfriend, Philando Castile, struggles to breathe in the driver’s seat beside her. There’s blood on his arm and soaking into his white T-shirt as he slowly bleeds to death. A police officer stands outside the vehicle, still pointing his gun through the open window.

Diamond describes the scene as calmly as she can: “We got pulled over for a busted tail light in the back and the police…he killed my boyfriend.” The officer screams, apparently panicked and unnerved by what’s just taken place, “Keep your hands where I can see them! F*ck!” Reynolds remains calm as she tells the police officer, who’s still brandishing his weapon, “You shot four bullets into him, sir. He was just getting his license and registration, sir.”

Philando, a 32-year-old school cafeteria supervisor with no serious criminal record, died that evening from his injuries, while Diamond and her child were taken into police custody. At some point, her phone was confiscated and the video was taken offline. However, her friends and followers, who’d already witnessed the 10-minute live stream, quickly cried foul, compelling Facebook to restore the footage. The company blamed the temporary removal on a technical “glitch.”

By Thursday morning, the shooting was a major story. By noon, the live stream had been viewed more than 3.2 million times on Diamond’s Facebook profile alone. The video was raw, unfiltered and, more importantly, provided evidence before police had given their account. And in a world where police body cameras are seldom worn, turned on or even permitted to capture footage that can be used in court, Diamond Reynolds had decided to take matters into her own hands. By bringing the world into one of the worst experiences of her life, Diamond had changed the game, and Facebook Live along with it.

Facebook Live was never intended to become a tool for people to document their experiences with police brutality or violent shootings. It was simply Mark Zuckerberg’s strategy for competing with traditional television, Snapchat and Periscope. Moreover, it was a way to stay relevant in an online world that is rapidly evolving from text-and-photo-driven content to live streams and video.

And with 1.7 billion users, Facebook couldn’t afford to lose its audience to other video services. Unveiled in the spring of 2016, Facebook Live offered a way to compete and an easy way to generate content that was “designed to be unedited, unpredictable, and exist outside of a curated, career-friendly profile.” Zuckerberg even went so far as to say that “Live is like having a TV camera in your pocket. Anyone with a phone now has the power to broadcast to anyone in the world. When you interact live, you feel more connected in a more personal way.”

Yet providing an unfiltered window into the lives of others also opened a Pandora’s box of experiences to the public. Many assumed that, like the internet itself, Facebook Live was simply another tool for diverse and marginalized voices to exercise their freedom of expression and be heard by a larger audience. And since the social media platform has always been cheap, easy and immediate, it was quickly reshaping our idea of what a public space could be and what stories we could tell.

Of course, Zuckerberg had envisioned Facebook Live for sharing recipes or baby’s first steps, not violence or murder; but the offline world was producing events just as relentlessly as the online world was consuming them. And in the hands of those suffering injustice in the “real world,” Facebook Live was effectively exposing many online users to a world they had never seen before. But would Facebook allow this particular window to remain open and unedited?

The following morning, Diamond Reynolds cried out to the crowd: “The police did this to me. They took an innocent man away from us. They did this to him, and they did this to me. I want justice and I want peace!” It was during this impromptu press conference that reporters asked, “Why did you decide to use Facebook Live?” She told the horde of journalists, “I wanted to put it on Facebook to go viral so that the people could see. I wanted the people to determine who was right and who was wrong. I wanted the people to be the testimony here.”

This same concept may have been going through the mind of 23-year-old Korryn Gaines on August 1, 2016, in Randallstown, Maryland. She’d already used Instagram to film a traffic stop gone awry in March, but because she’d failed to appear in court for the traffic violation, a warrant had been issued for her arrest. Three police officers were now knocking at her door at 9:02 a.m.

Korryn’s father was a police dispatcher; her mother, a registered nurse. She was a hairdresser and mother of two with no major criminal record, who may also have suffered developmental harm from childhood lead exposure. Nevertheless, when the police entered her apartment, Korryn was holding a shotgun and cradling her five-year-old son. What followed were hours of police negotiations as Korryn locked herself and her child in a closet and began to live stream events on Facebook Live and Instagram, two platforms owned by Zuckerberg’s company. Police believed the streams were interfering with negotiations and asked Facebook to shut them down. The company quickly complied and deactivated her accounts.

According to police testimony, around 3 p.m. Korryn pointed her legally purchased Mossberg 12-gauge pistol-grip shotgun at an officer in her home and said, “If you don’t leave, I’m going to kill you.” The officer opened fire. Korryn fired two shots of her own but was killed, while her son suffered non-life-threatening injuries that required hospitalization. None of this reached her audience on Facebook or Instagram, and the majority of her videos were taken into evidence.

This was a frightening prospect for social justice activists, such as Black Lives Matter, a movement that began as a social media campaign to bring attention to police brutality and unjust legislation. Law enforcement officials do not have the legal authority to deactivate social media accounts; they depend on the social media companies to comply with their requests. And since Facebook is neither part of the US government nor a public utility, it is not bound by the First Amendment. In other words, it can block or deactivate any content or account it chooses.

For those who had turned to social media activism as a way to make their stories heard or to document their experiences with police brutality, this new development in their relationship with Facebook and similar platforms was unnerving. After all, what if Facebook had chosen to deactivate Diamond Reynolds’ live stream from her car? Could Facebook be trusted as a voice for the voiceless?

The issue with social media activism is threefold. First, Facebook is not in the social justice business. As Zuckerberg himself has said, in response to accusations of inflated video metrics over the past two years, “Our focus has always been on driving business results for our clients.” And with nearly 97% of the company’s revenue coming from advertising, it’s no surprise who those clients are.

In fact, more than 300 million consumers see dynamic ads for over 2.5 billion unique products on Facebook every month. During the first half of 2016, Facebook pulled in $11.818 billion in total revenue, a 55.8% increase over the same period in 2015. Of that sum, $11.404 billion came from advertising alone. Net income for the first half of 2016 came to some $3.565 billion, up 189.6% from $1.231 billion a year earlier.
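For readers who want to check those figures, here is a quick back-of-the-envelope calculation as a small Python sketch; the variable names are mine, and the numbers are simply the ones quoted above.

```python
# Back-of-the-envelope check of the figures above (amounts in billions of USD).
# Variable names are illustrative; the numbers come straight from the paragraph.
total_revenue_h1_2016 = 11.818
ad_revenue_h1_2016 = 11.404
net_income_h1_2016 = 3.565
net_income_h1_2015 = 1.231

ad_share = ad_revenue_h1_2016 / total_revenue_h1_2016             # ~0.965, the "nearly 97%" above
net_income_growth = net_income_h1_2016 / net_income_h1_2015 - 1   # ~1.896, i.e. +189.6%

print(f"Advertising share of revenue: {ad_share:.1%}")                # 96.5%
print(f"Year-over-year net income growth: {net_income_growth:.1%}")   # 189.6%
```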

In order to remain profitable, Facebook must ensure that each user is served only the content they are most likely to click, share and like. This plays on each user’s preexisting biases, assumptions and even political affiliations. In essence, it exploits confirmation bias, the tendency to embrace only information that affirms preexisting beliefs while ignoring information that does not.
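The mechanism described above can be sketched in a few lines of code. What follows is a purely illustrative Python sketch, not Facebook’s actual ranking system; the weights, field names and affinity boost are assumptions made for the sake of the example.

```python
# A minimal, hypothetical sketch of engagement-optimized feed ranking.
# The weights, fields and affinity boost are illustrative assumptions,
# not Facebook's actual algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    p_click: float   # estimated probability the user clicks
    p_share: float   # estimated probability the user shares
    p_like: float    # estimated probability the user likes

def engagement_score(post: Post, topic_affinity: dict) -> float:
    """Score a post by how likely this user is to engage with it."""
    raw = 2.0 * post.p_share + 1.5 * post.p_click + 1.0 * post.p_like
    # Content matching the user's existing preferences gets boosted,
    # which is the feedback loop that produces the echo chamber.
    return raw * (1.0 + topic_affinity.get(post.topic, 0.0))

def rank_feed(posts: list, topic_affinity: dict) -> list:
    return sorted(posts, key=lambda p: engagement_score(p, topic_affinity), reverse=True)

# Example: two otherwise identical posts; the one matching the user's
# existing affinity is ranked first, so the user sees more of the same.
feed = rank_feed(
    [Post("a", "politics_left", 0.20, 0.05, 0.30),
     Post("b", "politics_right", 0.20, 0.05, 0.30)],
    topic_affinity={"politics_left": 0.8},
)
print([p.topic for p in feed])  # ['politics_left', 'politics_right']
```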

While this model may serve businesses, it creates an echo chamber that narrows what users see rather than broadening it. For instance, according to Pew Research, in 2014 “consistent conservatives” on Facebook were twice as likely as the average user to say that posts in their feeds were “mostly or always” in line with their own views. Meanwhile, 40% of “consistent liberals” were likely to block or unfriend someone who didn’t share their views. And with 61% of millennials getting their news from Facebook, it’s unlikely that diverse ideas are actually reaching a broader audience or inspiring others to act, no matter how sincere the effort.

Secondly, Facebook is not a friend of the activist. In fact, Julian Assange, the controversial founder of WikiLeaks, has called the platform “the most appalling spy machine that has ever been invented.” He points out that “it’s the world’s most comprehensive database about people, their relationships, names, addresses, locations and communications with each other, all sitting within the United States and all accessible to US intelligence.”

In a 2011 interview with RT, Assange was also quick to point out that Facebook and Twitter did not play the valuable role in the Egyptian Revolution and the Arab Spring that many have assumed. He explained that those organizing protests had produced an important manual that became the primary tool on the streets during the revolution. Its first and last pages explicitly stated, “Do not use Facebook or Twitter.”

“And why did it say that?” Assange asked. “Because they’d had past experiences that, when there were smaller demonstrations in Egypt that used Facebook and Twitter, everyone was rounded up by the intelligence services.”

Thirdly, there is the issue of the hashtag. On the one hand, it can be a valuable tool for bringing people together around common ideas; on the other, it enables predators to troll, harass, dox or threaten individuals online because of their views. Even worse, hashtags offer a sense of accomplishment and a feeling of immediacy while delivering only a false sense of effectiveness. It can feel empowering to contribute online, as if we’re plugging into some sort of global power cord that will inspire change; yet #PhilandoCastile trended on social media, as did #SayHerName for Korryn Gaines, and little actually changed because of it.

As Shonda Rhimes pointed out in her 2014 commencement speech at Dartmouth College: “A hashtag is not helping… Hashtags are very pretty on Twitter…but a hashtag is not a movement. A hashtag does not make you Dr. King. A hashtag does not change anything. It’s a hashtag.” But what a hashtag is very effective at doing is keeping you coming back for more, generating content and exposing you to more advertising.

In other words, hashtags have become cultural capital, commodities in a larger profit-making scheme. And the more powerful these organizations become, the less they’ll cater to the grassroots efforts that real change requires, until it’s pay-to-play simply to have your voice heard at all. And for what?

Between the death of Philando Castile on July 6 and the end of September 2016, police officers in the United States killed 257 more people. That number will surely increase by December 31. Undoubtedly, there will be more hashtags, more videos on Facebook, more online outrage, but it will take more than content to create lasting change. It will require massive public protest in the real world, away from social media. Activists have to turn off their phones and hit the streets the old-fashioned way; otherwise, we’re just screaming mindlessly into the void, and at any point Facebook may very well turn that channel off.