When Mark Zuckerberg started Facebook in 2004, there was no way he could have known his little website would become the world’s most popular social network with nearly two billion registered users. He also could not have known his social networking site would be instrumental in one of the most heinous crimes committed in the past century.
The world was left shocked and dumbfounded last Friday when a self-described white supremacist, armed with semi-automatic weapons, walked into two mosques in Christchurch, New Zealand, and opened fire on the worshippers, killing 50 people and wounding dozens of others.
Among those killed was three-year-old Mucad Ibrahim.
The incident sent shock waves throughout the world. New Zealand is a country that, unlike the US, is not generally known for mass shootings and violent crimes.
“It is clear that this can now only be described as a terrorist attack,” New Zealand Prime Minister Jacinda Ardern said soon after the shooting.
What made the crime exponentially more horrific was that the perpetrator live-streamed it on Facebook for the world to see in graphic, high-definition video. It is difficult to grasp the depravity of this act.
Facebook reacted swiftly by banning the video, and even worked with competing social networks to quell its spread before it went viral.
Facebook released a statement about the incident via its newsroom blog, expressing horror and condolences. “Our hearts go out to the victims, their families and the community affected by the horrific terrorist attacks in Christchurch,” wrote Chris Sonderby, vice-president and deputy general counsel at Facebook.
The company has been working closely with the New Zealand police to support their investigation, and is providing an on-the-ground resource for law enforcement authorities.
In what is seen as an unprecedented move for the social network, it released detailed statistics about the video to the general public.
According to those statistics, users attempted to upload the video 1.5 million times within 24 hours of the attack. Facebook’s artificial intelligence detection systems blocked 1.2 million of those uploads, but approximately 300 000 copies still slipped through the cracks.
Facebook took additional measures to curb the video’s reach, such as designating the shootings as a terror attack and working with the Global Internet Forum to Counter Terrorism to report anyone praising or supporting the incident to the relevant law enforcement officials. Other platforms like YouTube were also flooded with uploads of the horrific video.
According to a representative from the streaming video service, at one stage copies of the video were being uploaded at a rate of one copy per second – an unprecedented rate for YouTube uploads.
The Washington Post reported that YouTube was going out of its way to block the video.
Despite this, people still managed to get copies through by re-editing and repackaging the video to make it look innocuous.
Blocking the video on instant messaging platforms like WhatsApp was much more challenging, with people forwarding and broadcasting it to all their contacts.
On platforms such as Reddit and 8chan, where hate speech is common, the video went viral without much resistance.
While some people have lauded the likes of Facebook and YouTube for their willingness to block the spread of the video and to co-operate with law enforcement officials, others say they had no choice, given the major backlash and reputational risk they faced over the incident.
Facebook, in particular, has received a lot of negative publicity in recent times.
Either way, the one thing that comes into question is whether social networks are doing enough to prevent their platforms from being used to spread hate and violence.
No one denies the immensely powerful reach of social media and the ability of a simple message, image or video to go global within minutes. It is the information equivalent of a nuclear weapon.
Putting the incident into perspective, Facebook effectively gave a depraved, sick, twisted, hate-mongering terrorist a potential live audience of two billion people to spread his hate. This sounds scary because it really is.
While many people hold the social networks responsible for the spread of the horrifying video, I differ. Social media gives us previously unimaginable power to share positive content and to educate people.
After all, I was live on Facebook just a couple of weeks ago, talking to business people and students from around the world about the impact of artificial intelligence on business and careers.
If anyone is to blame for the spread of the video, it is the people who, for some reason or other, felt that watching 50 innocent human beings being mercilessly shot in the back was entertaining or worth sharing with others. Social media was just the means.
Having said that, I believe social networks have a responsibility to ensure their platforms are never used for evil, and should put all the necessary checks and balances in place before allowing people to post content, especially live video.
* Bilal Kathrada is an educational technologist, speaker, author, newspaper columnist and entrepreneur. He is the founder of CompuKids, a start-up that teaches children Computer Science skills. Bilal blogs at www.bilalkat.com.
** The views expressed here are not necessarily those of Independent Media.