Facebook said that it removed 1.5 million videos of footage from the shooting rampage at two mosques in Christchurch within 24 hours of the attack, underscoring the massive game of whack-a-mole social media giants have to play with even the most high-profile problematic content on their platforms.
In a statement, Mia Garlick, spokeswoman for Facebook New Zealand, said that the company continues to “work around the clock to remove violating content from our site, using a combination of technology and people.” Of the 1.5 million videos of the massacre, which the perpetrator filmed with a body-worn camera almost in the style of a video game, 1.2 million were blocked at upload.
Facebook’s statement came after New Zealand Prime Minister Jacinda Ardern said in a Sunday news conference that there were “further questions to be answered” by Facebook and other social media sites over their response to the events.
Ardern said that her country had done as much as it could to “remove or seek to have removed some of the footage” circulated in the aftermath of the attack, but that ultimately it has been “up to those platforms.”
When the horror began Friday morning in New Zealand, alleged shooter Brenton Tarrant’s Facebook followers were the first to know. He live-streamed his assault, from the time he started driving toward Al Noor Mosque to the moment he fired his first shots.