New Zealand mosque shootings: FB, Twitter face scrutiny after livestreaming of attack


Facebook, Google and Twitter have had staff across time zones working around the clock, alongside their artificial intelligence systems, to track down and remove the video.

"Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video", Facebook said in a tweet.

But Jennifer Grygiel, a Syracuse University communications professor who follows social media, said the companies were doing far too little to prevent the spread of violent content. Facebook said that in the first 24 hours it removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload.

Facebook and YouTube did not immediately respond to HuffPost's request for comment on the matter.

"The technology is just not there yet to be able to identify a gun amongst all the different kinds of guns there could be in a video and also put the context around it to say 'Oh this is a news piece, a legitimate news piece versus an act of violence, '" she said. This would leave Google with more work to find out which videos are going to be flagged and which will remain.

Facebook, Twitter, Alphabet Inc and other social media companies have previously acknowledged the challenges they face policing content on their platforms. Even so, it remains unclear how the Christchurch shooter was able to stream for 17 minutes before the footage was cut.

"This is a case where you're giving a platform for hate", he said.

According to authorities, the shooter appeared to livestream the attack on Facebook, documenting the drive to the Al Noor Mosque from a first-person perspective and showing him walking from the car into the mosque and opening fire.


The app is usually used to share videos of extreme sports and live music, but on Friday the footage resembled the carnage of a video game, showing the attacker's first-person view as he drove to one of the mosques, entered it and began shooting randomly at people inside.

"We urge people to report all instances to us so our systems can block the video from being shared again".

"The responsibility for content of the stream lies completely and exclusively on the person who initiated the stream".

Britain's interior minister, Sajid Javid, also said the companies need to act. "That's unacceptable, it should have never happened, and it should have been taken down a lot more swiftly".

New Zealand's Department of Internal Affairs said people posting the video online risked breaking the law. It said the video had been deleted thousands of times but could still be found.

YouTube said: "Please know we are working vigilantly to remove any violent footage".

Members of a group called "watchpeopledie" on internet discussion board Reddit, for example, discussed how to share the footage even as the website took steps to limit its spread.

Prime Minister Jacinda Ardern alluded at a news conference to anti-immigrant sentiment as the possible motive, saying that while many people affected by the shootings may be migrants or refugees, "they have chosen to make New Zealand their home, and it is their home".
