Germany shooting livestreamed despite efforts by tech firms

BERLIN: Just weeks after a broad effort announced by tech platforms to curb the spread of violent content, a video of Wednesday’s deadly shooting in the German city of Halle was posted online, where it was seen by some 2,200 people.

The gunman posted a video of the attack on the Twitch livestream gaming platform owned by Amazon, the company acknowledged.

The video of the shooting at a synagogue and a Turkish restaurant included a “manifesto” with racist and anti-Semitic commentary.

“Twitch has a zero-tolerance policy against hateful conduct, and any act of violence is taken extremely seriously,” a Twitch spokesperson said.

“We worked with urgency to remove this content and will permanently suspend any accounts found to be posting or reposting content of this abhorrent act.”

Nonetheless, the attack was streamed for 35 minutes and eventually seen by some 2,200 people, only about five of them while it was live, before the video was removed, the platform said on Twitter.

The news comes after the deadly New Zealand mosque shootings in March, which were livestreamed on Facebook and prompted governments to press social networks to prevent the broadcasting of violent acts on their platforms.

On September 23, Facebook announced additional efforts at the United Nations during a meeting with New Zealand’s Prime Minister Jacinda Ardern, who has taken up the cause of fighting online extremism.

Also last month, Amazon announced it was joining the Global Internet Forum to Counter Terrorism, an alliance tasked with tackling the most dangerous content on social media.

The tech firms had been seeking to avoid a repeat of the handling of the bloodbath in Christchurch, where the assailant posted a manifesto online and then livestreamed his killing of 51 worshippers.

Twitch, which is best known for livestreamed video gaming, was acquired by Amazon in 2014 for $970 million and has an estimated 15 million daily active users.

It said the account used by the gunman was created “about two months prior to streaming the shooting” and had only been used to attempt to stream once before.

After the Christchurch massacre, Facebook and others pointed to the challenges of preventing the sharing of violent content, which is often reposted with minor changes to evade detection by artificial intelligence.

“This video was not surfaced in any recommendations or directories; instead, our investigation suggests that people were coordinating and sharing the video via other online messaging services,” Twitch said.

Facebook also recently announced efforts to work with police in London and elsewhere to get better data on violence to improve its detection algorithms.

“Filtering algorithms so far have not been very good at detecting violence on livestream,” noted Jillian Peterson, a professor of criminology at Hamline University, who suggested that social media firms may end up being “held accountable” for their role in spreading violent and hateful content.

Research by Peterson and others suggests shooters may be affected by a contagion effect when they see similar attacks.

“In many ways, these shootings are performances, meant for all of us to watch,” Peterson said.

“Social media — and now livestreaming services — have given perpetrators a larger stage and wider audience. Perpetrators are looking to show their grievance to the world, and livestreaming gives them the means to do it.”

Hans-Jakob Schindler of the Counter Extremism Project, a group seeking to curb online violence, said the latest livestream highlights a need for stronger actions against social platforms.

“Online platforms need to step up and stop their services being used and, in turn, parent companies need to hold them accountable,” Schindler said.

“Amazon is just as much to blame as Twitch for allowing this stream online. This tragic incident demonstrates one more time that a self-regulatory approach is not effective enough and sadly highlights the need for stronger regulation of the tech sector.”