<p>Just weeks after tech platforms announced a broad effort to curb the spread of violent content, a video of Wednesday's deadly shooting in the German city of Halle was posted online, where it may have been viewed by millions.</p>
<p>The gunman posted a video of the attack on Twitch, the live-streaming gaming platform owned by Amazon, the company acknowledged.</p>
<p>The video of the shooting at a synagogue and a Turkish restaurant included a "manifesto" with racist and anti-Semitic commentary.</p>
<p>"Twitch has a zero-tolerance policy against hateful conduct, and any act of violence is taken extremely seriously," a Twitch spokesperson said.</p>
<p>"We worked with urgency to remove this content and will permanently suspend any accounts found to be posting or reposting content of this abhorrent act."</p>
<p>The news comes after the deadly New Zealand mosque shooting live-streamed on Facebook in March, which prompted governments to press social networks to prevent the airing of violent acts on their platforms.</p>
<p>On September 23, Facebook announced additional efforts at the United Nations during a meeting with New Zealand's Prime Minister Jacinda Ardern, who has taken up the cause of fighting online extremism.</p>
<p>Also last month, Amazon announced it was joining the Global Internet Forum to Counter Terrorism, an alliance tasked with tackling the most dangerous content on social media.</p>
<p>The tech firms had been seeking to avoid a repeat of the handling of the bloodbath in Christchurch, where the assailant posted a manifesto online and then live-streamed his killing of 51 worshippers.</p>
<p>Twitch, which has gained a following for live-streaming gaming, was acquired by Amazon in 2014 for $970 million and has an estimated 15 million daily active users.</p>
<p>It was not immediately clear how long the video remained online or how many people saw it. But segments of the video were reposted on Twitter and other social platforms.</p>
<p>After the Christchurch massacre, Facebook and others pointed out the challenges of preventing the sharing of violent content, which is often reposted with minor changes to avoid detection by artificial intelligence.</p>
<p>Facebook also recently announced efforts to work with police in London and elsewhere to get better data on violence to improve its detection algorithms.</p>
<p>"Filtering algorithms so far have not been very good at detecting violence on live streams," noted Jillian Peterson, a professor of criminology at Hamline University, who suggested that social media firms may end up being "held accountable" for their role in spreading violent and hateful content.</p>
<p>Research by Peterson and others suggests shooters may be affected by contagion when they see similar attacks.</p>
<p>"In many ways, these shootings are performances, meant for all of us to watch," Peterson said.</p>
<p>"Social media -- and now live-streaming services -- have given perpetrators a larger stage and wider audience. Perpetrators are looking to show their grievance to the world, and live streaming gives them the means to do it."</p>
<p>Hans-Jakob Schindler of the Counter Extremism Project, a group seeking to curb online violence, said the latest live stream highlights a need for stronger actions against social platforms.</p>
<p>"Online platforms need to step up and stop their services being used, and in turn, parent companies need to hold them accountable," Schindler said.</p>
<p>"Amazon is just as much to blame as Twitch for allowing this stream online. This tragic incident demonstrates one more time that a self-regulatory approach is not effective enough and sadly highlights the need for stronger regulation of the tech sector."</p>