Toronto Star Referrer

Mass attacks highlight big problems with Big Tech: livestreaming

Experts call for broader discussion around format, including whether it should exist at all

BARBARA ORTUTAY, HALELUYA HADERO AND MATT O’BRIEN

These days, mass shooters, like the suspect now in custody for the Buffalo, N.Y., supermarket attack, don’t stop at planning out their brutal attacks. They also create marketing plans, arranging to livestream their massacres on social platforms in hopes of fomenting more violence.

Sites like Twitter, Facebook and now the game-streaming platform Twitch have learned painful lessons from dealing with the violent videos that often accompany such shootings. But experts are calling for a broader discussion around livestreams, including whether they should exist at all, since once such videos go online, they’re almost impossible to erase completely.

The self-described white supremacist gunman who police say killed 10 people, all of them Black, at a Buffalo supermarket Saturday had mounted a GoPro camera to his helmet to stream his assault live on Twitch, the video game streaming platform used by another shooter in 2019 who killed two people at a synagogue in Halle, Germany.

He had previously outlined his plan in a detailed but rambling set of online diary entries that were apparently posted publicly ahead of the attack.

He decided against streaming on Facebook, as yet another mass shooter did when he killed 51 people at two mosques in Christchurch, New Zealand, three years ago.

By most accounts, the platforms responded more quickly to halt the spread of the Buffalo video than they did after the 2019 Christchurch shooting, said Megan Squire, a senior fellow and technology expert at the Southern Poverty Law Center.

Another Twitch user watching the live video likely flagged it to the attention of Twitch’s content moderators, she said, which would have helped Twitch pull down the stream less than two minutes after the first gunshots, according to a company spokesperson.

In 2019, the Christchurch shooting was streamed live on Facebook for 17 minutes and quickly spread to other platforms. This time, the platforms generally seemed to co-ordinate better, particularly by sharing digital “signatures” of the video used to detect and remove copies.

But platform algorithms can have a harder time identifying a copycat video if someone has edited it. That’s created problems, such as when some internet forum users remade the Buffalo video with twisted attempts at humour. Tech companies would have needed to use “more fancy algorithms” to detect those partial matches, Squire said.

Twitch has more than 2.5 million viewers at any given moment; roughly eight million content creators stream video on the platform each month, according to the company. The site uses a combination of user reports, algorithms and moderators to detect and remove any violence that occurs on the platform. The company said that it quickly removed the gunman’s stream, but hasn’t shared many details about what happened on Saturday.

A Twitch spokesperson said the company shared the livestream with the Global Internet Forum to Counter Terrorism, a non-profit group set up by tech companies to help others monitor their own platforms for rebroadcasts. But clips from the video still made their way to other platforms, including the site Streamable. A spokesperson for Hopin, the company that owns Streamable, said Monday it’s working to remove the videos and terminate the accounts of those who uploaded them.

Looking ahead, platforms may face future moderation complications from a Texas law — reinstated by an appellate court last week — that bans big social media companies from “censoring” users’ viewpoints. The shooter “had a very specific viewpoint” and the law is unclear enough to create a risk for platforms that moderate people like him, said Jeff Kosseff, an associate professor of cybersecurity law at the U.S. Naval Academy.

Alexa Koenig, executive director of the Human Rights Centre at the University of California, Berkeley, said there’s been a shift in how tech companies are responding to such events. In particular, Koenig said, co-ordination between the companies to create fingerprint repositories for extremist videos so they can’t be re-uploaded to other platforms “has been an incredibly important development.”

A Twitch spokesperson said the company will review how it responded to the gunman’s livestream.

Experts suggest that sites such as Twitch could exercise more control over who can livestream and when — for instance, by building in delays or whitelisting valid users while banning rules violators.

Another option, of course, would be to end livestreaming altogether. But that’s almost impossible to imagine given how much tech companies rely on livestreams to attract and keep users engaged in order to bring in money.

Free speech, Koenig said, is often the reason tech platforms give for allowing this form of technology — beyond the unspoken profit component. But that should be balanced “with rights to privacy and some of the other issues that arise in this instance,” Koenig said.
BUSINESS | MARKETPLACE

2022-05-18T07:00:00.0000000Z

https://thestarepaper.pressreader.com/article/282136410016609

Toronto Star Newspapers Limited