Opinion: It’s time to hold social media platforms to account: We don’t need another TikTok video advocating or suggesting school violence


TikTok: a place where we can watch cooking videos and ridiculous dance moves, and, apparently, a place where people can post unverified threats of school violence. Last month, dozens of school districts across the country announced closures in response to a flurry of anonymous TikTok videos referencing shooting and bomb threats.

This is extremely dangerous for our children, schools, parents, teachers and society. It is past time to hold social media companies like TikTok accountable for the content posted on their platforms.

TikTok has become one of the most popular social media platforms of this generation. Launched in 2016 by the Chinese technology company ByteDance, the app allows users to create, watch and share short videos. As of late 2021, it had nearly one billion active users. That’s roughly 1 in 7 people in the world. That’s a lot of people.

With a platform this large, moderating what gets posted is difficult. Currently, TikTok has a U.S.-based safety team, and uploaded content passes through an automated system that checks for policy violations before being reviewed by a member of that team and published. TikTok also recently began using a program that can automatically remove videos that appear to violate its guidelines.

This program is clearly insufficient.

In mid-December, school districts from Texas to Michigan issued warnings, canceled classes and increased security in response to a spreading TikTok trend warning of impending bombings or shootings. Even the Austin ISD Police Department ramped up security and monitored the national social media trend in late December in an effort to head off potential harm. Clearly, threats of school violence have slipped through these safety programs, and this is not the first time.

TikTok and other social media giants like Facebook and Twitter have come under fire for hosting content that harms children and young adults. In 2021, teachers had to ask TikTok to step in when the “slap a teacher” challenge went viral. Two years ago, roughly 4,000 people watched a mass shooting livestreamed on Facebook; the footage quickly spread across the internet and was reposted countless times.

In response to such dangerous content, the social media giants claim they are continually strengthening their safety measures. I’m not convinced.

A TikTok spokesperson responded to the alleged threats of school violence, tweeting, “…we are working with law enforcement to look into warnings about potential violence at schools, even though we have not found evidence of such threats originating or spreading via TikTok.”

These promises ring hollow. As long as dangerous content keeps slipping through the so-called safety measures on these platforms, we will continue to see issues such as school threats emerge. We will continue to see hate speech, incitement to violence, and other toxic content that affects our children and puts our communities at risk.

As such, it is time to hold the huge social media companies to account.

Currently, under Section 230 of the Communications Decency Act, platforms such as TikTok, Twitter and Facebook are not treated as publishers and are not legally responsible for content posted by users. The law, enacted in 1996, was designed to shield websites from lawsuits when a user posts something illegal. President Joe Biden has suggested repealing Section 230 entirely, which would be a good start. The administration could act by stripping social media giants of this legal immunity, especially those that refuse to be proactive in removing dangerous content.

Enough is enough. It’s time to legislate that social media networks can be held liable for harm caused by false information, harmful content, and incitement to violence shared on their platforms.

We don’t need another video that advocates or refers to school violence. The clock is ticking, TikTok.

Annika Olson is the associate director of policy research at the Institute for Urban Policy Research and Analysis. Annika is passionate about using research and legislative analysis to inform policies that affect the lives of vulnerable individuals in our society. She holds a dual master’s degree in psychology and public policy from Georgetown University and a BA in psychology from the Commonwealth Honors College at UMass Amherst. Annika previously worked as an AmeriCorps member with at-risk youth in rural New Mexico and Austin.

