Students may be approaching a point where they must wait until they’re off campus before viewing the latest trends on TikTok, as the popular social media app is increasingly being viewed as a dangerous tool for Americans.
While the University of Louisiana at Lafayette has yet to determine whether it will restrict access to the app on its Wi-Fi network, many universities, including Texas A&M University, Auburn University and the University of Oklahoma, have removed the app from government-issued devices or advised students to remove it from personal devices to protect their data.
Several states have also issued policies banning the downloading and use of TikTok on state and department-issued devices. The main concern among those wanting to restrict access to the app is that ByteDance, TikTok’s parent company, could store data from the app’s users and hand that data over to the Chinese government.
“The problem with ByteDance and TikTok is their corporation has to have links to the central Communist Party of China, and on the ByteDance board of directors there’s a Communist secretary, and I’ve seen how those guys work personally, up close,” Dr. William Davie said.
Before becoming the broadcasting sequence coordinator and a Board of Regents Support Fund (BORSF) endowed professor at UL Lafayette, Davie spent a year teaching at 10 universities in China, where he witnessed the country’s government firsthand.
“Our greatest worry is, of course, misinformation and disinformation,” Davie said.
Davie said an argument could be made that the disinformation campaigns against recent presidential candidates were effective because the U.S. wasn’t fully aware of Russian and Chinese involvement on social media at the time, whereas the country’s intelligentsia is more aware of it today.
For any executive order involving cybersecurity to have an effect on TikTok, Davie said the app would have to be proven a threat under the imminent lawless action test.
“You can’t just say, ‘Oh, we think it is because we know those communists are fond of concealment,’ which they are, ‘and we know they’re fond of collecting data,’ which we know they are,” Davie said. “The court’s going to ask, when the First Amendment gets involved, in the private world, they’re going to say, ‘You’ve got to have true evidence of an imminent threat.’”
There are other reasons why U.S. officials view China as a potential imminent threat. On Feb. 4, the U.S. shot down a large Chinese balloon that had drifted across the country from west to east, according to BBC. The equipment was allegedly collecting sensitive U.S. information.
ByteDance also recently admitted to using TikTok to track the IP addresses of journalists in an attempt to identify employees who might have been leaking confidential information, according to CNN.
Along with many other social media sites, TikTok could also be affected by the ongoing court case Gonzalez v. Google LLC, which questions whether Section 230(c)(1) of the Communications Decency Act shields platforms from liability for their interactive recommendation algorithms.
According to the Bipartisan Policy Center, Gonzalez v. Google LLC stems from an ISIS terrorist attack in 2015. Reynaldo Gonzalez, father of Nohemi Gonzalez, a victim of the attack, filed an action against Google, Twitter and Facebook, claiming that their algorithms aided ISIS in recruiting new members and promoting threats and attacks.
Davie doesn’t feel that Section 230 of the Communications Decency Act should protect algorithms. Instead, he believes some restrictions should be placed on platforms like TikTok.
“Do we have to rip apart Section 230? I don’t think so. We just have to build in some humanity on the algorithms,” Davie said. “Section 230 prevents [tech companies] from being held accountable.”
While students shouldn’t be overly worried about using the app, Davie urges them to be cautious of misinformation on TikTok and other platforms and to practice good digital literacy.
For now, students will have to wait and see what happens with TikTok.