TikTok Chief Executive Shou Zi Chew did not have a successful appearance before the House Energy and Commerce Committee on Thursday.
He failed to assure skeptical members of Congress that his enormously popular social media platform can insulate itself from Chinese government interference. Nor did he convince them that TikTok has done enough to address misinformation, protect children from harmful material or remove content that violates its code of conduct. It didn’t help his cause when Republican Rep. Kat Cammack of Florida played a video showing an animated handgun firing in a post threatening the committee and its chairwoman. The video had been on TikTok for 41 days and was removed only during the hearing.
Chew’s company was lambasted for more than five hours, a show of rare bipartisan consensus that something needs to be done about TikTok, but exactly what Congress or the Biden administration can or should do remains unclear.
It also became apparent that while TikTok is currently the target of federal inquiry, primarily because of growing anxiety over China’s power and influence, the concerns over user privacy, misinformation and harm to children are not unique to TikTok.
Harmful practices are baked into the business models of social media platforms, including Instagram, Snapchat, Facebook and YouTube. An increasing number of state legislatures and lawsuits are attempting to force companies to take more responsibility for building safer products. Congress too should be wielding its regulatory authority more broadly to protect consumers, not just TikTok users.
Indeed, TikTok is similar to other social media apps that vacuum up personal data, wrote Ron Deibert, director of the Citizen Lab at the University of Toronto, which analyzed the TikTok app. He added that “most social media apps are unacceptably invasive by design, treat users as raw material for personal data surveillance, and fall short on transparency about their data sharing practices,” which is why comprehensive privacy legislation is needed.
Despite several years of debate, Congress hasn’t been able to move a bill that would protect data privacy on the internet. Lawmakers got close last year with the American Data Privacy and Protection Act, but there were questions over whether the bill would override California’s strong privacy law — which would be a mistake. House members said Thursday that they are trying again this year to pass the bill, which is good, but they should be catching up to California, not clawing back the state’s leading-edge privacy protections.
The immediate question before federal lawmakers is how to address the national security concerns posed by TikTok’s ties to China. The app was created by Chinese internet technology company ByteDance. Federal agencies have raised alarm because Chinese law requires that tech companies allow government access to user data. There’s also concern that with the platform’s reach — it has 150 million users, or nearly half the U.S. population — and its powerful algorithm, TikTok could be used as a tool to disseminate propaganda or disinformation.
The Biden administration has threatened to ban TikTok unless the app’s Chinese owners sell their stakes.
Chew tried to make the case that TikTok is a private company independent of the Chinese government and could build a firewall to ensure there is no foreign interference. But his argument was undercut by an announcement Thursday from the Chinese Commerce Ministry that it would oppose a forced sale. China considers technology a national security issue and has the right under Chinese law to block its export.
If a sale is off the table, the Biden administration has limited options. An outright ban would raise significant technical and legal issues, including whether cutting off such a widely used platform for speech would violate the 1st Amendment. And what is the U.S. going to do after TikTok? Shut down every popular Chinese- or foreign-owned app?
Besides, simply banning TikTok doesn’t address the larger problem. Regulations and policies that protect Americans’ online privacy and limit the potential for harm to users, young and old, are long overdue.