TikTok CEO Shou Zi Chew has wrapped up his first public hearing before the United States Congress, where he sought to allay concerns over the app’s ties to the Chinese government and its alleged failure to curb “harmful” content.
Thursday’s hearing, which lasted more than five hours, was held before the House Energy and Commerce Committee.
Both Democrats and Republicans on the committee expressed skepticism over the company’s independence from the Chinese government, highlighting the increasing bipartisan support for action against TikTok.
Chew portrayed the app, which has 150 million monthly users in the United States, as “a place where individuals can be creative and curious,” and said the company is taking measures on data protection and transparency that go beyond industry standards.
Both Republican committee chair Cathy McMorris Rodgers and Democratic ranking member Frank Pallone cited TikTok’s Chinese ownership as a source of national security concerns.
Pallone referred to ByteDance, the Chinese company that owns TikTok, as a “Beijing communist parent company.”
Chew has repeatedly said that ByteDance “is not owned or controlled by the Chinese government” and that he has seen “no proof” that the Chinese government has accessed or requested access to US user data.
He said that TikTok “does not promote or censor anything at the government’s request.”
But numerous lawmakers seized on Chew’s acknowledgment that Chinese engineers may still have access to some US data because the company relies on “global interoperability.”
Chew disputed claims that TikTok posed a threat to national security. “I believe that many of the concerns mentioned are hypothetical and speculative,” he stated.
Chew sought to assuage lawmakers’ concerns over the security of US user data by presenting a plan that would ensure “American data is held on American territory by an American corporation and supervised by American individuals.”
On broader social media concerns, lawmakers questioned TikTok’s ability to moderate disinformation, harmful messages, and age-inappropriate content. Some played TikTok videos that encouraged viewers to self-harm or commit suicide.
Chew responded that TikTok employs 40,000 reviewers and an algorithm to identify problematic content. He added that the company would use “third-party validators” to review its algorithms and would give researchers access to “analyze and monitor our content ecosystem.”