U.S. TikTok users, once drawn to the app as a haven for free speech, are voicing growing concerns about censorship since the platform came back online. The app, owned by China's ByteDance, was restored under an executive order from President Donald Trump following its temporary shutdown over national security concerns. Since then, however, users have noted significant changes in how content is moderated.
The shutdown stemmed from a new law, passed during the Biden administration with bipartisan support, requiring TikTok to be sold to a U.S. buyer. Trump, promising to address the ban, hinted at potential buyers, including allies with close ties to his administration. Despite TikTok's assurances that its policies and algorithms have remained unchanged, users have reported noticeable differences in the app's functionality and moderation practices.
Many users have observed increased content moderation, including limited search results, warnings about misinformation, and prompts encouraging users to verify their sources. Content creators report that posts and comments previously allowed are now flagged or removed. For instance, phrases like "Free Palestine" and "Free Luigi"—referencing controversial topics—have reportedly been flagged or struck from the platform. TikTok stated it does not allow content that promotes violent or hateful individuals.
Pat Loller, a comedian and veteran with 1.3 million followers, said his content was limited after he posted a satirical video about Elon Musk. The video was tagged as misinformation and could be shared with only one chat at a time. Loller expressed frustration, saying such restrictions were unprecedented in his experience with the platform.
Some users, like Lisa Cline, have faced repeated issues attempting to upload videos critical of Trump. Cline said her posts were rejected multiple times on TikTok before she successfully uploaded them to other platforms like Meta’s Threads. Similarly, political and social commentator Danisha Carter, whose account had 2 million followers, reported that her account was permanently suspended after the app's shutdown. Carter described the decision as politically targeted, noting her final livestream criticized tech executives’ influence over U.S. politics.
The increased moderation has also affected non-political content. Ada "Mila" Ortiz, a data analyst and content creator, reported receiving strikes for seemingly innocuous comments. Ortiz, concerned about being permanently banned, chose to delete 15 videos that supported Vice President Kamala Harris and criticized Trump. She described the actions as sudden and random, leaving her feeling targeted.
These changes have sparked fears among TikTok users that the platform's moderation practices are influenced by political and ideological factors. Many creators worry that the app, once known for its diverse and open environment, is becoming increasingly restrictive, targeting individuals based on identity or past content.
The controversy highlights the tension between TikTok’s efforts to comply with evolving regulations and users’ desire for free expression. While Trump has positioned himself as a defender of free speech with his executive order, critics argue that the platform’s new moderation policies raise questions about the limits of freedom on digital platforms.