TikTok Tightens Crackdown On QAnon, Will Ban Accounts That Promote Disinformation


TikTok says it is banning all accounts that share content related to the QAnon conspiracy theory, hardening its previous policy on the far-right movement.

Kiichiro Sato/AP



Earlier this month, Media Matters identified more than a dozen hashtags TikTokkers used to spread QAnon conspiracy theories about President Trump's positive coronavirus test, false beliefs about Democratic presidential candidate Joe Biden and videos questioning the reality of the pandemic.

"We're talking about hundreds of millions of video views just for a limited segment of QAnon communities that we identified," Carusone said.

TikTok, which has 100 million monthly active users in the U.S., announced its expanded ban on QAnon quietly in a statement to Media Matters, where it garnered little attention. A TikTok spokesperson confirmed the policy to NPR on Saturday.

    Hany Farid, a UC Berkeley computer science professor who is a member of TikTok’s committee of outside content moderation experts, said there is tension within social networks over how to respond to misinformation without also amplifying the underlying theories.


"When you ban it, you give it credibility. You give it attention," Farid told NPR.

"But the movement got big enough and dangerous enough that people were looking at the landscape and saying, 'Yeah, this is completely out of control,'" he said. "Were they slow to do it? Probably. But platforms get criticized when they act too quickly. So there is a dilemma there."

    TikTok uses a mix of artificial intelligence and thousands of human content moderators to try to curb troubling content. The Chinese-owned app is best-known for viral dance challenges and comedic performances.


According to TikTok's Community Guidelines, misinformation that "causes harm to individuals, our community or the larger public" is prohibited on the site, including medical misinformation, which QAnon accounts have spread by pushing false claims about the deadly coronavirus.

    Carusone of Media Matters said misinformation accounts on TikTok have been clever about avoiding detection by hijacking otherwise-benign hashtags, or creating new hashtags that are written slightly in code, among other strategies to evade efforts to curb the content.

"The test of this policy will be how much it affects the creation and germination of new QAnon content on TikTok," Carusone said. "If you know your video is going to be eliminated before it has a chance to spread, you're less likely to spend time polluting the TikTok pool."

    The future of TikTok in the U.S. remains uncertain. A federal judge last month temporarily halted a Trump administration attempt to shut down the app. But a separate order from the White House for TikTok to divest from its Beijing owner or cease operations remains in place, with a deadline of Nov. 12 for TikTok to find an American buyer or close down its U.S. operations.

Trump officials cite national security concerns with TikTok's China-based corporate owner, ByteDance, but TikTok has long dismissed the effort as a crusade to score political points. The company says U.S. user data is controlled by an American-led team and that the Chinese government has never requested access to the data.
