TikTok removing accounts of users who share QAnon-related content - The Verge

TikTok has been removing the accounts of users who share QAnon-related content on the platform, NPR reported, part of a policy the video-sharing platform says has been in effect since August. While TikTok initially focused on reducing discoverability of such content — banning QAnon-related hashtags, for instance — its policy now includes removing the content, banning accounts, and redirecting searches and hashtags to display its community guidelines.

“Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform,” a spokesperson said in an email to The Verge. The company has taken steps to make QAnon-related content harder to find in search. “We continually update our safeguards with misspellings and new phrases as we work to keep TikTok a safe and authentic place for our community.”

In July, TikTok started blocking several hashtags related to the QAnon conspiracy theory, including “QAnon,” “QAnonTruth,” and the related phrase “Out of Shadows.” But the videos themselves remained visible and could still appear in For You recommendations or in users’ feeds, the BBC reported.

Other social platforms have cracked down on QAnon content as well. Facebook said earlier this month it would ban content related to QAnon, which it termed a “militarized social movement,” from Facebook and Instagram. Facebook had removed some groups and pages promoting QAnon back in April, saying they were engaged in “coordinated inauthentic behavior.” Facebook users can still post QAnon content to their individual profiles, however.

Twitter banned thousands of QAnon-related accounts and links over the summer, and Reddit banned the QAnon subreddit r/GreatAwakening for violating its rules against “inciting violence, harassment, and the dissemination of personal information.” Even exercise platform Peloton has removed hashtags related to the conspiracy theory from its online classes.

And while YouTube isn’t banning QAnon content outright, the platform said last week it would remove “conspiracy theory content used to justify real-world violence.”

QAnon is a false conspiracy theory that claims, among other things, that President Trump is secretly planning to arrest high-profile Democratic politicians and celebrities for pedophilia or cannibalism. The FBI has labeled the conspiracy theory a potential domestic terrorist threat.
