
TikTok’s QAnon ban has been ‘buggy’ – TechCrunch

Summary (AI Generated)

TikTok has been cracking down on QAnon-related content, in line with similar moves by other major social media companies, including Facebook and YouTube, aimed at reducing the spread of the baseless conspiracy theory across their respective platforms.

According to a report by NPR this weekend, TikTok quietly banned several hashtags associated with the QAnon conspiracy, and says it will also delete the accounts of users who promote QAnon content.

TikTok had initially focused on reducing discoverability as an immediate step by blocking search results while it investigated, with help from partners, how such content manifested on its platform.

In August, TikTok also set a policy to remove content and ban accounts, we’re told.

Despite the policies, a report this month by Media Matters documented that TikTok was still hosting at least 14 QAnon-affiliated hashtags with over 488 million collective views.

Today, a number of QAnon-related hashtags, like #QAnon, #TheStormIsComing, #Trump2Q2Q and others, return no results in TikTok’s search engine.

Instead of just showing users a blank page when these terms are searched, TikTok displays a message that explains how some phrases can be associated with behavior or content that violates TikTok’s Community Guidelines, and offers a link to that resource.
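To make that mechanic concrete, here is a minimal Python sketch of a search handler that intercepts blocklisted terms and returns a guidelines notice instead of a blank page. The term list, function names and response shape are invented for illustration; this is only the pattern the behavior suggests, not TikTok's actual code.

# Hypothetical sketch of a search-term block with a guidelines notice.
BLOCKED_SEARCH_TERMS = {"qanon", "thestormiscoming", "trump2q2q"}  # invented examples
GUIDELINES_URL = "https://www.tiktok.com/community-guidelines"

def run_search(term):
    # Stand-in for the real search index lookup.
    return []

def handle_search(query):
    normalized = query.strip().lower().lstrip("#")
    if normalized in BLOCKED_SEARCH_TERMS:
        # Rather than an empty page, explain why nothing is shown and link out.
        return {
            "results": [],
            "notice": ("Some phrases may be associated with behavior or "
                       "content that violates our Community Guidelines."),
            "link": GUIDELINES_URL,
        }
    return {"results": run_search(normalized)}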

While TikTok’s ban did tackle many of the top search results and tags associated with the conspiracy, we found it was overlooking others, like Pizzagate and WWG1WGA.

In tests this afternoon, these terms and many others still returned a significant amount of content.

We reached out to TikTok today to ask why searches for popular QAnon terms like “pizzagate” and “WWG1WGA” were still returning results, even though their hashtags were banned.

For example, if you just searched for “pizzagate,” TikTok offered a long list of videos to scroll through, though you couldn’t go directly to its hashtag.

This was not the case for the other banned hashtags (like #QAnon) at the time of our tests.

Other QAnon-associated hashtags were also not subject to a full ban, including WWG1WGA, WGA, ThesePeopleAreSick, cannibalclub, hollyweird, and many others often used to circulate QAnon conspiracies.

When we searched these terms, we found more long lists of QAnon-related videos to scroll through.
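One plausible explanation for this pattern, offered here as an assumption rather than anything TikTok has confirmed, is an exact-match blocklist: banning the literal string “qanon” does nothing for variants like “wwg1wga” or “pizzagate” until each one is added by hand. A short Python sketch of that failure mode, with an invented blocklist:

# Invented exact-match blocklist; real moderation systems are more complex.
banned_terms = {"qanon", "thestormiscoming", "trump2q2q"}

def is_blocked(term):
    return term.lower().lstrip("#") in banned_terms

for query in ["#QAnon", "pizzagate", "WWG1WGA", "cannibalclub"]:
    print(query, "blocked" if is_blocked(query) else "still searchable")
# Only #QAnon is blocked; every variant slips through until someone adds
# it to the set, which is why enforcement tends to lag new coded terms.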

“Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform. We’ve also taken significant steps to make this content harder to find across search and hashtags by redirecting associated terms to our Community Guidelines,” a TikTok spokesperson said.

TikTok also said the search term blocking issue must have been a bug, because it’s now working properly.

We asked Media Matters whether it could still praise TikTok’s actions to ban QAnon content, given what, at the time, had appeared to be a loophole in the QAnon ban.

Though a search for their username won’t return results now that the ban is no longer “buggy,” you can still go directly to these users’ profile pages via their profile URL on the web.

We tried this on a number of profiles that had published QAnon content or used banned terms in their videos’ hashtags and descriptions.
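This profile loophole is the same class of gap at a different layer: hiding an account from the search index is a separate code path from serving its profile page at a direct URL, and suppressing one does not automatically suppress the other. A hedged sketch with invented account data, not TikTok's implementation:

# Invented data; illustrates two diverging code paths.
accounts = {"qexample123": {"hidden_from_search": True}}

def search_users(query):
    # Search path: accounts flagged as hidden are filtered out of results.
    return [name for name, meta in accounts.items()
            if query.lower() in name and not meta["hidden_from_search"]]

def get_profile(username):
    # Direct-URL path (e.g. tiktok.com/@username): no search filter runs here,
    # so the page still resolves unless the account itself is deleted.
    return accounts.get(username)

print(search_users("qexample"))    # [] -- hidden from search
print(get_profile("qexample123"))  # still reachable by direct URL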

These examples of “bugs” or just oversights indicate how difficult it is to enforce content bans across social media platforms.

