Document Analysis NLP AI
FREQ, RAKE, or TF-IDF
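The header names three keyword-extraction methods. As a rough illustration of one of them, here is a minimal, unsmoothed TF-IDF sketch in plain Python; the `tfidf` function and the toy documents are illustrative assumptions, not part of any actual analysis pipeline:

```python
import math
from collections import Counter

def tfidf(docs):
    """Score each term in each document by term frequency times
    inverse document frequency (minimal, unsmoothed variant)."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    # Document frequency: in how many documents does each term appear?
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))
    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        total = len(tokens)
        # tf-idf = (count / doc length) * log(N / df)
        scores.append({t: (c / total) * math.log(n / df[t])
                       for t, c in tf.items()})
    return scores

docs = ["twitch moderation report",
        "twitch safety tools",
        "channel moderation tools"]
keywords = tfidf(docs)
# Terms unique to one document score highest; shared terms score lower.
```

Terms that appear in every document get a score of zero under this variant, which is why production libraries typically smooth the IDF term.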
Summary (AI Generated)
The report breaks safety into multiple tiers, with the broadest level being Twitch’s community guidelines, followed by site-wide moderation, then channel-specific moderation, then the small suite of safety tools that are given to viewers.
The role of channel mods and creators is emphasized in the site’s overall plan, and Twitch brags that 95% of channels now have either human moderators or AutoMod enabled, up from 93% in the first half of last year.
Twitch attributes much of the increase to changes it’s made in the last year, including the rollout of the ModView dashboard in March 2020, and a change in the second half of last year that enabled AutoMod by default for new channels without a human moderator assigned.
While these changes have made life a little easier for creators and their mods, it’s the Twitch-side moderation that most of the site’s users have questions about.
Twitch says it has quadrupled its number of ‘content moderation professionals’ in the last year, though it still doesn’t say how many it currently employs, or whether they are in fact Twitch employees or contractors from an outside company, as other social media platforms are known to use.
The report is just one of a number of measures Twitch introduced last year, after the streaming community complained of abusers using its platform to harass others.
It looks like Twitch will get plenty of feedback on this report, so we’ll have to wait and see what changes are made in time for 2021’s half-year report.