Twitch appears to be testing a program that automatically rates streamers based on a handful of factors – including age, suspension history, and partnership status – in order to match them with advertisers. It’s called the Brand Safety Score, and it was discovered in Twitch’s internal API by cybersecurity student Daylam Tayari, who posted footage of the changelog on Twitter.
A Twitch spokesperson stopped short of confirming the existence of the Brand Safety Score to Engadget, but made the following statement:
“We’re exploring ways to improve the Twitch experience for viewers and creators, including efforts to better match the right ads to the right communities. User privacy is essential on Twitch, and as we refine this process, we will not pursue plans that compromise this priority. Nothing has been launched yet, no personal information has been shared, and we will keep our community informed of any updates along the way.”
Twitch has added an automated Brand Safety Score which assesses how advertiser-friendly each streamer is based on things like chat behavior, ban history, manual reviews by Twitch staff, games played, age, automod settings and more (see below).
– Daylam ‘tayari’ Tayari (@tayariCS) March 9, 2021
According to Tayari, the Brand Safety Score rates streamers based on their age (whether over 18 or 21), suspension history, standing with Twitch, partnership status, whether they use automod and at what level, whether a stream is set to mature, and the ESRB ratings of the games they play. There is also a field for a manual review by a Twitch employee.
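The leak lists the inputs but says nothing about how Twitch actually weighs or combines them. Purely as an illustration of what a score built from these factors could look like, here is a toy sketch in Python – every name, weight, and threshold below is invented, not taken from Twitch’s API:

```python
# Hypothetical illustration only: the leaked changelog names these factors
# but does not reveal any weights or formula. All weights here are invented.
from dataclasses import dataclass

@dataclass
class StreamerProfile:
    age: int
    suspensions: int      # number of past suspensions
    is_partner: bool
    automod_level: int    # 0 (off) through 4 (strictest), per Twitch's automod tiers
    mature_flag: bool     # stream marked as mature
    max_esrb_rating: str  # highest ESRB rating among games played, e.g. "E", "T", "M"

# Invented penalty table keyed on ESRB ratings
ESRB_PENALTY = {"E": 0, "E10+": 1, "T": 2, "M": 4, "AO": 8}

def brand_safety_score(p: StreamerProfile) -> float:
    """Toy weighted score: higher means more advertiser-friendly (0-100)."""
    score = 50.0
    if p.age >= 21:
        score += 10
    elif p.age >= 18:
        score += 5
    score -= 5 * p.suspensions
    if p.is_partner:
        score += 10
    score += 2 * p.automod_level
    if p.mature_flag:
        score -= 10
    score -= 3 * ESRB_PENALTY.get(p.max_esrb_rating, 4)
    return max(0.0, min(100.0, score))
```

Under this invented weighting, a family-friendly partnered streamer scores well above a frequently suspended, mature-flagged one – which is the kind of ordering an advertiser-matching system would presumably care about, whatever Twitch’s real formula is.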
As described, the Brand Safety Score resembles the ad rating systems already used by sites like YouTube and Twitter, or even passenger ratings on ridesharing apps. It should help advertisers sort through the sea of streamers, and it could affect Twitch’s Bounty Board, where advertisers offer specific gigs to a handful of selected partners and affiliates.
Knowing which metrics Twitch is tracking could help streamers stay at the top of the list, although there is no guarantee the company will ever make its scoring algorithm public – unless a curious researcher takes another dive.