Reddit posted its 2022 transparency report Wednesday, showing in part that the site pressed the ‘ban’ button hard after launching a crackdown on non-consensual intimate media (NCIM) following a rules change in March of last year. NCIM covers any kind of intimate image posted without a person’s consent or awareness, such as revenge porn or voyeurism.
Reddit had previously banned any kind of porn posted “without permission”. In March 2022, the company changed the term “unintentional pornography” to “non-consensual intimate media” and marked it in the same policy category as posting someone else’s personal or confidential information.
After the change, once the banhammer started swinging, the site saw a 473% increase in subreddit removals and a 244% increase in account bans for violating non-consensual content rules. Reddit said this reflected the NCIM policy change and “increased efficiency in detecting and removing” this content.
Meanwhile, just under 174,000 non-consensual intimate media posts and comments were removed from Reddit in 2022, compared to more than 187,000 instances of “unintentional pornography” in 2021. So while less content overall was removed from the site, Reddit has been much more willing to completely remove subreddits and ban accounts. There were 368 subreddit removals for NCIM in 2021, compared to 2,109 in 2022. More than 54,000 accounts were permanently banned in 2022 for sharing non-consensual material, compared to 15,847 in 2021.
In total, the site said it deleted 316.7 million pieces of content in 2022, compared to nearly 297 million in 2021. Permabans were almost five times higher than the year before, with most suspensions due to people creating new accounts to circumvent earlier bans. NCIM remains a small fraction of the reasons for bans: by comparison, more than 134,000 accounts were permanently suspended for harassment and nearly 80,000 for spreading “hate content”. Most bans are issued by AutoModerator or Reddit’s own bots rather than manually by site moderators.
Reddit has already faced criticism for allegedly allowing child sexual abuse material (CSAM) to sit dormant on the platform. The site’s transparency report says it uses automated tools as well as human reviewers to find CSAM on the platform. In 2022, Reddit said it removed 80,888 pieces of CSAM content, most of them in the second half of the year. In 2021, the National Center for Missing & Exploited Children said it received just over 10,000 reports of CSAM on Reddit. The company said it filed a total of 52,592 reports with NCMEC in 2022.
Along with the new report, Reddit introduced a new Transparency Center, which includes past transparency reports as well as other Reddit policy information.
Reddit said it received 51% more requests from governments and law enforcement to remove pages and posts compared to 2021, although almost 60% of those were requests to remove non-consensual pornography. The Russian agency Roskomnadzor, which oversees the country’s media, tried to get Reddit to delete several posts commenting on the Russian invasion of Ukraine. Reddit said that of the 42 pieces of content flagged, 32 did not violate the site’s rules.
Meanwhile, India hit Reddit with nearly 50 takedown requests covering 276 pieces of content or communities. Many of these requests asked Reddit both to remove content and to share user information. Reddit said it removed 92% of this content upon request. The site has also geo-restricted several subreddits containing non-NCIM pornography at the behest of India and Pakistan.