Rumble sends viewers tumbling into disinformation


“I don’t really expect things to be what they were,” Sarah says. “There is no turning back.” Sarah’s mother is a QAnon believer who discovered the conspiracy theory on YouTube. Now that YouTube has taken steps to regulate disinformation and conspiracy theories, a new site, Rumble, has taken its place. Sarah feels that the platform has taken her mother away from her.

Rumble is “just the worst things you can do on YouTube amplified, like 100 percent,” Sarah says. (Her name has been changed to protect her identity.) Earlier this year, her mother asked for help accessing Rumble when her favorite conservative content creators (from Donald Trump Jr. to “Patriot Streetfighter”) flocked from YouTube to the site. Sarah quickly became one of 150,000 members of the QAnon Casualties support group as her mother tumbled deeper into the rabbit hole of the dangerous conspiracy theory.

Between September 2020 and January 2021, monthly visits to Rumble went from 5 million to 135 million; in April, they were just over 81 million. Sarah’s mother is one of those new Rumble users and, according to Sarah, now refuses to be vaccinated against Covid-19. Explaining her decision, her mother cites the dangerous anti-vax misinformation found in many videos on Rumble, Sarah says.

Rumble says it doesn’t promote disinformation or conspiracy theories, but simply takes a free-speech approach to regulation. However, our research reveals that Rumble has not only allowed disinformation to flourish on its platform but has actively recommended it.

If you search Rumble for “vaccine,” you are three times more likely to be recommended videos containing misinformation about the coronavirus than accurate information. One video by the user TommyBX featuring Carrie Madej, a popular voice in the anti-vax world, alleges, “It’s not just a vaccine; we are connected to artificial intelligence.” Others claim, without evidence, that the vaccine is lethal and has not been properly tested.

Even if you search for an unrelated term such as “law,” our research shows you are as likely as not to be recommended Covid-19 misinformation: about half of the recommended content is misleading. If you search for “election,” you are twice as likely to be recommended disinformation as factual content.

[Chart courtesy of Ellie House, Isabelle Stanley, and Alice Wright; created with Datawrapper]

The data underlying these findings were gathered over five days in February 2021. Using an adaptation of code first developed by Guillaume Chaslot (a former Google employee who worked on the YouTube algorithm), we collected information on the videos Rumble recommended for five neutral search terms: “democracy,” “election,” “law,” “coronavirus,” and “vaccine.” The code was run five times for each word, on different days and at different times, so that the data reflected the consistent behavior of Rumble’s recommendation algorithm rather than a one-off snapshot.
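For illustration only, a minimal sketch of this collection step might look like the following Python. The search URL path and the CSS selectors are assumptions made for the example; they are not taken from Rumble’s documented interface, and the study’s actual adaptation of Chaslot’s code is not reproduced here.

    # Minimal sketch of the collection step described above.
    # ASSUMPTIONS: the /search/video path and the CSS selectors are
    # hypothetical placeholders, not Rumble's documented interface.
    import requests
    from bs4 import BeautifulSoup

    SEARCH_TERMS = ["democracy", "election", "law", "coronavirus", "vaccine"]

    def fetch_search_results(term):
        """Return video links from a Rumble search results page."""
        resp = requests.get("https://rumble.com/search/video",
                            params={"q": term}, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # ASSUMPTION: 'a.video-item--a' stands in for the real result-link markup.
        return [a["href"] for a in soup.select("a.video-item--a") if a.get("href")]

    def fetch_recommendations(video_url):
        """Return links shown in a video page's recommendation rail."""
        resp = requests.get(video_url, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # ASSUMPTION: 'li.mediaList-item a' stands in for the real sidebar markup.
        return [a["href"] for a in soup.select("li.mediaList-item a") if a.get("href")]

    if __name__ == "__main__":
        for term in SEARCH_TERMS:
            for path in fetch_search_results(term)[:10]:  # top results only
                recs = fetch_recommendations("https://rumble.com" + path)
                print(term, path, len(recs))

Running a script like this once per term, on different days and at different times as the investigation describes, would yield the repeated samples needed to check that the recommendations are stable rather than a fluke of a single session.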

Over 6,000 recommendations were analyzed manually. Because there can be disagreement over what does and does not qualify as misinformation, this investigation erred on the side of caution. For example, if a content creator said, “I’m not going to take the vaccine because I think there might be a tracking chip,” the video was not labeled misinformation; whereas if a video asserted, “There is a tracking device in the vaccine,” it was. Our conclusions are therefore conservative.

For three of the five search terms, “vaccine,” “election,” and “law,” Rumble is more likely than not to recommend videos containing misinformation. Even for the other two, “democracy” and “coronavirus,” the likelihood of being recommended misleading videos remains high.

These data were gathered almost a year after the start of the pandemic, after more than 3 million deaths worldwide had made it much harder to argue that the virus is fake. It is possible that searching Rumble for “coronavirus” surfaced far more misinformation earlier in the pandemic.


