Bot that clicks ads on climate articles shows news is broken


An illustration of hands and the front page of the New York Times.

Drawing: Tega Brain and Sam Lavigne

Watching Synthetic Messenger is a somewhat dissociative experience. It takes place in a Zoom call with 100 participants, all of them bots. Observers can watch these bots – eerily anthropomorphized with images of disembodied hands and voices that repeatedly say ‘scroll’ and ‘click’ – methodically scroll through climate change news articles and click on every ad on every page.

The project, created by two New York artist-engineers, kicked off earlier this month. In its first week and a half online, its bots visited 2 million climate articles – you can see them listed here – and clicked on 6 million ads.
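The artists’ own code isn’t reproduced here, but the routine the project describes – load a climate article, scroll through it, find the ads, and click them – can be sketched with an ordinary browser-automation library. The snippet below is a minimal illustration using Playwright for Python; the article list, the ad selector, and the pacing are assumptions for demonstration, not details taken from Synthetic Messenger.

from playwright.sync_api import sync_playwright

# Hypothetical article list; the real project finds climate stories on its own.
ARTICLE_URLS = [
    "https://example.com/news/climate-change-story",
]

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    for url in ARTICLE_URLS:
        page.goto(url, wait_until="domcontentloaded")
        # Scroll the article in steps, roughly the way a human reader would.
        for _ in range(10):
            page.mouse.wheel(0, 600)
            page.wait_for_timeout(500)
        # Display ads are often served inside iframes; this selector is a rough
        # stand-in for however the project actually identifies them.
        for ad in page.query_selector_all("iframe[src*='ads']"):
            try:
                ad.click(timeout=1000)
            except Exception:
                pass  # some ad frames simply won't accept a click
    browser.close()

In the piece itself, a hundred of these bots run at once inside a single Zoom call, which is what gives the work its unsettling, crowd-of-hands aesthetic.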

If this all sounds like a weird, trippy art project, it sure is. But it’s also a critique of how narratives about the climate crisis are shaped by the media.

Most online outlets are funded by advertisers. Stories that get more ad clicks may also become more visible in Google’s search rankings, drawing more attention to the page. When certain stories garner more views and engagement, news outlets are more likely to publish similar stories. Absurdly, this means that advertising mechanisms and algorithms can play a disproportionate role in determining what information people see, rather than factors like, uh, the importance of the story.

“With this project, we wanted to see how this media ecology affects our actual ecology, how storytelling affects our material realm,” said Sam Lavigne, an artist and assistant professor in the Department of Design at the University of Texas.

Of course, conflicting narratives have always played a role in the climate crisis, as Lavigne was quick to note. Polluters know it is important to control the way people talk and think about the climate crisis, and they have spent fortunes on all kinds of disinformation campaigns, including shaping media stories.

“The narrative around climate change has been so controlled by the fossil fuel industry and lobby groups,” Lavigne said.

Algorithms have further distorted the way news – or, increasingly, misinformation – reaches people. YouTube’s recommendation algorithm, for example, has encouraged viewers to watch videos full of climate denial. YouTube has also sold ads against these videos, profiting from misinformation while nudging viewers to consume more and more of it.

As historically damaging wildfires spread across Australia a year and a half ago, a story arose that they were started by arsonists, not fueled by the climate crisis. This disinformation, a group of researchers found, was spread with the help of trolls and bots online. Conservative media then picked up and amplified these claims, creating a feedback loop in which everyone was busy debunking the lies rather than talking about how to deal with the climate crisis. (The same scenario played out in the United States last year.) Yet, as Tega Brain, who co-created the project, noted, these aren’t the only ways algorithms have colored the media landscape.

“All news, and therefore all public opinion, is shaped [by] algorithms,” said Brain, an assistant professor of digital media at New York University with a background in environmental engineering. “And the algorithmic systems that shape the news are these black box algorithms,” she added, referring to tech companies’ practice of hiding from the public how their code works and what it prioritizes.

An alternative view of the carbon cycle.
GIF: Tega Brain and Sam Lavigne

Synthetic Messenger therefore seeks to game the system by generating bot-driven interest in climate stories. While it may play a small role in amplifying climate coverage, there are some complications. For one, since its article selection is imprecise and based on climate-related keywords, it also ends up clicking ads on climate-denying outlets. Its creators have tried to get around this by blacklisting denier websites, like those owned by Rupert Murdoch, but it’s not a perfect system.
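The exact filtering rules aren’t public, but the approach described – match climate-related keywords, then exclude known denier domains – amounts to a simple allow/deny check. Here is a rough sketch of that idea; the keyword list and blacklisted domains are illustrative guesses, not the project’s actual configuration.

from urllib.parse import urlparse

# Illustrative keyword and blacklist entries; not the project's real lists.
CLIMATE_KEYWORDS = {"climate change", "global warming", "carbon emissions"}
BLACKLISTED_DOMAINS = {"foxnews.com", "nypost.com"}  # e.g. Murdoch-owned outlets

def is_target_article(url: str, headline: str) -> bool:
    """Return True if an article looks climate-related and isn't on a blacklisted site."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    if domain in BLACKLISTED_DOMAINS:
        return False
    return any(keyword in headline.lower() for keyword in CLIMATE_KEYWORDS)

print(is_target_article("https://example.com/a", "Global warming hits a new record"))    # True
print(is_target_article("https://www.foxnews.com/a", "Climate change doubts resurface")) # False

A keyword check this crude is exactly why the bots sometimes end up paying denial outlets anyway: keyword matching can’t tell a good-faith climate story from a denialist one.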

If this project were primarily conceived as a tool for political organizing, these could be big points of friction. But Brain and Lavigne are clear that they know their project will neither change the media landscape nor combat the climate crisis itself.

“We don’t intend for it to be read as, ‘here is this new, really effective activist strategy to deal with climate change,’” Brain said. “Basically, with this project, we’re doing what’s called ‘click fraud’, and if we did it for long enough and at a large enough scale, it wouldn’t work, because ad networks obviously do everything they can to protect against automated behavior. They would stop it.”

Rather, the aim is to draw attention to the broken incentive structures that determine which climate stories get told and amplified by advertisers and search algorithms.

“It’s not like we’re offering this as a solution to this problem that we have. The solution is meaningful climate policy, effective policy,” Brain said. “But we’re trying to open a conversation and reveal how our media landscape is functioning right now.”


