Early next week, executives from Facebook, Twitter and YouTube will participate in a Senate Judiciary hearing on algorithmic amplification (via Politico). The April 27th hearing will feature testimony from Monika Bickert, vice president of content policy at Facebook; Lauren Culbertson, head of US public policy at Twitter; and Alexandra Veitch, public policy manager for the Americas at YouTube. The panel will also hear from two outside experts: Tristan Harris, a former Google design ethicist who has since become a critic of the tech industry, and Joan Donovan, research director of the Shorenstein Center on Media.
By calling on policy executives instead of each company's CEO, the Privacy, Technology and the Law subcommittee is taking a different approach from previous high-profile Senate hearings on Big Tech. Congressional aides told Politico a future panel could involve Mark Zuckerberg, Jack Dorsey and Susan Wojcicki, but the goal of next week's event is to focus on major structural issues. "We're doing this in part because we want it not to be so much like a grievance session where people just complain about platforms to CEOs," one of the aides said.
How recommendation algorithms might fuel extremism and disinformation is a question Democratic lawmakers have been pondering for some time. In January, representatives Tom Malinowski (D-NJ) and Anna G. Eshoo (D-CA) sent separate letters to the CEOs of Facebook, Twitter and YouTube, urging them to make substantial changes to these systems. But some experts worry that by focusing only on recommendation algorithms, lawmakers may be overlooking bigger issues.
In a Medium post published two days after the January 6th Capitol attack, Stanford doctoral candidate Becca Lewis argued that all of YouTube, not just its recommendation algorithm, serves as a vehicle for spreading far-right propaganda. Software does play a role, but it's only one factor that amplifies these ideologies. Another vital facet of how the platform can radicalize people is the way YouTube fosters one-sided relationships between fans and content creators. That's something lawmakers might miss in their search for an easy fix to extremism and disinformation.