As Hao writes, a New York University study of Facebook pages run by partisan publishers found that “those who regularly published political disinformation were most engaged in the run-up to the 2020 US presidential election and the Capitol riots.”
Zuckerberg, after saying that “a bunch of inaccurate things” about Facebook’s incentives to allow and amplify misinformation and content polarization were shared during the hearing by members of Congress, added:
“People don’t want to see misinformation or divisive content on our services. People don’t want to see clickbait and things like that. While it’s true that people are more likely to click on it in the short term, it’s not good for our business, product, or community that it’s there.”
His response echoes a common talking point at Facebook, and it sidesteps the fact that the company has never undertaken a centralized, coordinated effort to examine and reduce how its recommendation systems amplify misinformation. To learn more, read Hao’s report.
Zuckerberg’s comments came during the House Committee on Energy and Commerce’s hearing on disinformation, where members of Congress questioned Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey about the spread of disinformation concerning the November US election, the Capitol riot, and covid-19 vaccines, among other topics.
As has become common at these hearings, Republican lawmakers also asked the CEOs about perceived anti-conservative bias on their platforms, a long-standing right-wing claim that the data does not support.