The article "Google News and Machine Gatekeepers: Algorithmic Personalisation and News Diversity in Online News Search" raises important questions about the role of algorithms in news consumption. The authors argue that while algorithms can bring efficiency, consistency, speed, and scale to the distribution and organization of news, they also raise fundamental normative questions about the role of machines as news gatekeepers.
The article highlights the growing influence of algorithms in selecting and distributing news. More than half of news users worldwide prefer to access news through search engines, social media, or news aggregators that rely on algorithms rather than editors to select and rank stories. Google News has the largest share of the news aggregation market internationally.
The authors argue that algorithmic personalization can systematically filter counter-attitudinal news and information akin to “invisible auto-propaganda indoctrinating us with our own ideas.” They suggest that this process can lead to filter bubbles and echo chambers that have potentially deleterious consequences for democracy by weakening viewpoint diversity, undermining group deliberation and public debate, and contributing to political polarization.
However, the authors also acknowledge that algorithms can be designed to expose citizens to more diverse content and politically cross-cutting information while maintaining audience engagement and monetizing long-tail content. A growing body of evidence suggests that algorithms can engender political heterogeneity and diversity just as readily as they produce political homogeneity and uniformity.
While the article provides valuable insights into the risks of algorithmic personalization in online news search, it does not give both sides equal weight. Although the authors briefly acknowledge that algorithms can be designed to promote diversity, their framing leans toward a negative view of personalization without fully engaging with the evidence for its potential benefits.
Additionally, some claims in the article are asserted without supporting evidence. For example, the authors suggest that filter bubbles arise when people with similar interests or political worldviews interact primarily within their own group, yet some empirical research indicates that people often encounter, and even seek out, diverse viewpoints when using personalized recommendations.
Overall, while the article raises important questions about algorithmic personalization in online news search, it would benefit from a more balanced treatment of both risks and benefits, with claims supported by evidence wherever possible to avoid one-sided analysis.