Personalised news: how to balance technology and editorial integrity

Without us really paying attention, personalization has become a normal part of our daily lives. We have more access to content than ever before, and our online experiences are becoming ever more tailored and efficient. Whether it's recommended products, personalized music playlists, or suggested TV shows, personalization algorithms are everywhere.

With so many options to choose from, it can be both impressive and reassuring to see content geared specifically towards what we like. But personalizing the news, now a standard feature of digital publishing, raises ethical and editorial questions of its own.

Public interest reporting is crucial in a democracy as it keeps people informed, holds those in power accountable, and sparks important discussions. Unfortunately, these serious and complex stories can sometimes be overlooked in favor of lighter, more entertaining content.

This tension raises an important question: as users are shown ever more content that matches their existing beliefs and interests, how do we ensure that significant but less popular stories still reach a wide audience? More broadly: how do we balance technological progress with maintaining honesty and users' trust in journalism?

To find answers to those questions, I researched different approaches to personalization and reviewed the most recent academic literature on news recommendation systems. I also spoke with technical and editorial personalization specialists at publishers in Sweden, India, Canada, the UK, and Australia, as well as leading figures in machine learning and academia.

Benefits of personalizing news

News publishers are increasingly using news recommendation systems (NRS) to suggest stories to readers. These systems get new content to the right people faster, which can broaden reach and improve the reader experience. The worry is that less popular but still important news might not be surfaced, or that readers might only ever see things they are already interested in. Each organization personalizes in its own way, but many keep humans in the loop to make sure it is done responsibly.

Many news organizations produce far more articles than readers ever see. NRS help address this by surfacing relevant, important content, which can attract more readers, keep subscribers engaged, and connect with audiences at a scale no human curator could match. By distributing this content on their own platforms rather than relying on third parties, news outlets can help secure the financial stability that public interest journalism depends on.

Different methods of personalization

In the course of this research, I found that media companies take quite different approaches to personalization, shaped mainly by their appetite for automation and innovation, their tolerance for reputational risk, and their computational resources. For example, Sky News UK does not personalize its homepage at all, while The Globe and Mail in Canada relies heavily on algorithms.

Sonali Verma, who leads personalization at The Globe and Mail, stresses the importance of combining editorial selection with algorithmic suggestions. The first three featured articles on The Globe and Mail's homepage are chosen by editors to showcase the most important news stories, reflecting the paper's role in setting the national agenda.

Verma told me that many news publishers produce a large amount of content that the public is not interested in and will never come across. By prioritizing important topics and making them easy for people to find, you are not only helping your news organization but also helping to create a more informed society.

"Public Interest Values Integrated"

Striking the right balance between algorithmic personalization and editorial judgement is essential to upholding the credibility and reach of public interest journalism. Algorithms can suggest content based on user behaviour, but homepage editors remain responsible for highlighting the stories that matter to the public.
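
As a concrete illustration of that hybrid model, here is a minimal Python sketch that assembles a homepage from a few editor-pinned slots followed by algorithmically ranked items. The data structures and function names are hypothetical, not any publisher's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Article:
    article_id: str
    title: str
    score: float  # predicted relevance from a recommender, however it is computed

def assemble_homepage(editor_picks, recommended, pinned_slots=3, total_slots=10):
    """Fill the top slots with editor-chosen stories, then append
    algorithmic recommendations, skipping duplicates."""
    homepage = list(editor_picks[:pinned_slots])  # editors control the top of the page
    seen = {a.article_id for a in homepage}
    for article in sorted(recommended, key=lambda a: a.score, reverse=True):
        if len(homepage) >= total_slots:
            break
        if article.article_id not in seen:
            homepage.append(article)
            seen.add(article.article_id)
    return homepage
```

The point of this arrangement is that the algorithm never competes with editors for the top of the page; it only fills the space below it.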

However, algorithms can also support and safeguard public interest journalism, provided they are built around values rather than engagement metrics alone. Sveriges Radio (SR) in Sweden created a "Public Service Algorithm" that encodes values such as reporting from the scene, including the voices of those affected, offering fresh perspectives, and presenting impartial viewpoints. The approach has been well received in its newsroom and is seen as a blueprint for bringing AI into public service journalism across the industry.
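
In code, a public service algorithm of this kind could score stories on editorial values alongside predicted relevance. The sketch below is only an assumption of how such a blend might look, with boolean editorial tags and hand-picked weights; it is not a description of SR's actual system.

```python
# Hypothetical editorial-value weights; a real newsroom would tune these with editors.
VALUE_WEIGHTS = {
    "on_the_scene": 0.3,       # reported from the scene
    "affected_voices": 0.3,    # includes the voices of those affected
    "fresh_perspective": 0.2,  # offers a new angle
    "impartial": 0.2,          # presents balanced viewpoints
}

def public_service_score(relevance, editorial_tags, blend=0.5):
    """Blend predicted relevance with an editorial-value score.
    `editorial_tags` is a dict of booleans set by journalists or classifiers."""
    value_score = sum(
        weight for name, weight in VALUE_WEIGHTS.items() if editorial_tags.get(name)
    )
    return blend * relevance + (1 - blend) * value_score
```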

Michael A. Osborne, professor of machine learning at the University of Oxford, stressed the importance of how a model's "reward function" or "loss function" is defined. Optimizing solely for metrics like click-through rate, he explained, can push public interest stories aside in favor of more popular ones. Prioritizing those stories means choosing the algorithm's objectives deliberately and involving users in aligning the system with them.
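
To make that point concrete, the toy example below compares a click-only reward with a blended one that includes a public-interest term. The weights and scores are illustrative assumptions, not values from any real system.

```python
def click_only_reward(predicted_ctr):
    """Reward driven purely by engagement: popular items always win."""
    return predicted_ctr

def blended_reward(predicted_ctr, public_interest, alpha=0.7):
    """Trade off engagement against a public-interest score,
    e.g. an editor-assigned rating between 0 and 1."""
    return alpha * predicted_ctr + (1 - alpha) * public_interest

# A popular but light story versus a less-clicked public interest story:
light = {"ctr": 0.12, "public_interest": 0.1}
serious = {"ctr": 0.04, "public_interest": 0.9}

print(click_only_reward(light["ctr"]) > click_only_reward(serious["ctr"]))    # True
print(blended_reward(light["ctr"], light["public_interest"])
      > blended_reward(serious["ctr"], serious["public_interest"]))           # False
```

Under the click-only objective the lighter story always wins the slot; once a public-interest term enters the reward, the serious story can come out on top.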

A 2023 research project by Anna Schjøtt Hansen and Professor Jannie Møller Hartley highlighted the difficulty of reconciling a personalization algorithm with conventional news values. They found that editors and journalists at the news outlet they studied initially struggled to encode traditional news values in the algorithm, raising fears of losing editorial control.

I have since learned that the project, which followed a Danish newsroom, did not succeed. The team said the challenges they faced underscored the importance of good communication and collaboration between data scientists, editorial staff, and IT teams.

Professor Hartley told me that organizations tend to take one of two approaches when they meet resistance in the newsroom: some keep the newsroom out of the loop until everything is ready to launch, while others recognize that involving the newsroom from the start, as part of the design process, helps overcome that resistance.

To avoid a similar outcome in your own project, start by breaking down the barriers between departments. Create and share a detailed plan, gather input from across the organization, and make sure the values of your technical and editorial teams are aligned. Be prepared for the technical issues that will inevitably arise, such as slow servers, tagging problems, or data errors.

While working at SR, I heard from some digital team members who feel restricted by the algorithm, seeing it as a constraint on their ability to hand-pick news articles.

Some of these editors try to work around the algorithm's constraints, for example by boosting certain stories or adjusting publication timing so that particular stories get more exposure. That workaround reflects the tension between deferring to the automated system and preserving editorial freedom.

Building editorial discretion into more sophisticated personalization algorithms has shown promise, as the media companies featured in this article demonstrate. Examples such as SR's "Public Service Algorithm" and human-in-the-loop approaches show how personalization can be aligned with journalistic principles. A 2024 review of news recommendation systems likewise emphasized diversity, timeliness, privacy, and transparency as priorities, pointing towards more ethical and responsible personalization.
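
As one illustration of those priorities, the greedy re-ranking sketch below rewards topical diversity alongside relevance. It is a generic heuristic, included as an assumption about how such a criterion might be applied, not the method of any system named in this article.

```python
def rerank_for_diversity(candidates, k=5, diversity_weight=0.3):
    """Greedily pick items with the best mix of relevance and topical novelty.
    `candidates` is a list of (article_id, relevance, topic) tuples."""
    selected, covered_topics = [], set()
    pool = list(candidates)
    while pool and len(selected) < k:
        def adjusted(item):
            _, relevance, topic = item
            novelty = 0.0 if topic in covered_topics else 1.0
            return (1 - diversity_weight) * relevance + diversity_weight * novelty
        best = max(pool, key=adjusted)
        selected.append(best)
        covered_topics.add(best[2])
        pool.remove(best)
    return selected
```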

Preserving the integrity of public interest journalism while tailoring the news to individuals is not only a technical challenge but an ethical one. It means carefully blending technology with sound editorial judgement and strong values. Getting it right is crucial for the future of public interest journalism, and the rewards are considerable.

For a more in-depth look at these findings, including a glimpse of what personalization may look like in a future shaped by generative AI, please download the complete document below.
