STATEMENT ON TRANSPARENCY FOR NEWS AUTOMATION AND PERSONALISATION 2019
Council for Mass Media
The use of algorithms and the ethics of artificial intelligence have been the subject of lively public debate. Demands have been made that those who use algorithms be open about the information they collect about their users and about the principles they apply when highlighting content for their audiences.
Algorithms are also used by media organisations that fall within the regulation of the Council for Mass Media. Examples include news robots, election compasses and article recommendations for readers. New technological innovations provide opportunities for better and more effective journalism.
The purpose of this statement is to define the use of algorithmic tools as part of journalistic work and to assure the public that the media act openly and responsibly when using algorithms.
Scope of the statement
It is not expedient to describe in detail in a single statement all the different ways in which algorithms are used. This statement specifically addresses how the media should inform their audience when using:
1. News automation – for example, so-called news robots and other algorithmic tools that automatically generate and publish journalistic content such as text or infographics.
2. Personalisation, that is, content that is targeted in different ways to different people – for example, so that the front page of a website or mobile application, or any other part of a service, such as article recommendations, is adapted based on the user's past behaviour.
Responsibility for journalistic decision-making should remain with editors
The Council notes that the use of news automation and content personalisation is always subject to journalistic judgement. This includes choosing what to publish, for whom, and how much weight a news item should be given. This is at the heart of journalistic decision-making.
The Code of Journalism (approved by the Council) presupposes that decisions about published content are made on journalistic grounds and that journalistic decision-making is not left to anyone outside the editorial office. Responsibility for decision-making should therefore not be transferred to third parties who create algorithms. Responsibility for how algorithms affect journalistic content always lies with the editors, and ultimately with the editor-in-chief.
The Council notes that the media should have sufficient insight into how the algorithmic tools they use affect content. For example, if a media organisation buys a tool developed externally, the editorial staff should identify and approve the most important functional principles and be able to respond to problematic situations if necessary.
The Council recalls that the Code of Journalism applies to all journalistic work. The mass media should thus ensure that people who develop their digital services also comply with the Code when making independent decisions that affect journalistic content.
The public has the right to know about news automation and personalisation
In order for journalism to be trustworthy, it is important that the public feels that the actions of the mass media uphold high standards of transparency. The Council notes that the public has the right to know whether journalistic content is recommended and targeted in different ways to different people based on user data. The Code of Journalism also presupposes that information is provided in an open way and that the source is stated when information published by others is used. The Council therefore makes the following recommendations.
Recommendation on transparency about when personalisation is used
Mass media that adapt content for different users in different ways should openly inform the audience about the personalisation of content and the collection of the data used for this purpose. If a considerable amount of the content on a page is personalised based on user data, the Council recommends that information about the personalisation be easily discoverable and presented in an understandable way.
This is the first statement of the Council on the use of algorithms. Digital development work in the industry is ongoing, and the Council may, if necessary, issue decisions on other aspects of this issue.
The Council recalls that the power of journalism in supporting a democratic and fact-based society relies on ensuring a wide range of perspectives on the world can be heard. The Council recommends that the media use algorithmic tools in a way that ensures the public can continue to receive a wide range of information about the world.