By A. S. Panneerselvan
Some of the responses to my critique of social media in general, and large Internet media companies in particular, may give the impression that I am a sort of Luddite, sceptical of new technology. As a defender of journalistic practices, my concerns are manifold, and technology is just one of them. I am acutely aware of the many wonders gifted by digital connectivity. But as a person dealing with the ethics of public discourse, my reservations are about the darker side of technology and the need for every citizen to be conscious of the intrusions made possible by the algorithms that govern the digital economy. Though the leap made by technology is indeed not reversible, it would certainly enrich readers to be aware of the power of algorithms in reinforcing social stereotypes and prejudices.
In the United States, researchers have documented some deeply disturbing trends. A study by Carnegie Mellon University established, for instance, that Google’s online advertising system showed advertisements for high-income jobs to men far more often than to women. A Harvard study revealed that searches for distinctively black names, or for a historically black fraternity, were more likely to be accompanied by results suggesting arrest records. Claire Cain Miller, in her article ‘When Algorithms Discriminate’, rightly points out that targeting ads is legal, but discriminating on the basis of gender is not.
The Tow Center for Digital Journalism at Columbia Journalism School has documented some of the instances where algorithms have failed. The one that stands out is the denial of driver’s permits to a pair of twins on the basis of a fraud detection algorithm. “Two teenage twins walk into the Department of Motor Vehicles (DMV) in Georgia. Two teenage twins leave without their driver’s permits. What happened? An algorithm, that’s what. DMVs nationwide are adopting automated fraud detection systems that use computer vision algorithms to detect whether they think someone is trying to get a new license under an assumed name. The algorithm couldn’t figure out the difference between the twins and thought one of them a fraud,” points out Nicholas Diakopoulos in the piece ‘Algorithm everywhere’ for the Tow Center.
Over the last five years, I have been following two important media scholars to understand the extent and nature of technological disruptions: George Brock, who wrote the insightful book Out of Print, and Emily Bell, the current director of the Tow Center for Digital Journalism at the Columbia University Graduate School of Journalism. Bell played a key role in transforming the digital presence of The Guardian. While acknowledging the transformational power of digital technology, these two scholars also constantly remind us of a virtue that cannot be sacrificed at the altar of technology: accountability.
Earlier, Brock raised a pertinent question with Andy Mitchell, Facebook’s director of news and global media partnerships, as to how Facebook sees and handles its role as a news gatekeeper, influencing both the detail and the flow of what people see. His direct question was whether Facebook was in any way accountable to its community for the integrity of its news feed. Instead of answering this question, the Facebook official reiterated that Facebook wanted people to have a “great experience” and that the feed gives them “what they’re interested in”. “Facebook is not, and knows quite well that it is not, a neutral machine passing on news. Its algorithm chooses what people see, it has ‘community standards’ that material must meet, and it has to operate within the laws of many countries. The claim that Facebook doesn’t think about journalism has to be false,” observes Prof. Brock.
According to Bell, the pressing question for Facebook — and eventually for Google — is who bears the publishing risk in this new world. Consider, for instance, a story found through a link: here the platform company bears limited risk if the content is challenged. But what happens, Bell asks, when there is an explicit agreement to republish material on a platform built for virality — who then bears responsibility for defending and protecting the journalism? She is convinced that “the locus of power in delivery and distribution of news has shifted towards commercial companies who have priorities that often compete with those of journalism.” Bell’s caution: “The alternative is unclear, but must ultimately lie with news taking more responsibility for understanding the role of third-party technology and creating its own platforms in the future. How journalism will find the time or resources to do this is unclear, as the frenemy is already at the gate.”
If we respect journalism’s five inviolable principles — truth and accuracy, independence, fairness and impartiality, humanity and accountability — then the question of how to deal with the present technological disruption is not a Luddite position but a morally compelling existential question.
This column was originally published in The Hindu on 23 November 2015.