Algorithmic Accountability in Journalism
* Paper: Algorithmic Accountability: Journalistic Investigation of Computational Power Structures. By Nicholas Diakopoulos. Digital Journalism, 2015 
How can the power of algorithms be understood and, when called for, controlled? We are only starting to understand how these strings of computer code are shaping our view of the world. As researchers point out, inherent biases in algorithms can lead to startling discriminatory possibilities, with important consequences.
Exposing the workings of algorithms to understand their deeper impact may yet become an important part of investigative journalism. In a 2015 paper published in Digital Journalism, “Algorithmic Accountability: Journalistic Investigation of Computational Power Structures,” Nicholas Diakopoulos of the University of Maryland examines journalistic strategies to gain insight about the inner workings of algorithms. While transparency of algorithms might be a first step to solve the problem, Diakopoulos is especially interested in a strategy called reverse engineering — “the process of extracting knowledge or design blueprints” by studying and then emulating the behavior of an algorithm.
The author discusses five case studies in which journalists used reverse engineering to examine algorithms: a Daily Beast story on the iPhone’s language-related algorithms; ProPublica’s reporting on targeted email strategies in the 2012 U.S. election campaign; Wall Street Journal stories on website price differentiation and on stock trading by executives; and one story by Diakopoulos himself. Based on these stories, Diakopoulos identifies the scenarios journalists typically encounter when reporting on algorithms, as well as the challenges these investigations raise in terms of human resources, legality and ethics.
The paper’s key points include:
- When using reverse engineering, journalists are interested in three aspects of an algorithm: the input, the output and the transformation from one to the other. Depending on the case, some elements of this relationship may be observable while others are not, and different reverse-engineering strategies may be necessary.
- When inputs are not available, “figuring out how to observe or simulate those inputs is a key part of a practical investigation…. Figuring out what the algorithm pays attention to as input becomes as intriguing a question as how the algorithm transforms input into output.”
- In this process, journalists need to keep in mind that external evidence of algorithms’ behavior might be disturbed by A/B testing — the practice of randomly assigning different treatments or content to various groups to optimize for the best response rate or return. The entities that use the algorithm are “already running experiments on their sites, and to a reverse engineer it might look like noise, or just confusing irregularities.”
- Furthermore, algorithms “may be unstable and change over time, or have randomness built in to them, which makes understanding patterns in their input-output relationship much more challenging. Other tactics such as parallelization or analysis of temporal drift may be necessary in order to control for a highly dynamic algorithm.”
- To successfully achieve algorithmic accountability reporting, media will have to “take dedicated efforts to teach the computational thinking, programming and technical skills needed to make sense of algorithmic decisions.” Given a legal framework that is growing more complex, “more work is also needed to explore the legal ramifications of algorithmic accountability through reverse engineering by journalists.”
- New ethical questions may arise in the context of studying algorithms. The author suggests a focus on questions such as, “How might the investigation allow the algorithm to be manipulated or circumvented?” or “Who stands to benefit or suffer disadvantage from that manipulation?”
Diakopoulos underscores the computational skills needed for achieving algorithmic accountability. However, “reporting is still a key part of finding a story in a reverse-engineering analysis.” Even in an environment as technical as this, “knowing what makes something a story is perhaps less about a filter for statistical, social or legal deviance than it is about understanding the context of the phenomenon, including historical, cultural and social expectations related to the issue — all things with which traditional reporting and investigation can help.” (http://journalistsresource.org/studies/society/news-media/algorithms-journalistic-investigations-holding-digital-power-accountable)
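The input-output probing described in these points can be sketched as a small simulation. The `black_box` function below is a purely hypothetical stand-in for an opaque system (say, a pricing algorithm keyed to a ZIP code); in a real investigation it would be replaced by scripted queries to the live service. The point of the sketch is the method: querying the same input repeatedly lets an investigator average out built-in randomness (or A/B-test noise) and recover the stable part of the input-output relationship.

```python
import random
import statistics

def black_box(zip_code, seed=None):
    """Hypothetical opaque algorithm: a deterministic input effect
    plus built-in randomness (which could also be A/B-test noise)."""
    rng = random.Random(seed)
    base = 100 + int(zip_code) % 20   # stable input-output relationship
    noise = rng.uniform(-2, 2)        # randomness that confuses a single probe
    return base + noise

def probe(inputs, trials=30):
    """Query the black box repeatedly per input, reporting the mean
    (the recoverable signal) and the spread (the noise floor)."""
    results = {}
    for x in inputs:
        samples = [black_box(x, seed=t) for t in range(trials)]
        results[x] = (statistics.mean(samples), statistics.stdev(samples))
    return results

for zip_code, (mean, spread) in probe(["10001", "94103", "60601"]).items():
    print(f"{zip_code}: mean output {mean:.1f}, spread ±{spread:.1f}")
```

Re-running the probe at intervals, rather than all at once, is the analogous tactic for detecting the temporal drift the paper mentions: if the per-input means shift between runs, the algorithm itself has changed.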
* Report: Nicholas Diakopoulos. Algorithmic Accountability: On the Investigation of Black Boxes. Knight Foundation and the Tow Center for Digital Journalism at Columbia Journalism School.
"The past three years have seen a small profusion of websites, perhaps as many as 80, spring up to capitalize on the high interest that mug shot photos generate online. Mug shots are public record, artifacts of an arrest, and these websites collect, organize, and optimize the photos so that they’re found more easily online. Proponents of such sites argue that the public has a right to know if their neighbor, romantic date, or colleague has an arrest record. Still, mug shots are not proof of conviction; they don’t signal guilt. Having one online is likely to result in a reputational blemish; having that photo ranked as the first result when someone searches for your name on Google turns that blemish into a garish reputational wound, festering in facile accessibility. Some of these websites are exploiting this, charging people to remove their photo from the site so that it doesn’t appear in online searches.
It’s reputational blackmail. And remember, these people aren’t necessarily guilty of anything. To crack down on the practice, states like Oregon, Georgia, and Utah have passed laws requiring these sites to take down the photos if the person’s record has been cleared. Some credit card companies have stopped processing payments for the seediest of the sites. Clearly both legal and market forces can help curtail this activity, but there’s another way to deal with the issue too: algorithms. Indeed, Google recently launched updates to its ranking algorithm that down-weight results from mug shot websites, basically treating them more as spam than as legitimate information sources. With a single knock of the algorithmic gavel, Google declared such sites illegitimate.
At the turn of the millennium, 14 years ago, Lawrence Lessig taught us that “code is law”—that the architecture of systems, and the code and algorithms that run them, can be powerful influences on liberty. We’re living in a world now where algorithms adjudicate more and more consequential decisions in our lives. It’s not just search engines either; it’s everything from online review systems to educational evaluations, the operation of markets to how political campaigns are run, and even how social services like welfare and public safety are managed. Algorithms, driven by vast troves of data, are the new power brokers in society.
As the mug shots example suggests, algorithmic power isn’t necessarily detrimental to people; it can also act as a positive force. The intent here is not to demonize algorithms, but to recognize that they operate with biases like the rest of us. And they can make mistakes. What we generally lack as a public is clarity about how algorithms exercise their power over us. With that clarity comes an increased ability to publicly debate and dialogue the merits of any particular algorithmic power.
While legal codes are available for us to read, algorithmic codes are more opaque, hidden behind layers of technical complexity. How can we characterize the power that various algorithms may exert on us? And how can we better understand when algorithms might be wronging us? What should be the role of journalists in holding that power to account? In the next section I discuss what algorithms are and how they encode power. I then describe the idea of algorithmic accountability, first examining how algorithms problematize and sometimes stand in tension with transparency. Next, I describe how reverse engineering can provide an alternative way to characterize algorithmic power by delineating a conceptual model that captures different investigative scenarios based on reverse engineering algorithms’ input-output relationships. I then provide a number of illustrative cases and methodological details on how algorithmic accountability reporting might be realized in practice. I conclude with a discussion about broader issues of human resources, legality, ethics, and transparency."