Black Box Society

* Book: Frank Pasquale. The Black Box Society: The Secret Algorithms That Control Money and Information.

URL = http://www.hup.harvard.edu/catalog.php?isbn=9780674368279

" ... examines and critiques the ways in which big data is being used to analyse, predict and control our behaviour". [1]


Contextual Citations

1.

"… runaway algorithms … take on very important decisions …. autonomous weapon systems could accidentally trigger skirmishes or even wars, based on misinterpreted signals.

… it’s not only – and often not primarily – the algorithms, or even the programmers of algorithms, who are to blame …. “data-driven” algorithms that are supposedly objective and serving customers and users, are in fact biased and working only to boost the fortunes of an elite." (http://www.zoeticnetworks.com/2015/02/02/the-black-box-society-new-machine-age-of-algorithms-and-bots-behind-the-scenes-of-corporate-america/)

2.

"… people consider being on top of Twitter’s trending topics, or Google or Amazon search results, an important bragging right. But if these results are relatively easy to manipulate, or are really dictated by the corporate interests of the big Internet firms, they should be seen less as the “voice of the people” than as a new form of marketing. Or, to use Rob Walker’s term, “murketing.” (http://www.zoeticnetworks.com/2015/02/02/the-black-box-society-new-machine-age-of-algorithms-and-bots-behind-the-scenes-of-corporate-america/)


Description

"Every day, corporations are connecting the dots about our personal behavior—silently scrutinizing clues left behind by our work habits and Internet use. The data compiled and portraits created are incredibly detailed, to the point of being invasive. But who connects the dots about what firms are doing with this information? The Black Box Society argues that we all need to be able to do so—and to set limits on how big data affects our lives.

Hidden algorithms can make (or ruin) reputations, decide the destiny of entrepreneurs, or even devastate an entire economy. Shrouded in secrecy and complexity, decisions at major Silicon Valley and Wall Street firms were long assumed to be neutral and technical. But leaks, whistleblowers, and legal disputes have shed new light on automated judgment. Self-serving and reckless behavior is surprisingly common, and easy to hide in code protected by legal and real secrecy. Even after billions of dollars of fines have been levied, underfunded regulators may have only scratched the surface of this troubling behavior.

Frank Pasquale exposes how powerful interests abuse secrecy for profit and explains ways to rein them in. Demanding transparency is only the first step. An intelligible society would assure that key decisions of its most important firms are fair, nondiscriminatory, and open to criticism. Silicon Valley and Wall Street need to accept as much accountability as they impose on others."

Discussion

John Danaher (in a discussion of a Pasquale essay on the Scored Society):

"Scoring systems are now everywhere, from Tripadvisor and Amazon reviews, to Rate my Professor and GP reviews on the NHS. Some of these scoring systems are to be welcomed. They often allow consumers and users of services to share valuable information. And they sometimes allow for productive feedback-loop between consumers and providers of services. The best systems seem to work on either a principle of equality — where everyone is allowed to input data and have their say — or a reversal of an inequality of power — e.g. where the less powerful consumer/user is allowed to push back against the more powerful producer.

But other times scoring systems take on a more sinister vibe. This usually happens when the scoring system is used by some authority (or socially powerful entity) to shape or control the behaviour of those being scored. For example, banks and financial institutions use scoring systems to restrict access to credit, and insurance companies use them to increase premiums. The motives behind these scoring systems are understandable: banks want to reduce the risk of bad debt; insurance companies want enough money to cover potential payouts (and to make a healthy profit for themselves). But their implementation is more problematic.

The main reason for this has to do with their hidden and often secretive nature. Data is collected without notice; the scoring algorithm is often a trade secret; and the effect of the scores on an individual’s life is often significant. Even more concerning is the way in which humans are involved in the process. At the moment, there are still human overseers, often responsible for coding the scoring algorithms and using the scores to make decisions. But this human involvement may not last forever. As has been noted in the debate about drone warfare, there are three kinds of automated system:

* Human-in-the-loop Systems: These are automated systems in which an input from a human decision-maker is necessary in order for the system to work, e.g. to programme the algorithm or to determine what the effects of the score will be.

* Human-on-the-loop Systems: These are automated systems which have a human overseer or reviewer. For example, an online mortgage application system might generate a verdict of “accept” or “reject” which can then be reviewed or overturned by a human decision-maker. The automated system can technically work without human input, but can be overridden by the human decision-maker.

* Human-out-of-the-loop Systems: This is a fully automated system, one which has no human input or oversight. It can collect data, generate scores, and implement decisions without any human input.

By gradually pushing human decision-makers off the loop, we risk creating a “black box society”. This is one in which many socially significant decisions are made by “black box AI”. That is: inputs are fed into the AI, outputs are then produced, but no one really knows what is going on inside. This would lead to an algocracy, a state of affairs in which much of our lives is governed by algorithms." (http://philosophicaldisquisitions.blogspot.be/2014/10/can-procedural-due-process-combat.html)
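
The distinction Danaher draws between the three loop configurations can be made concrete with a short sketch. The Python below is illustrative only and is not drawn from the book or the essay: the Applicant fields, the score() formula, and the 2.0 approval threshold are all invented for the example. The structural point is that the same opaque score() feeds three decision procedures that differ only in where, if anywhere, a human can intervene.

from dataclasses import dataclass

@dataclass
class Applicant:
    income: float
    debt: float

def score(a: Applicant) -> float:
    # Stand-in for the opaque, trade-secret model: inputs in, a number out.
    # Callers see only the result, not the reasoning.
    return a.income / (a.debt + 1.0)

def human_in_the_loop(a: Applicant, reviewer) -> bool:
    # The score merely informs a human reviewer, whose judgment is the decision;
    # the system cannot act without them.
    return reviewer(a, score(a))

def human_on_the_loop(a: Applicant, override=None) -> bool:
    # The system decides on its own, but an overseer may overturn the verdict.
    verdict = score(a) >= 2.0
    return verdict if override is None else override

def human_out_of_the_loop(a: Applicant) -> bool:
    # Fully automated: data in, decision out, no point of human oversight.
    return score(a) >= 2.0

alice = Applicant(income=60_000, debt=20_000)
print(human_out_of_the_loop(alice))  # True: 60000 / 20001 is about 3.0, above the threshold

Pushing human decision-makers off the loop, in Danaher's terms, is the move from the first function to the last; score() itself remains a black box throughout.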


More Information