Category:Protocols and Algorithms
Revision as of 06:14, 9 November 2020
New section, created July 2017: how protocols and algorithms increasingly govern our world, for good or ill, and how we can change this, for example through Design Justice
Quotes
"We need to ask then not only how algorithmic automation works today (mainly in terms of control and monetization, feeding the debt economy) but also what kind of time and energy it subsumes and how it might be made to work once taken up by different social and political assemblages—autonomous ones not subsumed by or subjected to the capitalist drive to accumulation and exploitation."
- Tiziana Terranova [1]
Anjana Susarla on the New Algorithmic Divide
"Many people now trust platforms and algorithms more than their own governments and civic society. An October 2018 study suggested that people demonstrate “algorithm appreciation,” to the extent that they would rely on advice more when they think it is from an algorithm than from a human. In the past, technology experts have worried about a “digital divide” between those who could access computers and the internet and those who could not. Households with less access to digital technologies are at a disadvantage in their ability to earn money and accumulate skills. But, as digital devices proliferate, the divide is no longer just about access. How do people deal with information overload and the plethora of algorithmic decisions that permeate every aspect of their lives? The savvier users are navigating away from devices and becoming aware about how algorithms affect their lives. Meanwhile, consumers who have less information are relying even more on algorithms to guide their decisions."
- Anjana Susarla [https://www.fastcompany.com/90336381/the-new-digital-divide-is-between-people-who-opt-out-of-algorithms-and-people-who-dont]
Privacy is a Public Good
"How do we manage consent when data shared by one affects many? Take the case of DNA data. Should the decision to share data that reveals sensitive information about your family members be solely up to you? Shouldn’t they get a say as well? If so, how do you ask for consent from unborn future family members?

How do we decide on data sharing and collection when the externalities of those decisions extend beyond the individual? What if data about me, a thirty-something year old hipster, could be used to reveal patterns about other thirty-something year old hipsters? Patterns that could result in them being profiled by insurers or landlords in ways they never consented to. How do we account for their privacy?

The fact that one person’s decision about data sharing can affect the privacy of many motivates Fairfield and Engel to argue that privacy is a public good: “Individuals are vulnerable merely because others have been careless with their data. As a result, privacy protection requires group coordination. Failure of coordination means a failure of privacy. In short, privacy is a public good.” As with any other public good, privacy suffers from a free rider problem. As observed by the authors, when the benefits of disclosing data outweigh the risks for you personally, you are likely to share that data - even when doing so presents a much larger risk to society as a whole."
- Anouk Ruhaak [2]
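The free-rider dynamic Ruhaak describes can be sketched as a toy payoff model: each person's disclosure yields a private benefit but imposes a small privacy cost on everyone in the group. All numbers below are illustrative assumptions, not figures from the source.

```python
# Toy model of privacy as a public good with a free-rider problem.
# Assumed, illustrative parameters (not from the source):
N = 100          # people in the group
BENEFIT = 5.0    # private gain from disclosing your own data
COST = 0.1       # privacy cost each disclosure imposes on EVERY group member

def personal_payoff(i_share: bool, others_sharing: int) -> float:
    """One person's payoff, given how many others disclose their data."""
    payoff = -COST * others_sharing      # harm from others' disclosures
    if i_share:
        payoff += BENEFIT - COST         # my private gain, minus my own exposure
    return payoff

# Individually, disclosing is always rational: it adds BENEFIT - COST > 0
# no matter what anyone else does.
gain_from_sharing = personal_payoff(True, 0) - personal_payoff(False, 0)

# But if everyone follows that individually rational logic, total welfare
# is far lower than if nobody had disclosed at all.
welfare_all_share = N * personal_payoff(True, N - 1)
welfare_none_share = N * personal_payoff(False, 0)

print(gain_from_sharing)     # positive: sharing pays for each individual
print(welfare_all_share)     # negative: the group as a whole is worse off
print(welfare_none_share)    # zero baseline
```

With these numbers, each disclosure nets the sharer 4.9 while costing the group 10 in aggregate, so everyone shares and collective welfare collapses: the coordination failure Fairfield and Engel identify.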
Key Resources
Key Articles
- Notes on Design Justice and Digital Technologies. By Sasha Costanza-Chock. [3]: an introduction to:
- Technologically Coded Authority: The Post-Industrial Decline in Bureaucratic Hierarchies. By A. Aneesh [4]
Key Books
- The Age of Surveillance Capitalism. By Shoshana Zuboff. Profile, 2019
- The Bleeding Edge. Why Technology Turns Toxic in an Unequal World. By Bob Hughes. New Internationalist Books, 2016 [5]
- Recursivity and Contingency. By Yuk Hui. Rowman & Littlefield International, 2019 [6]. Recommended by Bernard Stiegler: "Through a historical analysis of philosophy, computation and media, this book proposes a renewed relation between nature and technics." For details see: Towards a Renewed Relation Between Nature and Technics.
Pages in category "Protocols and Algorithms"
The following 141 pages are in this category, out of 141 total.
A
- Accountable Algorithms
- Age of Surveillance Capitalism
- AI Ethics Guidelines Global Inventory
- Alexander Galloway
- Alexander Galloway on Protocollary Power
- Algocracy
- Algocratic Governance
- Algocratic Modes of Organization for Global Labor Coordination
- Algorithm Observatory
- Algorithm Watch
- Algorithmic Accountability of Journalists
- Algorithmic Divide
- Algorithmic Fairness
- Algorithmic Food Justice
- Algorithmic Management
- Algorithmic Management in the Workplace
- Algorithmic Mechanism Design
- Algorithmic Policing
- Algorithmic Sovereignty
- Algorithmically–Defined Audiences
- Algorithms of Oppression
- Algorithms to Improve Labor and Union Bargaining Outcomes
- Algorithms, Capital, and the Automation of the Common
- Algotransparency
- Architectures of Control
- Automatic Society
- Automating Inequality
- Autonomy and Algorithmic Control in the Global Gig Economy
C
- Captology
- Carlos Castillo on How Algorithm Bias Impacts Fairness and Accessibility
- Case Studies Exploring Principles for Data Stewardship
- Cathy O'Neil on Algorithms as Harmful Weapons of Math Destruction
- Citizens Evolving from Data Providers to Decision-Makers in Barcelona
- Code Space
- Collective Consent
- Colonized by Data
- Computing Regime
- Consensus Algorithms in Public Blockchains
- Cooperating with Algorithms in the Workplace
- Critical Political Economy of Design
- Culturally Situated Design Tools
- Cybernetic Balance
- Cybernetics and Governance
- Cybernetics as an Antihumanism
- Cybernetics of the Commons
- Cybernetics Valuable to the Commons and for Understanding AI
P
- Panoptic Governance
- Persuasive Technology Lab
- Politics of Code in Web 2.0
- Post-Industrial Decline in Bureaucratic Hierarchies
- Principles for Accountable Algorithms
- Privacy as a Public Good
- Protocol
- Protocol Cooperativism
- Protocol Politics
- Protocollary Ownership
- Protocollary Power
- Protocols
- Protocols for P2P Democracy in Distributed Networks
- Protological Control
- Public Interest Algorithms
S
- Shoshana Zuboff on the Economics and Perils of Surveillance Capitalism
- Silent Works
- Social Architecture
- Social Impact Statement for Algorithms
- Social Machines
- Socially Robust and Enduring Computing
- Society of Control
- Software and Sovereignty
- Surveillance Capitalism and the Prospects of an Information Civilization
- Syllabus on Big Data and Digital Methods