Mutual Aid for Accountability

From P2P Foundation


Description

J. NATHAN MATIAS:

"Mechanical Turk, which has no minimum wage, is a free market for digital labor. Only the collective decisions of workers and requesters determine wages and working conditions. Since Amazon provides no way for workers to rate employers, workers can’t always anticipate if they will be treated well or paid fairly. As a result, making a living on Mechanical Turk is a precarious venture, with few company policies and a mostly hands-off attitude from Amazon.


Milland and other regular Turkers navigate this precariously free market with Turkopticon, a DIY technology for rating employers created in 2008. To use it, workers install a browser plugin that extends Amazon's website with special rating features. Before accepting a new task, workers check how others have rated the employer. After finishing, they can also leave their own rating of how well they were treated.

Collective rating on Turkopticon is an act of citizenship in the digital world. This digital citizenship acknowledges that online experiences are as much a part of our common life as our schools, sidewalks, and rivers—requiring as much stewardship, vigilance, and improvement as anything else we share.

“How do you fix a broken system that isn't yours to repair?” That’s the question that motivated the researchers Lilly Irani and Six Silberman to create Turkopticon, and it’s one that comes up frequently in digital environments dominated by large platforms with hands-off policies. (On social networks like Twitter, for example, harassment is a problem for many users.) Irani and Silberman describe Turkopticon as a “mutual aid for accountability” technology, a system that coordinates peer support to hold others accountable when platforms choose not to step in.


Mutual aid accountability is a growing response to the complex social problems people face online. On Twitter, systems like The Block Bot and BlockTogether coordinate collective judgments about alleged online harassers. The systems then collectively block tweets from accounts that a group prefers not to hear from. Last month, the advocacy organization Hollaback raised over $20,0."00 on Kickstarter to create support networks for people experiencing harassment. In November, I worked with the advocacy organization Women, Action, and the Media, which took a role as "authorized reporter" with Twitter. For three weeks WAM! accepted reports, sorted evidence, and forwarded serious cases to Twitter. In response, the company warned, suspended, and deleted the accounts of many alleged harassers.


These mutual aid technologies operate in the shadow of larger systems with gaps in how people are supported—even when platforms do step in, says Stuart Geiger, a Berkeley Ph.D. student. In other words, sometimes a platform’s system-wide solutions to a problem can create their own problems. For several years, Geiger and his colleague Aaron Halfaker, now a researcher at Wikimedia, were concerned that Wikipedia’s semi-automated anti-vandalism systems might be making the site unfriendly. As a graduate student unable to change Wikipedia’s code, Halfaker created Snuggle, a mutual-aid mentorship technology that tracks the site’s spam responders. When Snuggle users think a newcomer’s edits were mistakenly flagged as spam, the software coordinates Wikipedians to help those users recover from the negative experience of getting revoked.

By organizing peer support at scale, the designers of Turkopticon and its cousins draw attention to common problems, hoping to influence longer-term change on a complex issue. In time, the idea goes, requesters on Mechanical Turk might change their treatment of workers, Amazon might change its policies and software, or regulators might set new rules for digital labor. This is an approach with a long history in an area that might seem unlikely: the conservation movement. (Silberman and Irani cite the movement as inspiration for Turkopticon.)

To better understand how this approach might influence digital citizenship, I followed the history of mutual-aid accountability in a precious common network that the city of Boston enjoys every day: the Charles River.


...


The Tragedy of the Digital Commons
Advocates for fairer, safer online spaces are turning to the conservation movement for inspiration. (The Atlantic, June 8, 2015)



When her husband lost his job in 2010, Kristy Milland realized how important the Internet had become to her family's survival. For several years, the 30-something Canadian high-school graduate had a hobby of completing paid micro-tasks on Amazon's Mechanical Turk, an online marketplace that sells crowdsourced labor. She answered surveys, tagged images, and trained artificial intelligences for a few cents or dollars a task. In time, Milland became community manager of TurkerNation, one of several major forums for worker discussion and peer support.

With bills looming, Milland realized, "I had to turn this into a real gig." Now that her digital work was her family's primary income, she felt for the first time how hard it was to make ends meet. Mechanical Turk, which has no minimum wage, is a free market for digital labor. Only the collective decisions of workers and requesters determine wages and working conditions. Since Amazon provides no way for workers to rate employers, workers can’t always anticipate if they will be treated well or paid fairly. As a result, making a living on Mechanical Turk is a precarious venture, with few company policies and a mostly hands-off attitude from Amazon.


Milland and other regular Turkers navigate this precariously free market with Turkopticon, a DIY technology for rating employers created in 2008. To use it, workers install a browser plugin that extends Amazon's website with special rating features. Before accepting a new task, workers check how others have rated the employer. After finishing, they can also leave their own rating of how well they were treated.
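The plugin pattern can be sketched in a few lines. What follows is a minimal illustrative sketch, not Turkopticon's actual code; the rating service URL, the response shape, and the page selectors are all hypothetical assumptions:

// Hypothetical content script for a Turkopticon-style rating overlay.
// The rating API endpoint, data shape, and DOM selectors are illustrative only.
interface RequesterRating {
  requesterId: string;
  fairPay: number;     // average 1-5 score from worker reviews
  reviewCount: number;
}

// Fetch community ratings for the requester IDs found on the page.
async function fetchRatings(ids: string[]): Promise<Map<string, RequesterRating>> {
  const res = await fetch(`https://ratings.example.org/api?ids=${ids.join(",")}`);
  const ratings: RequesterRating[] = await res.json();
  return new Map(ratings.map((r) => [r.requesterId, r]));
}

// Annotate each task listing with the requester's reputation so workers
// can check it before accepting a task.
async function annotateTaskListings(): Promise<void> {
  const rows = document.querySelectorAll<HTMLElement>("[data-requester-id]");
  const ids = Array.from(rows, (row) => row.dataset.requesterId!);
  const ratings = await fetchRatings(ids);
  for (const row of rows) {
    const rating = ratings.get(row.dataset.requesterId!);
    const badge = document.createElement("span");
    badge.textContent = rating
      ? ` fair pay: ${rating.fairPay.toFixed(1)}/5 (${rating.reviewCount} reviews)`
      : " no reviews yet";
    row.appendChild(badge);
  }
}

annotateTaskListings();

The after-task rating flow would post a review back to the same service; the essential design choice is that reputation data lives outside Amazon's control.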

Collective rating on Turkopticon is an act of citizenship in the digital world. This digital citizenship acknowledges that online experiences are as much a part of our common life as our schools, sidewalks, and rivers—requiring as much stewardship, vigilance, and improvement as anything else we share.

“How do you fix a broken system that isn't yours to repair?” That’s the question that motivated the researchers Lilly Irani and Six Silberman to create Turkopticon, and it’s one that comes up frequently in digital environments dominated by large platforms with hands-off policies. (On social networks like Twitter, for example, harassment is a problem for many users.) Irani and Silberman describe Turkopticon as a “mutual aid for accountability” technology, a system that coordinates peer support to hold others accountable when platforms choose not to step in.

Mutual aid accountability is a growing response to the complex social problems people face online. On Twitter, systems like The Block Bot and BlockTogether coordinate collective judgments about alleged online harassers. The systems then collectively block tweets from accounts that a group prefers not to hear from. Last month, the advocacy organization Hollaback raised over $20,000 on Kickstarter to create support networks for people experiencing harassment. In November, I worked with the advocacy organization Women, Action, and the Media, which took a role as "authorized reporter" with Twitter. For three weeks WAM! accepted reports, sorted evidence, and forwarded serious cases to Twitter. In response, the company warned, suspended, and deleted the accounts of many alleged harassers.
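The coordination behind such blocking tools can be sketched in the same spirit. This is an illustrative sketch, not the actual code of The Block Bot or BlockTogether, and the blockAccount callback is a hypothetical stand-in for a real platform API:

// Illustrative shared-blocklist subscriber: a curator publishes a set of
// account IDs, and subscribers apply any entries they haven't blocked yet.
type AccountId = string;

interface SharedBlocklist {
  curator: string;
  blocked: Set<AccountId>;
}

class BlocklistSubscriber {
  private applied = new Set<AccountId>();

  // blockAccount stands in for a real platform API call.
  constructor(private blockAccount: (id: AccountId) => Promise<void>) {}

  // Apply every entry from a subscribed list that hasn't been applied yet.
  async sync(list: SharedBlocklist): Promise<void> {
    for (const id of list.blocked) {
      if (!this.applied.has(id)) {
        await this.blockAccount(id);
        this.applied.add(id);
      }
    }
  }
}

A real deployment would also need removal and appeal paths, so that accounts delisted by the curator are unblocked on a later sync.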


These mutual aid technologies operate in the shadow of larger systems with gaps in how people are supported—even when platforms do step in, says Stuart Geiger, a Berkeley Ph.D. student. In other words, sometimes a platform’s system-wide solutions to a problem can create their own problems. For several years, Geiger and his colleague Aaron Halfaker, now a researcher at Wikimedia, were concerned that Wikipedia’s semi-automated anti-vandalism systems might be making the site unfriendly. As a graduate student unable to change Wikipedia’s code, Halfaker created Snuggle, a mutual-aid mentorship technology that tracks the site’s spam responders. When Snuggle users think a newcomer’s edits were mistakenly flagged as spam, the software coordinates Wikipedians to help those users recover from the negative experience of getting reverted.

By organizing peer support at scale, the designers of Turkopticon and its cousins draw attention to common problems, hoping to influence longer-term change on a complex issue. In time, the idea goes, requesters on Mechanical Turk might change their treatment of workers, Amazon might change its policies and software, or regulators might set new rules for digital labor. This is an approach with a long history in an area that might seem unlikely: the conservation movement. (Silberman and Irani cite the movement as inspiration for Turkopticon.)

To better understand how this approach might influence digital citizenship, I followed the history of mutual-aid accountability in a precious common network that the city of Boston enjoys every day: the Charles River. Planned, re-routed, exploited and contested, it has inspired and supported human life since before written history.


As early as 3200 B.C. and continuing for over 1,500 years, Native Americans re-routed the flow of water near Boston to catch fish in constructions that covered over two acres. The food and fertilizer supplied a sizable community until rising water levels made their economy unsustainable. Colonial dams and bridges were constructed on the Charles from the 1640s, and Harvard University was partly funded by ferry and bridge tolls for nearly 200 years. Across this river, Paul Revere received covert optical transmissions about British military movements from Old North Church. Two months later, British warships would sail its waters in an attempt to capture Bunker Hill. In the 19th century, Henry Wadsworth Longfellow crossed it to see his sweetheart Frances Appleton, writing of the Charles River:

As long as the heart has passions,

As long as life has woes;

The moon and its broken reflection

And its shadows shall appear,

As the symbol of love in heaven,

And its wavering image here.

Longfellow’s poem didn't mention the pollution from 43 mills along the riverbanks that prompted the government to abandon the idea of cleanup efforts in 1875. Absent from his poem, too, are the chemical spills from the Watertown Arsenal, later designated a Superfund site by the Environmental Protection Agency, or the municipal sewage systems that fed directly into the river. Nor was this problem solely created by institutions. In the 1950s, when the river's toxic pink and orange waters were closed to swimmers, Bernard DeVoto described an informal landfill along the river in Harper's Magazine as “Hell's Half Acre.” Urging Bostonians to take action, DeVoto lamented that the river had become “foul and noisome, polluted by offal and industrial wastes, scummy with oil, unlikely to be mistaken for water.”

When the ecologist Garrett Hardin set out to write his famous 1968 article on problems with “no technical solution,” “The Tragedy of the Commons,” he could have been describing the Charles. Hardin imagines open grazing areas managed by multiple herders who destroy their precious common when each rationally seeks to maximize personal gain. The problems of digital labor can also be interpreted through this tragedy. With a Mechanical Turk worker turnover rate of 69 percent every six months, requesters tend to seek the minimum price for someone’s labor, and workers compete for diminishing pay. With minimal accountability for the companies requesting work and limited intervention from Amazon, attractive stories of flexible, livable income from digital labor remain as partially true as Longfellow’s poetic image of the beautiful river Charles.

Academics advancing the idea of digital commons have tended to focus on how to prevent or regulate these problems—after they're identified. In Code and Other Laws of Cyberspace, Larry Lessig describes software design as a kind of regulation separate from top-down policies or community norms. Sixteen years after Lessig’s book, belief in the power of code and social psychology to shape successful online communities is widespread among the design teams who govern our digital lives. Their growing toolbox of design options is detailed in a recent law review article by James Grimmelmann, who covers everything from banning and shaming to reputation and rewards. In this view, perhaps Mechanical Turk could become fairer if Amazon added the right buttons, set the right default wage, or changed its design to activate just the right motivations. “If they understand we're human, they will treat us better.”

If code is law online and platform designers are its legislators, who identifies the problems and sets the goals for those laws? Throughout her career, the Nobel Prize-winning economist Elinor Ostrom studied the successes of monitoring programs run by the communities connected to the resources they share. Like Hardin, Ostrom saw no single technical or legal solution to the complex problems of common resources. Yet in place of Hardin's selfish freeloading herders, Ostrom described a cooperative and “co-evolutionary race,” a struggle among the frenemies who share common spaces. Ostrom observed that in well-managed common resources, monitoring is part of wider systems in which users hold each other in check for the common good. On the Charles River, this monitoring keeps people safe while also supporting long-term change.

Two summers ago, officials declared the Charles River safe for public swimming after a 50-year ban. Throughout that period, boaters kept each other safe through community monitoring. Every summer morning, boathouses along the river raise colored flags to signal the water's estimated bacteria level. A red flag signals high levels of E. coli, a blue flag means a river safe for boating, and a yellow flag warns of inconclusive statistical predictions. Behind each morning’s colored flag is the story of a decades-long struggle among citizen groups, scientists, planners, local companies, and government to reverse the tragedy of the Charles.

Where Turkers click buttons to rate employers, the river’s users dunk bottles to monitor water quality. Every month, teams of volunteers on bridges, banks, and boats drop open bottles into the Charles River at 30 locations along its 80-mile length. After capping the bottles and taking notes on river flow and the weather, volunteers pass samples to the Charles River Watershed Association (CRWA), a ritual they have maintained for 20 years. CRWA staff scientists analyze the samples and add the data to statistical models they developed to predict river bacteria. By using a statistical model of weather and river temperature, CRWA can offer daily predictions of boating safety. When conditions are less predictable, the boathouses that “publish” this data-visualization fly a yellow flag to signal that uncertainty. Data from the samples is also used to hold polluters accountable and advocate for change.
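The flag decision itself is a threshold rule layered on that statistical prediction. The sketch below is illustrative, not CRWA's actual model; the coefficients, features, and cutoffs are all hypothetical assumptions:

// Illustrative flag logic: a toy logistic model predicts the chance that
// bacteria exceed the boating standard, and the flag color encodes both
// the prediction and its uncertainty. All numbers here are made up.
type Flag = "blue" | "yellow" | "red";

interface Conditions {
  rainfallLast48h: number; // inches of rain, a strong driver of sewage overflows
  waterTempC: number;
}

// Stand-in for CRWA's statistical model of weather and river temperature.
function exceedanceProbability(c: Conditions): number {
  const score = -2.0 + 1.5 * c.rainfallLast48h + 0.05 * c.waterTempC;
  return 1 / (1 + Math.exp(-score));
}

// Blue when confidently safe, red when confidently unsafe, and yellow
// when the prediction is too uncertain to call either way.
function chooseFlag(c: Conditions): Flag {
  const p = exceedanceProbability(c);
  if (p < 0.2) return "blue";
  if (p > 0.6) return "red";
  return "yellow";
}

console.log(chooseFlag({ rainfallLast48h: 1.2, waterTempC: 22 })); // "red"

The yellow flag is the notable design choice: rather than forcing every uncertain prediction into "safe" or "unsafe," the system publishes its own uncertainty.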


Successful mutual aid doesn't guarantee wider change. The creators of Turkopticon sometimes worry that patching up problems for only some people might make wider change even harder to achieve. Seven years later, Amazon still hasn't added work-requester accountability to its platform. (A spokesperson told me that “our customers are telling us this feature is valuable and we’re looking at ways in which we can offer it.”) For now, new workers are expected to find and use Turkopticon on their own. On the Charles River, however, mutual aid accountability has been key to its transformation.

“Our policy work is based on our science,” says Margaret Van Deusen, the director of Charles River Watershed Association's law, advocacy, and policy work. Cleaning up a river requires careful monitoring to identify sources of problems and judge the effectiveness of new cleanup ideas. Founded in 1965, the CRWA has cajoled and supported federal and local government to clean up pollution along the river, including a military research lab, hospital waste, and sewer systems. According to Elisabeth Ciancola, the aquatic scientist who manages the citizen monitoring program, citizen data from all those bottles helped CRWA successfully advocate to close sewer runoffs along the river, reducing pathogen-rich overflows by 98 percent.

Online, though, it is hard to bottle a representative sample of a common problem. When monitoring a river, volunteers can take samples at carefully specified locations without trespassing or violating privacy. Those samples can be analyzed using standardized methods with verified accuracy. In contrast, mutual-aid accountability systems like Turkopticon depend on those who use them and on the subjective judgments of the people who provide mutual aid. “Turkopticon's intents are great,” says Kristy Milland, the digital worker in Canada. Ratings are “a good suggestion but not necessarily 100 percent accurate.”

This subjective knowledge might be an advantage, says Stuart Geiger, the researcher who studies Wikipedia. Quoting the science and technology scholar Donna Haraway, Geiger says that “situated knowledge” from the people facing a problem can give us “a more adequate, richer, better account of a world, in order to live it well.” Geiger's own Wikipedia research merges this situated knowledge with quantitative methods that are designed to offer a representative understanding of behavior on the site.

But citizen monitoring only drives wider change when monitoring groups are able to convince powerful entities to take them seriously. Building on their citizen science, the CRWA calculated a “pollution budget” of how much the river could handle, an idea that became part of state pollution laws in 2011. The metric for “total maximum daily load” gives advocates an opportunity to educate planners on the consequences of their proposals and hold them accountable for those consequences, says Van Deusen, the CRWA director. Other situations require powerful allies. In a recent controversy over medical waste from an abandoned hospital, the CRWA and a local city council successfully pressured the state to expand its cleanup measures.
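(For context, from standard EPA practice rather than the article itself: such a pollution budget is conventionally written TMDL = ΣWLA + ΣLA + MOS, where the total maximum daily load is the sum of waste load allocations to point sources, load allocations to nonpoint sources, and a margin of safety.)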

“Even if Twitter or Mechanical Turk or Wikipedia die in 10 years and something new replaces all of them, we're still going to have these issues.” (http://www.theatlantic.com/technology/archive/2015/06/the-tragedy-of-the-digital-commons/395129/)