Sunday, June 3, 2018

NYRB: "Big Brother Goes Digital" (Brynjolfsson, McAfee and marketing the surveillance state)

From the New York Review of Books, May 24:
Self-Tracking
by Gina Neff and Dawn Nafus
MIT Press, 248 pp., $15.95 (paper)

Sociometric Badges: State of the Art and Future Applications
by Daniel Olguín Olguín and Alex (Sandy) Pentland
IEEE 11th International Symposium on Wearable Computers, Boston, October 2007, available at vismod.media.mit.edu/tech-reports/TR-614.pdf
 
Machine, Platform, Crowd: Harnessing Our Digital Future
by Andrew McAfee and Erik Brynjolfsson
Norton, 402 pp., $29.95

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies
by Erik Brynjolfsson and Andrew McAfee
Norton, 306 pp., $26.95

In her seminal work The Managed Heart: Commercialization of Human Feeling (1983), the sociologist Arlie Russell Hochschild described a workplace practice known as “emotional labor management.” Hochschild was studying the extreme kinds of “emotional labor” that airline stewardesses, bill collectors, and shop assistants, among others, had to perform in their daily routines. They were obliged, in her words, “to induce or suppress feeling in order to sustain the outward countenance that produces the proper state of mind in others.” In the case of airline stewardesses, the managers and human resources staff of the airline companies relied on reports from passengers or management spies to make sure that stewardesses kept up their cheerful greetings and radiant smiles no matter what.

The stewardesses Hochschild studied were working under a regime of “scientific management,” a workplace control system conceived in the 1880s and 1890s by the engineer Frederick Winslow Taylor. Workers subject to such regimes follow precise, standardized routines drawn up by managers and undergo rigorous monitoring to ensure that these routines are followed to the letter. Taylor’s practice is often associated with such factory workplaces as the early Ford Motor plants or today’s Amazon “fulfillment centers,” where workers must perform their prescribed tasks on a strict schedule.

Hochschild showed that regimes of scientific management could be applied virtually anywhere. Her airline company managers aspired to control every aspect of their employees’ emotional conduct. What kept them from doing so was that they weren’t actually present in plane cabins during flights and so had to rely on haphazard reporting to confirm that the stewardesses were always behaving as they should. But in the twenty-first century, new technologies have emerged that enable companies as varied as Amazon, the British supermarket chain Tesco, Bank of America, Hitachi, and the management consultants Deloitte to achieve what Hochschild’s managers could only imagine: continuous oversight of their workers’ behavior.

These technologies are known as “ubiquitous computing.” They yield data less about how employees perform when working with computers and software systems than about how they behave away from the computer, whether in the workplace, the home, or in transit between the two. Many of the technologies are “wearables,” small devices worn on the body. Consumer wearables, from iPhones to smart watches to activity trackers like Fitbit, have become a familiar part of daily life; people can use them to track their heart rate when they exercise, monitor their insulin levels, or regulate their food consumption.

The new ubiquity of these devices has “raised concerns,” as the social scientists Gina Neff and Dawn Nafus write in their recent book Self-Tracking—easily the best book I’ve come across on the subject—“about the tremendous power given to already powerful corporations when people allow companies to peer into their lives through data.” But the more troubling sorts of wearables are those used by companies to monitor their workers directly. This application of ubiquitous computing belongs to a field called “people analytics,” or PA, a name made popular by Alex “Sandy” Pentland and his colleagues at MIT’s Media Lab.

Pentland has given PA a theoretical foundation and has packaged it in corporate-friendly forms. His wearables rely on many of the same technologies that appear in Self-Tracking, but also on one that does not: the sociometric badge. Worn around the neck and fitted with microphones and sensors, the badges record their subjects’ frequency of speaking, tone of voice, facial expressions, and body language. In Sociometric Badges: State of the Art and Future Applications (2007), Pentland and his colleague Daniel Olguín Olguín explained that the badges “automatically measure individual and collective patterns of behavior, predict human behavior from unconscious social signals, identify social affinity among individuals…and enhance social interactions by providing feedback.”

The badges and their associated software are being marketed by Humanyze, a Boston company cofounded by Pentland, Olguín Olguín, and Ben Waber among others (Waber was formerly one of Pentland’s researchers at MIT and is now the company’s CEO). Under its original name, Sociometric Solutions, the company got early commissions from the US Army and Bank of America. By 2016 Humanyze had among its clients a dozen Fortune 500 companies and Deloitte. In November 2017 it announced a partnership with HID Global, a leading provider of wearable identity badges, which allows HID to incorporate Humanyze’s technologies into its own products and so expands the use of such badges by US businesses.

The main tool in Humanyze’s version of PA is a digital diagram in which people wearing sociometric badges are represented by small circles arrayed around the circumference of a sphere, rather like the table settings for diners at a banquet. Each participant is linked to every other one by a straight line, the thickness of which depends on what the system considers the “quality” of their relationship based on the data their badges collect.
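The structure of such a diagram is simple enough to sketch. The snippet below is a minimal illustration, not Humanyze's actual software; the team members and the "quality" scores are invented. It arranges eight badge wearers around a circle and draws every pairwise link with a width proportional to its score:

```python
# A minimal illustration of the diagram described above: badge wearers
# arranged on a circle, every pair joined by a line whose thickness
# reflects an interaction "quality" score. The people and scores are
# invented placeholders, not output from any real sociometric system.
import itertools
import random

import matplotlib.pyplot as plt
import networkx as nx

people = ["A", "B", "C", "D", "E", "F", "G", "H"]  # an eight-person team
G = nx.Graph()
G.add_nodes_from(people)

random.seed(0)
for a, b in itertools.combinations(people, 2):
    # assign every pair an arbitrary relationship "quality" between 0 and 1
    G.add_edge(a, b, quality=random.random())

pos = nx.circular_layout(G)  # the "table settings" around the banquet table
widths = [4 * G[a][b]["quality"] for a, b in G.edges()]

nx.draw_networkx_nodes(G, pos, node_size=300)
nx.draw_networkx_labels(G, pos)
nx.draw_networkx_edges(G, pos, width=widths)
plt.axis("off")
plt.show()
```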

In a 2012 essay for the Harvard Business Review, Pentland described how this method was used to evaluate the performance of employees at a business meeting in Japan.1 The PA diagram for Day One showed that the lines emanating from two members of an eight-person team, both of whom happened to be Japanese, were looking decidedly thin. But by Day Seven, the diagrams were showing that the “Day 1 dominators” had “distributed their energy better” and that the two Japanese members were “contributing more to energy and engagement.” Evidently some determined managerial nudging had taken place between Days One and Seven. In a June 2016 interview with MEL Magazine, Waber claimed that little escapes the gaze of the sociometric badge and its associated technologies: “Even when you’re by yourself, you’re generating a lot of interesting data. Looking at your posture is indicative of the kind of work and the kind of conversation you’re having.”2

In a 2008 article Pentland commended his PA systems for being more rational and dependable than their human counterparts.3 But the “intelligence” of his and Waber’s PA systems is not that of disembodied artificial intelligence—whatever that may look like—but of corporate managers with certain ideas about how their subordinates should behave. The managers instruct their programmers to create algorithms that in turn embed these managerial preferences in the operations of the PA systems. Pentland and Waber’s PA regime is in fact a late variant of scientific management and descends directly from the “emotional labor management” Hochschild discussed in The Managed Heart. But these twenty-first-century systems have powers of surveillance and control that the HR managers of the airline companies thirty years ago could only dream of.

Not all PA systems depend on wearable devices. Some target landlines and cell phones. Behavox, a PA company financed by Citigroup, specializes in the surveillance of employees in financial services. “Emotion recognition and mapping in phone calls is increasingly something that banks really want from us,” Erkin Adylov, the company’s CEO, told a reporter in 2016.4 Behavox’s website advertises that its systems give “real-time and automatic tracking” of aspects of employee conversation like the “variability in the timing of replies, frequency in communications, use of emoticons, slang, sentiment and banter.” The company, in the words of a recent Bloomberg report,
scans petabytes of data, flagging anything that deviated from the norm for further investigation. That could be something as seemingly innocuous as shouting on a phone call, accessing a work computer in the middle of the night, or visiting the restroom more than colleagues.5
“If you don’t know what your employees are doing,” Adylov told another reporter in 2017, “then you’re vulnerable.”
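Behavox does not publish its models, but the basic pattern it describes, flagging whatever deviates from an employee's established baseline, is a familiar one. A minimal sketch, assuming nothing more than a z-score threshold and some invented daily metrics:

```python
# Illustrative only: flag any metric whose value today lies far outside an
# employee's historical norm. Behavox's actual models are not public; this
# is just the generic "deviation from baseline" idea, with invented metrics.
from statistics import mean, stdev

def flag_deviations(history, today, threshold=3.0):
    """Return the metrics that today sit more than `threshold` standard
    deviations away from the employee's historical average."""
    flagged = []
    for metric, values in history.items():
        mu, sigma = mean(values), stdev(values)
        if sigma and abs(today[metric] - mu) / sigma > threshold:
            flagged.append(metric)
    return flagged

history = {
    "after_hours_logins": [0, 1, 0, 0, 1, 0, 0],
    "raised_voice_calls": [0, 0, 1, 0, 0, 0, 0],
}
today = {"after_hours_logins": 6, "raised_voice_calls": 0}
print(flag_deviations(history, today))  # ['after_hours_logins']
```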

Most PA software providers rely on combinations of wearables and computer-based technologies to monitor and control workplace behavior. These companies boast that their systems can find out virtually everything there is to know about employees, both in the workplace and outside it. “Thanks to modern technology,” in the words of Hubstaff, a PA company based in Indianapolis, “companies can monitor almost 100 percent of employee activity and communication.”6

Max Simkoff, the cofounder of San Francisco’s Evolv Corporation (now taken over by Cornerstone, another Humanyze competitor), has said that his PA systems can analyze more than half a billion employee data points across seventeen countries and that “every week we figure out more things to track.” Kronos Incorporated, a management software firm based in Lowell, Massachusetts, claims that its workforce management systems are used daily by “more than 40 million people” and offer “immediate insight into…productivity metrics at massive scale.”7

Microsoft entered the PA market when it acquired the Seattle-based company Volometrix in 2015. It inherited Volometrix’s “Network Efficiency Index” (NEI), which measures how efficiently employees build and maintain their “internal networks.” The index is calculated by dividing “the total number of hours spent emailing and meeting with other employees” by the number of “network connections” an employee manages to secure. The NEI’s recognition of an employee’s network connection depends on whether encounters with coworkers have met both a “frequency of interaction threshold” and “an intimacy of interaction threshold,” the latter of which is satisfied when there are “2 or more interactions per month which include 5 or fewer people total.”8
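Spelled out as arithmetic, the index as described works roughly as follows; the function names and sample figures below are illustrative, not Volometrix's or Microsoft's actual code:

```python
# A rough reconstruction of the Network Efficiency Index as described above.
# The helper names and sample numbers are mine; they are not Volometrix's or
# Microsoft's actual implementation.
def count_connections(interactions):
    """Count coworkers who clear both thresholds: 2 or more interactions per
    month, each involving 5 or fewer people in total."""
    connections = 0
    for coworker, meetings in interactions.items():
        qualifying = [m for m in meetings if m["people"] <= 5]
        if len(qualifying) >= 2:
            connections += 1
    return connections

def network_efficiency_index(hours_emailing_and_meeting, interactions):
    """Hours spent emailing and meeting divided by qualifying connections."""
    connections = count_connections(interactions)
    return hours_emailing_and_meeting / connections if connections else float("inf")

interactions = {
    "colleague_1": [{"people": 3}, {"people": 2}],  # two small meetings: qualifies
    "colleague_2": [{"people": 8}, {"people": 6}],  # meetings too large
    "colleague_3": [{"people": 4}],                 # too infrequent
}
print(network_efficiency_index(30.0, interactions))  # 30 hours / 1 connection = 30.0
```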

When workers fail to meet these thresholds, other workplace technologies can be enlisted to give them a nudge. One Humanyze client created a robotic coffee machine that responded to data collected from sociometric badges worn by nearby employees. By connecting to Humanyze’s Application Programming Interface (API), the coffee machine could assess when a given group of workers needed to interact more; it would then wheel itself to wherever it could best encourage that group to mingle by dispensing lattes and cappuccinos.9
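The pattern at work, polling an analytics API and acting on the result, might look something like the sketch below. The endpoint, JSON field, and threshold are hypothetical; Humanyze's actual API is not documented here:

```python
# A hypothetical sketch of the pattern described above: poll an analytics API
# for a team's interaction score and, when it falls below a chosen threshold,
# send the robotic coffee machine its way. The URL, JSON field, and threshold
# are invented for illustration; Humanyze's real API is not documented here.
import time

import requests

ANALYTICS_URL = "https://analytics.example.com/teams/{team_id}/interaction"  # placeholder
INTERACTION_THRESHOLD = 0.4  # arbitrary cutoff for "this group should mingle more"

def dispatch_coffee_machine(team_id):
    print(f"Sending the coffee machine toward team {team_id}")

def poll_and_nudge(team_id):
    response = requests.get(ANALYTICS_URL.format(team_id=team_id), timeout=10)
    response.raise_for_status()
    score = response.json()["interaction_score"]  # assumed field name
    if score < INTERACTION_THRESHOLD:
        dispatch_coffee_machine(team_id)

if __name__ == "__main__":
    while True:
        poll_and_nudge("team-42")
        time.sleep(15 * 60)  # check every fifteen minutes
```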

When American managers want to install PA surveillance systems, employees rarely manage to stop them. In Britain, an exception to this trend occurred in January 2016, when journalists at the London office of the Daily Telegraph came to work one Monday and found that management had affixed small black boxes to the undersides of their desks that used heat and motion sensors to track whether or not they were busy at any given time. Seamus Dooley of the UK National Union of Journalists told The Guardian that “the NUJ will resist Big Brother–style surveillance in the newsroom.” The boxes were removed.10

The Telegraph’s journalists were right to act as they did. A 2017 paper by the National Workrights Institute in Washington, D.C.,11 cites a wealth of academic research on the physical and psychological toll that intrusive workplace monitoring can take on employees. A study by the Department of Industrial Engineering at the University of Wisconsin has shown that the introduction of intense employee monitoring at seven AT&T-owned companies led to a 27 percent increase in occurrences of pain or stiffness in the shoulders, a 23 percent increase in occurrences of neck pressure, and a 21 percent increase in back pain. Other research has suggested that the psychological effects of these technologies can be equally severe. Many of Bell Canada’s long-distance and directory assistance employees have to meet preestablished average work times (AWTs). Seventy percent of the workers surveyed in one study reported that they had “difficulty in serving a customer well” while “still keeping call-time down,” which they said contributed to their feelings of stress to “a large or very large extent.”

How have the corporate information-technology community and its academic allies justified these practices and the violations of human dignity and autonomy they entail? Among economists, Erik Brynjolfsson at MIT is perhaps the leading counsel for the defense. With Andrew McAfee, also of MIT, he has published two books to this end, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies (2014) and Machine, Platform, Crowd: Harnessing Our Digital Future (2017), the latter clearly written with a corporate audience in mind.

In the opening chapter of Machine, Platform, Crowd, they write that “our goal for this book is to help you.” The “you” in question is a corporate CEO, CIO, or senior executive who might be saddled with obsolete technologies—in Brynjolfsson and McAfee’s words, “the early-twenty-first-century equivalent of steam engines.” Each subsequent chapter ends with a series of questions aimed at such readers: “Are you systematically and rigorously tracking the performance over time of your decisions?”

Although the use of information technology in the workplace is a dominant theme of Brynjolfsson and McAfee’s two books, the authors say nothing about the surveillance powers of people analytics or its predecessors, whose existence cannot easily be reconciled with the glowing vision they describe in the opening chapters of The Second Machine Age. There are, for instance, eighteen references to Amazon in The Second Machine Age and Machine, Platform, Crowd. All of them are to technological breakthroughs like the company’s “recommendation engine,” which reduces search costs so that “with a few clicks over two million books can be found and purchased.”...MUCH MORE