Saturday, December 11, 2021

Nudge This: "The Algorithmic Self"

First posted March 16, 2015. The writer, Frank Pasquale, is Professor of Law at Brooklyn Law School and author of The Black Box Society: The Secret Algorithms That Control Money and Information.

And, on the off chance Bloomberg's Matt Levine should see this, 38 footnotes!

From The Hedgehog Review, Spring 2015:

At a recent conference on public health, nutrition expert Kelly Brownell tried to explain our new food environment by making some striking comparisons. First, he contrasted the coca leaf—chewed for pain relief for thousands of years by indigenous people in South America, with little ill effect—with cocaine, a highly addictive, mind-altering substance. Then he contrasted a cob of corn with a highly processed piece of candy derived from corn syrup. Nutritious in its natural state, the concentrated sugar in corn can spark unhealthy, even addictive behaviors once poured into candy. With corn and with coca, the dose makes the poison, as Paracelsus put it. And in the modern era of “food science,” dozens of analysts may be spending millions of dollars just to perfect the “mouthfeel” and flavor profile of a single brand of chips.1

Should we be surprised, then, that Americans are losing the battle of the bulge? Indeed, the real wonder is not that two-thirds of the US population is overweight, but that one-third remains “normal,” to use an adjective that makes sense only in relation to an earlier era’s norms.2

For many technology enthusiasts, the answer to the obesity epidemic—and many other problems—lies in computational countermeasures to the wiles of the food scientists.3 App developers are pioneering behavioristic interventions to make calorie counting and exercise prompts automatic.4 For example, users of a new gadget, the Pavlok wristband, can program it to give them an electronic shock if they miss exercise targets. But can such stimuli break through the blooming, buzzing distractions of instant gratification on offer in so many rival games and apps? Moreover, is there another way of conceptualizing our relationship to our surroundings than as a suboptimal system of stimulus and response?

Some of our subtlest, most incisive cultural critics have offered alternatives. Rather than acquiesce to our manipulability, they urge us to become more conscious of its sources—be they intrusive advertisements or computers that we (think we) control. For example, Sherry Turkle, founder and director of the MIT Initiative on Technology and Self, sees excessive engagement with gadgets as a substitution of the “machinic” for the human—the “cheap date” of robotized interaction standing in for the more unpredictable but ultimately challenging and rewarding negotiation of friendship, love, and collegiality. In The Glass Cage, Nicholas Carr critiques the replacement of human skill with computer mediation that, while initially liberating, threatens to sap the reserves of ingenuity and creativity that enabled the computation in the first place.5

Beyond the psychological, there is a political dimension, too. Legal theorist and Georgetown University law professor Julie Cohen warns of the dangers of “modulation,” which enables advertisers, media executives, political consultants, and intelligence operatives to deploy opaque algorithms to monitor and manipulate behavior. Cultural critic Rob Horning ups the ante on the concerns of Cohen and Turkle with a series of essays dissecting feedback loops among surveillance entities, the capture of important information, and self-readjusting computational interventions designed to channel behavior and thought into ever-narrower channels. Horning also criticizes Carr for failing to emphasize the almost irresistible economic logic behind algorithmic self-making—at first for competitive advantage, then, ultimately, for survival.6

To negotiate contemporary algorithms of reputation and search—ranging from resumé optimization on LinkedIn to strategic Facebook status updates to OkCupid profile grooming—we are increasingly called on to adopt an algorithmic self, one well practiced in strategic self-promotion. This algorithmic selfhood may be critical to finding job opportunities (or even maintaining a reliable circle of friends and family) in an era of accelerating social change. But it can also become self-defeating. Consider, for instance, the self-promoter whose status updates on Facebook or LinkedIn gradually tip from informative to annoying. Or the search engine-optimizing website whose tactics become a bit too aggressive, thereby causing it to run afoul of Google’s web spam team and consequently sink into obscurity. The algorithms remain stubbornly opaque amid rapidly changing social norms. A cyber-vertigo results, as we are pressed to promote our algorithmic selves but puzzled over the best way to do so.

This is not an entirely new problem: We have always competed for better deals, for popularity, for prominence as an authority or a desirable person. But just as our metabolic systems may be ill adapted to a world of cheap, hidden sugar, the social cues and instinctive emotional responses that we’ve developed over evolutionary time are not adequate guides to the platforms on which our algorithmic selves now must compete and cooperate. To navigate them properly, we need the help of thoughtful observers who can understand today’s strategies of self-making within a larger historical and normative context....

....MUCH MORE

Previously on Nudge This:
Nudge This: "The Internet of Things Will Be a Giant Persuasion Machine"
Nudge This: "Yes, You’re Irrational, and Yes, That’s OK"
Behavior: We Are More Rational Than Those who Try To 'Nudge' Us