Saturday, June 1, 2024

"AI Is a Hall of Mirrors"

The most famous Hall of Mirrors, the one at Versailles, has windows in the wall opposite the mirrors, looking out at the real world. Or as real as the grounds of a 17th-century Royal Palace can get.

From The New Atlantis, Spring 2024 edition:

LLMs are giving us a billion ripoffs of what we already are at a moment when we yearn for something new. 

“Today we launched ChatGPT,” a chirpy Sam Altman posted on Twitter one day in the fall of 2022.

Altman went on to describe a sort of ladder of progress that humanity had just begun to climb. “Soon, you will be able to have helpful assistants that talk to you, answer questions, and give advice,” he promised. After that, “you can have something that goes off and does tasks for you.” His creation will be our eager protégé — our bright intern, piping up judiciously. Then it will be our homunculus, trusted with the grunt work.

What next? “Eventually you can have something that goes off and discovers new knowledge for you.” At the top of the ladder will be an AI capable of seeking enlightenment … and bringing it back to us. “Talk to the computer (voice or text) and get what you want, for increasingly complex definitions of ‘want’!”

There is an uneasy pairing here with the way we are used to thinking about AI. According to Altman, AI is there to serve you. It will talk to you, do for you, discover for you. It will give you what you want. It will really give you what you want, under ever “increasingly complex definitions.”

But the classic idea found in sci-fi, in both utopian fantasies of the end of human toil and doomer prophecies of the end of human existence, is that AI goes beyond us. Everyone now arguing about the shifting-goalpost vision of “general” artificial intelligence agrees that it would be something capable of beating us at our own games, cognitively and otherwise. In OpenAI’s charter, the endgame is the creation of “autonomous systems” that “outperform humans at most economically valuable work.”

What is happening here, and for whom? The answer is an impossible dream, one that has been suffusing our digital lives for a while now. Call it the for you paradox.

(Not) For You 

Here is the paradox.

First: Everything is for you. TikTok’s signature page says it, and so, in their own way, do the recommendation engines of all social media. Streaming platforms triangulate your tastes, brand “engagements” solicit feedback for a better experience next time, Google Maps asks where you want to go, Siri and Alexa wait in limbo for our reply. Dating apps present our most “compatible” matches. Sacrifices in personal data pay (at least some) dividends in closer tailoring. Our phones fit our palms like lovers’ hands. Consumer goods reach us in two days or less, or, if we prefer, our mobile orders are ready when we walk into our local franchise. Touchless, frictionless, we move toward perfect inertia, skimming engineered curves in the direction of our anticipated desires.

Second: Nothing is for you. That is, you specifically, you as an individual human person, with three dimensions and password-retrieval answers that actually mean something. We all know by now that “the algorithm,” that godlike personification, is fickle. Targeted ads follow you after you buy the product. Spotify thinks lullabies are your jam because for a couple of weeks one put your child to sleep. Watch a political video, get invited down the primrose path to conspiracy. The truth of aggregation, of metadata, is that the for you of it all gets its power from modeling everyone who is not, in fact, you. You are typological, a predictable deviation from the mean. The “you” that your devices know is a shadow of where your data-peers have been. Worse, the “you” that your doctor, your insurance company, or your banker knows is a shadow of your demographic peers. And sometimes the model is arrayed against you. A 2016 ProPublica investigation found that if you are Black and coming up for sentencing before a judge who relies on a criminal sentencing algorithm, you are twice as likely to be mistakenly deemed at high risk for reoffending as your white counterpart.

Whoever you are, the algorithms’ for you promise at some point rings hollow. The simple math of automation is that the more the machines are there to talk to us, the less someone else will. Get told how important your call is to us, in endless perfect repetition. Prove you’re a person to Captcha, and (if you’re like me) sometimes fail. Post a comment on TikTok or YouTube knowing that it will be swallowed by its only likely reader, the optimizing feed.

Offline, the shadow of depersonalization follows. Physical spaces are atomized and standardized into what we have long been calling brick and mortar. QR, a language readable only to the machines, proliferates. The world becomes a little less legible. Want to order at this restaurant? You need your phone as translator, as intermediary, in this its newly native land.

The algorithm has an aesthetic in real life, which you’ll recognize if you’ve ever been in what Kyle Chayka has called “AirSpace” (and you have): the sameness of Japandi semi-midcentury modern, now shaded into ‘70s and ‘80s redux: large leafy plants, tasteful minimalism and its studied, equally samey instances of backlash; terrazzo, blond wood, concrete and soaring ceilings. Once this land was millennial pink, Edison bulbs, fiddle-leaf ferns, painted arches, velvet statement couches. It will keep evolving but you will know it regardless, from gentrifying coffee shops, exurb Airbnbs, hotel lobby makeovers, startup offices, direct-to-consumer pop-ups, or from the Instagrams of any of the same. Sit inside one of these spaces, anywhere, and ask: Is it for anyone? And who am I, if this is for me?

....MUCH MORE