From Racked:
The message of many things in America is “Like this or die.”
— George W.S. Trow, Within the Context of No Context, 1980
The Seeing Robot
The camera is a small, white, curvilinear monolith on a
pedestal. Inside its smooth casing are a microphone, a speaker, and an
eye-like lens. After I set it up on a shelf, it tells me to look
straight at it and to be sure to smile! The light blinks and
then the camera flashes. A head-to-toe picture appears on my phone of a
view I’m only used to seeing in large mirrors: me, standing awkwardly in
my apartment, wearing a very average weekday outfit. The background is
blurred like evidence from a crime scene. It is not a flattering image.
Amazon’s
Echo Look, currently available by invitation only but also
on eBay,
allows you to take hands-free selfies and evaluate your fashion
choices. “Now Alexa helps you look your best,” the product description
promises. Stand in front of the camera, take photos of two different
outfits with the Echo Look, and then select the best ones on your
phone’s Echo Look app. Within about a minute, Alexa will tell you which
set of clothes looks better, a verdict produced by style-analyzing
algorithms with some assistance from humans. So I try to find my most stylish outfit,
swapping out shirts and pants and then posing stiffly for the camera. I
shout, “Alexa, judge me!” but apparently that’s unnecessary.
What I discover from the Style Check™ function is as
follows: All-black is better than all-gray. Rolled-up sleeves are better
than buttoned at the wrist. Blue jeans are best. Popping your collar is
actually good. Each outfit in the comparison receives a percentage out
of 100: black clothes score 73 percent against gray clothes at 27
percent, for example. But the explanations given for the scores are
indecipherable. “The way you styled those pieces looks better,” the app
tells me. “Sizing is better.” How did I style them? Should they be
bigger or smaller?
The Echo Look won’t tell you why it’s making its
decisions. And yet it purports to show us our ideal style, just as
algorithms like Netflix recommendations, Spotify Discover, and Facebook
and YouTube feeds promise us an ideal version of cultural consumption
tailored to our personal desires. In fact, this promise is inherent in
the technology itself: Algorithms, as I’ll loosely define them, are sets
of equations that work through machine learning to customize the
delivery of content to individuals, prioritizing what they think we
want, and evolving over time based on what we engage with.
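To make that loose definition a little more concrete, here is a toy sketch in Python of the feedback loop such systems run on: score each item by a weighted guess at what we like, surface the highest-scoring ones, then nudge the weights toward whatever we actually engage with. The item names, features, and numbers below are invented for illustration; this is only the general shape of the idea, not how Amazon, Netflix, or anyone else actually builds it.

# Toy illustration of an engagement-driven recommender: predict appeal,
# rank items, and shift the taste profile toward whatever the user clicks.
# All features and numbers are made up for the sake of the example.

ITEMS = {
    "black-outfit": {"dark": 1.0, "casual": 0.4},
    "gray-outfit":  {"dark": 0.5, "casual": 0.6},
    "blue-jeans":   {"dark": 0.3, "casual": 1.0},
}

# The user's taste profile, as far as the system is concerned.
weights = {"dark": 0.1, "casual": 0.1}

def score(item_features):
    """Predicted appeal: a weighted sum of the item's features."""
    return sum(weights[f] * v for f, v in item_features.items())

def recommend():
    """Rank items by what the model currently thinks we want."""
    return sorted(ITEMS, key=lambda name: score(ITEMS[name]), reverse=True)

def record_engagement(item_name, liked=True, learning_rate=0.1):
    """Evolve the profile based on what we engaged with."""
    sign = 1 if liked else -1
    for f, v in ITEMS[item_name].items():
        weights[f] += sign * learning_rate * v

print(recommend())               # the system's initial ranking
record_engagement("blue-jeans")  # the user clicks on blue jeans
print(recommend())               # the ranking tilts toward "casual" items

Run it and the ranking starts with the system's priors, then shifts toward "casual" items after a single simulated click; repeat that loop millions of times across millions of people and you have, roughly, a feed.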
Confronting the Echo Look’s opaque statements on my
fashion sense, I realize that all of these algorithmic experiences are
matters of taste: the question of what we like and why we like it, and
what it means that taste is increasingly dictated by black-box robots
like the camera on my shelf.
Theories of Taste
In his 2017 book Taste, the Italian philosopher
Giorgio Agamben digs up the roots of the word. Historically, taste has been
defined as a form of knowledge through pleasure, from perceiving the
flavor of food to judging the quality of an object. Taste is an
essentially human capacity, to the point that it is almost subconscious:
We know whether we like something or not before we understand why.
“Taste enjoys beauty, without being able to explain it,” Agamben writes.
He quotes Montesquieu: “This effect is principally founded on
surprise.” Algorithms are meant to provide surprise, showing us what we
didn’t realize we’d always wanted, and yet we are never quite surprised
because we know to expect it.
Philosophers in the 18th century defined taste as a moral
capacity, an ability to recognize truth and beauty. “Natural taste is
not a theoretical knowledge; it’s a quick and exquisite application of
rules which we do not even know,” wrote Montesquieu in an essay on taste published in 1759. This
unknowingness is important. We don’t calculate or measure if something
is tasteful to us; we simply feel it. Displacing the judgment of taste
onto algorithms, even partly, as in the Amazon Echo Look, robs us of some of
that humanity.
Every cultural object we aestheticize and consume — “the
most everyday choices of everyday life, e.g., in cooking, clothing or
decoration,” Pierre Bourdieu writes in his 1984 book
Distinction: A Social Critique of the Judgement of Taste
— is a significant part of our identities and reflects who we are.
“Taste classifies, and it classifies the classifier,” Bourdieu adds. If
our taste is dictated by data-fed algorithms controlled by massive tech
corporations, then we must be content to classify ourselves as slavish
followers of robots....