Wednesday, November 28, 2018

"When Tech Loves Its Fiercest Critics, Buyer Beware"

From John Battelle's SearchBlog, November 12:

A year and a half ago I reviewed Yuval Noah Harari’s Homo Deus, recommending it to the entire industry with this subhead: “No one in tech is talking about Homo Deus. We most certainly should be.”
Eighteen months later, Harari is finally having his technology industry moment. The author of a trio of increasingly disturbing books – Sapiens, which made his name as a popular historian-philosopher, the aforementioned Homo Deus, which introduced a dark strain of tech futurism to his work, and the recent 21 Lessons for the 21st Century – Harari has cemented his place in the Valley as tech’s favorite self-flagellant. So it’s only fitting that this weekend Harari was the subject of a New York Times profile featuring this provocative title: Tech C.E.O.s Are in Love With Their Principal Doomsayer. The subhead continues: “The futurist philosopher Yuval Noah Harari thinks Silicon Valley is an engine of dystopian ruin. So why do the digital elite adore him so?”
Well, I’m not sure if I qualify as one of those elites, but I have a theory, one that wasn’t quite raised in the Times’ otherwise compelling profile. I’ve been a student of Harari’s work, and if there’s one clear message, it’s this: We’re running headlong into a world controlled by a tiny elite of superhumans, masters of new technologies that the “useless class” will never understand. “Homo sapiens is an obsolete algorithm,” Harari writes in Homo Deus. A new religion of Dataism will transcend our current obsession with ourselves, and we will “dissolve within the data torrent like a clump of earth within a gushing river.” In other words, we humans are f*cked, save for a few of the lucky ones who manage to transcend their fate and become masters of the machines. “Silicon Valley is creating a tiny ruling class,” the Times writes, paraphrasing Harari’s work, “and a teeming, furious ‘useless class.’”
So here’s why I think the Valley loves Harari: We all believe we’ll be members of that tiny ruling class. It’s an indefensible, mathematically impossible belief, but as Harari reminds us in 21 Lessons, “never underestimate human stupidity.” Put another way, we are fooling ourselves, content to imagine we’ll somehow all earn a ticket into (or onto) whatever apocalypse-dodging exit plan Musk, Page or Bezos might dream up (they’re all obsessed with leaving the planet, after all). Believing that impossible fiction is certainly a lot easier than doing the quotidian work of actually fixing the problems that lie before us. Better to be one of the winners than to risk losing along with the rest of the useless class, no?...