I had a similar reaction: "What's with the 'tens of billions'?" when I saw the news yesterday, probably because I had just read Carl Sagan swear he never said "billions and billions."
From The Register, October 23:
But AWS is still the AI upstart's primary partner
Google and Anthropic have struck a deal that will see the AI upstart gain access to up to a million of the web giant’s tensor processing units (TPUs) and involve “tens of billions of dollars.”
The two companies announced the deal on Thursday, with Anthropic pitching it as “expanded capacity” that the company will use to meet surging customer demand and allow it to conduct “more thorough testing, alignment research, and responsible deployment at scale.”
Google’s take on the deal is that it will enable Anthropic to “train and serve the next generations of Claude models,” and involves “additional Google Cloud services, which will empower its research and development teams with leading AI-optimized infrastructure for years to come.”
The search and ads giant claims the deal “represents the largest expansion of Anthropic's TPU usage to date” and says the AI upstart “chose TPUs due to their price-performance and efficiency, and the company's existing experience in training and serving its models with TPUs.”
Anthropic’s announcement points out it’s betting on companies other than Google.
“Anthropic’s unique compute strategy focuses on a diversified approach that efficiently uses three chip platforms – Google’s TPUs, Amazon’s Trainium, and Nvidia’s GPUs,” the statement explains. “This multi-platform approach ensures we can continue advancing Claude's capabilities while maintaining strong partnerships across the industry.”
Anthropic also stated: “We remain committed to our partnership with Amazon, our primary training partner and cloud provider, and continue to work with the company on Project Rainier, a massive compute cluster with hundreds of thousands of AI chips across multiple U.S. data centers.”
AI infrastructure announcements nearly always feature enormous numbers and breathless enthusiasm about how the tech will improve everything … but little detail about how players will pay for, and profit from, the billions spent on giant chip farms....
....MUCH MORE, including four 'More Context' links.
Recently - OpenAI turns to Google's AI chips to power its products, source says (GOOG; NVDA)
The Google product has been a long time coming. Some previous posts:
June 2016 - Machine Learning: JP Morgan Compares Google's New Chip With NVIDIA's (GOOG; NVDA)
April 2017 - Watch Out NVIDIA: "Google Details Tensor Chip Powers" (GOOG; NVDA)
May 2018 - Ahead of Today's NVIDIA Earnings: A Look at One of the Competitors (GOOG; NVDA)
We'll have much more on Google next week but for today the next-gen Tensor Processing Unit...
June 2023 - "Elon Musk Predicts Nvidia’s Monopoly in A.I. Chips Won’t Last" (NVDA; TSLA)
September 2023 - Chips: "Google TPU v5e AI Chip Debuts after Controversial Origins"
July 2024 - Apple Trained Its Large Language Model On Broadcom-Designed TPUs In Google's Cloud, Meh (AVGO, GOOG, AAPL)
And many, many more, use the 'search blog' box, upper left, if interested.
Google, of course, has its own AI model entrants, Gemini among others, and so is not giving Mr. Altman its best chips.
And Professor Sagan?
I never said it. Honest. Oh, I said there are maybe 100 billion galaxies and 10 billion trillion stars. It’s hard to talk about the Cosmos without using big numbers. I said "billion" many times on the Cosmos television series, which was seen by a great many people. But I never said "billions and billions." For one thing, it’s too imprecise. How many billions are "billions and billions"? A few billion? Twenty billion? A hundred billion? "Billions and billions" is pretty vague. When we reconfigured and updated the series, I checked—and sure enough, I never said it.
But Johnny Carson—on whose Tonight Show I'd appeared almost thirty times over the years—said it.
He'd dress up in a corduroy jacket, a turtleneck sweater, and something like a mop for a wig. He had created a rough imitation of me, a kind of Doppelganger, that went around saying "billions and billions" on late-night television. It used to bother me a little to have some simulacrum of my persona wandering off on its own, saying things that friends and colleagues would report to me the next morning. (Despite the disguise, Carson—a serious amateur astronomer—would often make my imitation talk real science.)
Astonishingly, "billions and billions" stuck. People liked the sound of it. Even today, I'm stopped on the street or on an airplane or at a party and asked, a little shyly, if I wouldn't—just for them—say "billions and billions."
"You know, I didn't actually say it," I tell them.
"It's okay," they reply. "Say it anyway."
I'm told that Sherlock Holmes never said, "Elementary, my dear Watson" (at least in the Arthur Conan Doyle books); Jimmy Cagney never said, "You dirty rat"; and Humphrey Bogart never said, "Play it again, Sam." But they might as well have, because these apocrypha have firmly insinuated themselves into popular culture.
I'm still quoted as uttering this simple-minded phrase in computer magazines ("As Carl Sagan would say, it takes billions and billions of bytes"), newspaper economics primers, discussions of players' salaries in professional sports, and the like.
For a while, out of childish pique, I wouldn't utter or write the phrase, even when asked to. But I've gotten over that. So, for the record, here goes: "Billions and billions."....