Friday, September 29, 2023

"How Will the Tech Titans Behind ChatGPT, Bard, and LLaMA Make Money?"

From Harvard Business School's Working Knowledge, September 19:

It seems like anything is possible with generative AI right now. But how will companies profit from those big ideas? Andy Wu breaks down the potentially painful tradeoffs that tech firms might face as artificial intelligence enters its next phase.

The dizzying explosion of generative artificial intelligence platforms has been the big business story of the past year, but how they’ll make money and how smart companies can use them wisely are the questions that will dominate the next 12 months.

“Students and executives are no longer asking whether we should adopt AI—but rather, when and how to do so,” says Andy Wu, the Arjun and Minoo Melwani Family Associate Professor of Business Administration at Harvard Business School.

Wu’s recent case study and background note, AI Wars and the Generative AI Value Chain, offer a crash course in ChatGPT, Bard, and other AI chatbots—as well as the dueling tech titans behind them—and probe the strategic dilemmas ahead for innovators and users. The public's fascination with the human-like aspects of chatbots may be overshadowing more fundamental questions about how companies can profit from AI, Wu says.

“I think the basic economics of generative AI are being overlooked.”

In an interview, Wu discusses the challenging economics of AI, how business models are likely to differ from traditional software models, and some of the potentially painful tradeoffs ahead for companies such as Google, Microsoft, and others. Wu collaborated on the case study with HBS research associate Matt Higgins; HBS doctoral student Miaomiao Zhang; and Massachusetts Institute of Technology doctoral student Hang Jiang.

Ben Rand: What did you find most surprising in preparing this case and why?

Andy Wu: I think the basic economics of generative AI are being overlooked. There are significant unanswered questions about how people will actually make money with this technology. Google, OpenAI, and others can't lose money in perpetuity, but it's not yet obvious to anyone exactly how this will be monetized. At minimum, I can tell you that we are going to need new business models: integrating generative AI is going to transform how we monetize software.

Rand: How so?

Wu: Our notions of fixed and variable costs are different here than they were for any other form of computing we've lived through. The key insight is that the variable cost of delivering generative AI to an end user is not zero, which means we can't necessarily hand out future software-as-a-service applications containing generative AI for free, or even as a flat-rate subscription without usage limits, as we are used to today. Usage-based pricing is going to be much more important.
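The pricing tension Wu describes can be made concrete with a little arithmetic. The sketch below is purely illustrative: the per-query inference cost, subscription price, and usage volumes are hypothetical numbers chosen for the example, not figures from the case study.

```python
# Illustrative sketch (hypothetical numbers): when each query has a nonzero
# variable cost, a flat subscription loses money on heavy users, which is
# the pressure that pushes generative AI toward usage-based pricing.

def monthly_margin(flat_price, cost_per_query, queries):
    """Margin on one subscriber under a flat monthly subscription."""
    return flat_price - cost_per_query * queries

def usage_margin(price_per_query, cost_per_query, queries):
    """Margin on one customer under usage-based pricing."""
    return (price_per_query - cost_per_query) * queries

COST_PER_QUERY = 0.02   # assumed inference cost per query (GPU time, etc.)
FLAT_PRICE = 20.00      # assumed flat monthly subscription price

light = monthly_margin(FLAT_PRICE, COST_PER_QUERY, 100)     # ≈ 18.0 (profitable)
heavy = monthly_margin(FLAT_PRICE, COST_PER_QUERY, 2000)    # ≈ -20.0 (a loss)

# Priced per query instead, margin scales with consumption:
heavy_usage = usage_margin(0.03, COST_PER_QUERY, 2000)      # ≈ 20.0
```

Contrast this with traditional software, where the marginal cost of serving one more user is close to zero and flat subscriptions work regardless of usage intensity.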

A second distinction is that a significant portion of the core technology is open source, and much of the data used to train these models, while sometimes copyrighted, is publicly available online. The barriers to entry for AI are not as high as they may seem, so many companies will be in the game, at least for specific vertical AI models and applications....