Thursday, August 22, 2024

"America Needs Intel Economically and Politically—But Is It Too Late?" (INTC)

From the Institute for New Economic Thinking, August 12:

With Pat Gelsinger at the helm, Intel’s fate will be decided by whether it can revive its innovation-driven legacy or remain a cautionary tale of financial mismanagement.

Intel’s story is a textbook case of how chasing short-term gains can lead to long-term disaster. Once a titan of innovation, Intel now struggles under the weight of its own financial myopia. As the company faces a major overhaul, let’s assess its troubled trajectory and why America’s future may well hinge on its success.

Riding a Revolution

When the Intel Corporation was founded on July 18, 1968, by Robert Noyce and Gordon Moore, most people had never seen a computer; computers were bulky, expensive machines confined to research labs, universities, and large corporations.

The semiconductor revolution was about to change that.

In the Cold War era, semiconductors—crucial for controlling electrical currents—were evolving fast. The breakthrough came in 1947, when John Bardeen, Walter Brattain, and William Shockley invented the transistor, which replaced bulky vacuum tubes with compact solid-state electronics. (Shockley, despite his Nobel, became infamous for his toxic views on race and eugenics.)

By the late 1950s, Jack Kilby and Noyce, working separately, pushed the field further with the integrated circuit, which packed multiple components onto a single chip. Intel cofounders Noyce and Moore—of Moore's Law fame—set their sights on making electronics ever smaller, cheaper, and faster. Their work led to explosive advances in tech.

Intel’s big moment came in 1971 with the Intel 4004 microprocessor, a revolutionary chip that put a CPU (Central Processing Unit) onto a single piece of silicon. At first, many thought it was impossible – a whole computer on a chip? But the 4004 shattered old views. In November 1971, Popular Electronics hailed it as a “giant leap forward in computing.” This advance supercharged the power and efficiency of electronic devices, eventually catapulting Intel from a startup into a powerhouse of the tech world.

Intel’s breakthrough didn’t just change how people interact with technology—it kicked off the era of affordable computing. The 4004’s influence went far beyond its debut, sparking innovations that made powerful, budget-friendly tech accessible to the masses.

So far, so good. Intel was doing what we want a company to do: making things people need at prices they can afford.

In the 1980s, Intel solidified its tech prowess with a string of crucial innovations. The 8086 and 8088 microprocessors became the backbone of the personal computer boom, especially with IBM’s PC. Intel then ramped up with the 80286 and 80386 chips, dramatically boosting computing power and performance. Their strategic investment in semiconductor tech and microprocessor leadership made Intel the go-to for PC makers and paved the way for its long-term industry dominance.

Up to this point, Intel’s business model was focused on using profits for innovation and reinvesting in R&D. Situated in Silicon Valley’s hotbed of entrepreneurship and free-market zeal, Intel saw itself as a torchbearer of technological progress and business acumen. The company poured money into new products and fine-tuning manufacturing processes, and it paid off. By running its own manufacturing and prioritizing technological advancement over quick profits, Intel didn’t just participate in the tech boom—it helped to define it.

Then something began to change. The curtain was rising on an age where greed took center stage.

From Tech Dreams to Wall Street Schemes

In the late 1990s and early 2000s, Intel stumbled as America’s obsession with financialization and shareholder value took over—a trend studied by economist and business historian William Lazonick. The new philosophy: companies were defined not by their products and leadership, but by how much they could enrich shareholders – and how fast they could do it.

Big companies, driven by short-term profits, mirrored this mindset. Intel, once a tech leader, shifted focus from cutting-edge R&D to boosting shareholder returns. The name of the game was stock buybacks — basically shoveling cash to shareholders by manipulating the company's stock price. The impact on the quality of Intel's products became glaringly apparent with its NetBurst architecture, launched with the Pentium 4 in 2000, which promised high clock speeds but delivered overheating and performance issues. It was a flop.

Things began to go downhill because, as Lazonick’s work on shareholder value ideology has repeatedly shown, financialization and innovation do not go well together.

“From 2001-2020, Intel blew $128 billion on buybacks (64% of net income) on top of paying out $68 billion as dividends (35% of net income),” notes Lazonick. That’s money that couldn’t go into innovation, retaining and training employees, R&D, and other critical areas.
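Taken together, the quoted figures imply that roughly 99% of Intel's net income over 2001-2020 went straight back out to shareholders. A minimal back-of-envelope sketch in Python, using only the numbers in the quote above (the implied net-income total is derived from those percentages, not a reported figure):

    # Back-of-envelope check of the Lazonick figures quoted above (2001-2020).
    # Inputs come straight from the quote; the net-income total is implied, not reported.
    buybacks_bn = 128        # $128 billion spent on buybacks
    buyback_share = 0.64     # buybacks as 64% of net income
    dividends_bn = 68        # $68 billion paid as dividends
    dividend_share = 0.35    # dividends as 35% of net income

    implied_net_income_bn = buybacks_bn / buyback_share   # ~$200 billion
    total_distributed_bn = buybacks_bn + dividends_bn     # $196 billion
    payout_ratio = buyback_share + dividend_share         # 0.99, i.e. ~99% of net income

    print(f"Implied net income, 2001-2020: ~${implied_net_income_bn:.0f} billion")
    print(f"Distributed to shareholders: ${total_distributed_bn} billion "
          f"({payout_ratio:.0%} of net income)")

On the quoted figures, barely one cent of every dollar of profit was left over for reinvestment, which is the article's central complaint.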

With executives and shareholders set on squeezing every last dollar from the company for themselves, the idea of putting Intel’s long-term health at the forefront of decision-making became a quaint relic of the past—like those old room-sized mainframe computers....

....MUCH MORE

Our most recent visit to INET was with the good Professor himself:

Professor Lazonick on Tesla and General Motors (GM; TSLA)

Possibly also of interest:

My Little Crony: Intel, The Buyback Scam And $19.5 Billion From The Chips Act (INTC)

"Managing Decline: The Economy of Value Extraction" 

"With CHIPS Act, US Risks Building a White Elephant" (incentives matter)"Vampires at the Gate? (Finance and Slow Growth)"

  "Share Buybacks and the Contradictions of 'Shareholder Capitalism'”

 Taibbi: "The S.E.C. Rule That Destroyed The Universe" 

 Harvard Business Review Announces "The Best Management Article Of 2014" (it's by Lazonik)