Intel appears in startup and business literature primarily through the lens of one decision: the decision Andy Grove and Gordon Moore made in 1985 to abandon the dynamic random-access memory (DRAM) market — the business Intel had essentially invented — and bet the entire company on microprocessors. That decision, and the mental model Grove used to arrive at it, is one of the most analyzed strategic pivots in business history.
Clayton Christensen uses Intel extensively in The Innovator's Dilemma as a rare example of a large incumbent company that successfully disrupted itself. Most of Christensen's book is an argument about why incumbents fail to respond to disruption — why they are structurally unable to make the choices that would allow them to survive. Intel is the counterexample. This makes Intel's story particularly valuable for startup founders: it is a case study not just in the problem of disruption but in one of the few proven methods for solving it.
Intel Corporation was founded in July 1968 by Gordon Moore and Robert Noyce, two of the most technically accomplished people in semiconductor history. Moore had co-founded Fairchild Semiconductor and had published the 1965 paper that would become Moore's Law — the observation that the number of transistors that can be fit economically on a chip doubles at a regular interval (originally every year, revised by Moore in 1975 to roughly every two years). Noyce had co-invented the integrated circuit. When they left Fairchild to start Intel, they brought with them Andy Grove, a Hungarian-born engineer who had fled Hungary at age twenty after Soviet forces crushed the 1956 revolution, and who would eventually become Intel's most consequential CEO.
Intel's early business was memory chips — specifically DRAM, a technology for storing data that required constant electrical refresh to maintain its contents. Intel made DRAM; it made static RAM (SRAM); it made erasable programmable read-only memory (EPROM). Memory was Intel's identity. The company's first products were memory chips, its revenues came from memory chips, and its culture was defined by the technical challenges of designing and manufacturing memory chips at ever-increasing densities.
In 1971, an Intel engineer named Ted Hoff conceived the architecture of the Intel 4004 — the world's first commercially available microprocessor, a chip that put the core functions of a computer onto a single piece of silicon (Federico Faggin led the silicon design). This was a genuinely radical invention, but Intel did not, at the time, recognize it as its primary business. The 4004 was developed for a Japanese calculator company, Busicom, which had commissioned Intel to build a set of specialized chips for its product. Hoff's insight was to make the chip programmable — to design it so that the same chip could perform different functions depending on the software instructions it received. Intel negotiated back the rights to the chip design and launched it commercially, but memory remained the company's core focus.
The Intel 8086, launched in 1978, established the x86 architecture that would eventually become the standard for personal computing worldwide. IBM chose the 8088, a variant of the 8086 with an 8-bit external data bus, for its original personal computer in 1981, a decision that had profound consequences for both companies and for the industry. But again, in Intel's internal accounting, microprocessors remained secondary to memory through the early 1980s.
The DRAM crisis arrived between 1982 and 1985. Japanese semiconductor manufacturers — Fujitsu, Hitachi, NEC, Toshiba — had invested heavily in DRAM production capacity with substantial government support. Their manufacturing quality was higher and their prices were lower than Intel's. Intel's DRAM margins collapsed. By 1985, Intel was losing money, its market share in DRAM had fallen from 83% in 1974 to negligible levels, and the business case for remaining in the DRAM market had disappeared.
What happened next is the heart of the Intel story.
Christensen uses Intel in The Innovator's Dilemma for two distinct purposes: as an example of a company that was itself disrupted from below (by Japanese DRAM manufacturers who competed on price and volume rather than technical sophistication), and as a rare example of an incumbent that successfully navigated a strategic inflection point by abandoning its core business.
The first part of the story — the DRAM disruption — fits Christensen's framework precisely. Japanese DRAM manufacturers entered the market at the low end, competing on products that were technically simpler but manufactured with greater consistency and at lower cost. This is the classic disruptive pattern: entrants who are worse on the dimensions that established players measure (design sophistication, custom specifications) but better on dimensions that turn out to matter more at scale (yield rates, price). Intel's engineers and managers, whose careers and identities were invested in the technical challenges of high-complexity memory design, were unable to match the process-manufacturing discipline that defined the Japanese approach. The business they had built was not designed to compete in the market the Japanese competitors were creating.
The second part — Grove's decision to exit DRAM — is the part Christensen finds most instructive, because it demonstrates that the outcome is not inevitable. Most incumbents, facing this scenario, do not exit. They fight for market share, make incremental improvements, lobby for trade protection, and gradually decline. Grove did something different.
The mental model Grove used is now famous. In his book Only the Paranoid Survive (1996), Grove describes sitting with Gordon Moore in his office in 1985, looking at Intel's collapsing DRAM business, and asking Moore a question: "If we got kicked out and the board brought in a new CEO, what do you think he would do?" Moore answered immediately: "He would get us out of memories." Grove looked at Moore and said: "Why shouldn't you and I walk out the door, come back in, and do it ourselves?"
This mental model — asking what a new CEO would do, then doing it yourself — is one of the most practically useful tools Christensen extracts from the Intel story. It solves a specific problem: the emotional and organizational inertia that prevents incumbent managers from making decisions that are obviously correct from the outside but feel impossible from the inside. If you are the person who built the DRAM business, who hired the DRAM engineers, who told the board for years that DRAM was Intel's future, the decision to abandon it requires you to admit that you were wrong and to destroy the work of your career. If you are a new CEO with none of that history, the decision is straightforward. Grove's insight was that you could access the new CEO's clarity of judgment without actually replacing yourself — by mentally stepping outside your own role and asking what an objective outsider would do.
Christensen also uses the Intel case to discuss the concept of the strategic inflection point — a term Grove coined and developed in Only the Paranoid Survive. A strategic inflection point is a moment when a fundamental change in the competitive environment makes the old strategy no longer viable. It is different from a normal business problem because it cannot be solved by doing the existing strategy better. It requires a different strategy. The DRAM crisis was not a problem Intel could have solved by making better DRAM chips. The Japanese manufacturers had a structural cost and quality advantage that Intel could not match in that business. The only viable response was to exit the business and redeploy resources into an area where Intel's specific capabilities — chip design sophistication, customer relationships with IBM and other PC makers — were genuinely valuable.
The microprocessor strategy worked. Intel's x86 chips became the standard for personal computers worldwide. The Intel Inside marketing campaign, launched in 1991, turned a component supplier into a consumer brand. The Pentium processor, launched in 1993, gave Intel a product line that dominated the PC market for two decades. By the time Grove stepped down as CEO in 1998, Intel was one of the most valuable companies in the world and the undisputed leader of the semiconductor industry.
Andy Grove's contribution to management thinking extends well beyond the DRAM pivot. He is responsible for two ideas that have become standard vocabulary in technology company management.
The first is OKRs — Objectives and Key Results. Grove developed the OKR system at Intel in the 1970s as a way of cascading the company's strategic objectives into measurable quarterly commitments at every level of the organization. The system has two components: Objectives (qualitative statements of what you are trying to achieve) and Key Results (quantitative measures of whether you have achieved it). The system was introduced to Google in 1999 by John Doerr, a former Intel employee who had learned it from Grove, and has since become the standard goal-setting framework across the technology industry.
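The two-part structure Grove gave the system — a qualitative Objective graded by quantitative Key Results — can be sketched as a small data model. This is an illustrative sketch only; the class names, scoring convention (grading each key result as a capped fraction of its target and averaging), and the example figures are invented, not Intel's actual practice.

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    description: str   # the quantitative measure
    target: float
    actual: float = 0.0

    def score(self) -> float:
        """Grade as the fraction of target achieved, capped at 1.0."""
        return min(self.actual / self.target, 1.0) if self.target else 0.0

@dataclass
class Objective:
    statement: str                                   # the qualitative goal
    key_results: list = field(default_factory=list)

    def score(self) -> float:
        """An objective's grade is the average of its key results' grades."""
        if not self.key_results:
            return 0.0
        return sum(kr.score() for kr in self.key_results) / len(self.key_results)

# Hypothetical quarterly OKR with invented figures, for illustration only.
obj = Objective("Establish the 386 as the volume PC processor")
obj.key_results.append(KeyResult("Design wins this quarter", target=20, actual=14))
obj.key_results.append(KeyResult("Units shipped (thousands)", target=500, actual=450))
print(round(obj.score(), 2))  # prints 0.8
```

The point of the structure is that the Objective alone is unverifiable rhetoric and the Key Results alone are context-free numbers; only the pairing makes a commitment both motivating and measurable.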
The second is the culture of "constructive confrontation." Grove believed that intellectual disagreement, expressed directly and argued vigorously, was essential to good decision-making. Intel's management culture under Grove was notably aggressive by the standards of most companies: engineers and managers were expected to challenge each other's conclusions, to argue for their positions with data, and to accept defeat when the evidence demanded it. This culture was uncomfortable for many people and led to high turnover among those who preferred consensus. But it also produced remarkably rigorous decision-making in a technical domain where the cost of bad decisions was very high.
Grove is also associated with the phrase "Only the paranoid survive" — his argument that the appropriate default posture for a technology company is one of mild paranoia about competitive threats. Complacency, in Grove's view, is always the greater danger. The company that assumes its current position is secure will fail to notice the strategic inflection point until it is too late. The company that treats every potential threat with serious attention will sometimes overreact to non-threats, but will also catch the real ones.
Recognize a strategic inflection point before it destroys you. The DRAM crisis built gradually over several years. Intel had warning signs — falling margins, rising Japanese market share, deteriorating yields relative to competitors — that were visible before the crisis became acute. The problem was not that the data was unavailable; it was that the data was interpreted through the lens of an organization that was emotionally committed to the existing strategy. Grove's lesson is that strategic inflection points feel like ordinary business problems until they don't, and by the time they clearly don't, it may be too late.
Use the "new CEO" mental model to make hard decisions. When you are too close to a decision to see it clearly, ask what a rational outsider with no sunk costs would do. This is not a permission slip to be reckless with assets you have built — it is a tool for distinguishing between decisions that are genuinely difficult and decisions that only feel difficult because of your emotional investment in the status quo.
Cannibalize your own business before a competitor does. Intel's exit from DRAM destroyed enormous internal value — DRAM engineers lost their roles, DRAM production lines were shut down, years of investment were written off. But the alternative was to let Japanese manufacturers destroy that value externally, while also consuming the resources Intel needed to compete in microprocessors. The rule of thumb Grove derived from this experience: if someone is going to make your current business obsolete, make sure it's you.
OKRs are a tool for organizational alignment, not just measurement. The value of OKRs at Intel was not that they measured performance accurately — any measurement system can do that. It was that they forced every team and every individual to connect their work explicitly to the company's strategic priorities. When priorities changed, OKRs changed, and every team updated its commitments accordingly. In a company navigating a strategic pivot, this alignment is critical.
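The alignment mechanism described above — every team's objective explicitly naming the company priority it supports, so a pivot at the top immediately exposes stale commitments below — can be sketched as a simple consistency check. The team names, priority identifiers, and objectives here are invented for illustration; this is a hypothetical sketch of the cascading idea, not a description of Intel's actual process.

```python
# Invented company-level priorities after a hypothetical pivot.
company_objectives = {"exit-dram", "win-microprocessors"}

# Each team OKR records which company priority it supports.
team_okrs = [
    {"team": "Fab 3", "supports": "win-microprocessors",
     "objective": "Convert DRAM lines to 80386 production"},
    {"team": "Memory design", "supports": "improve-dram-density",  # now orphaned
     "objective": "Tape out next-generation DRAM"},
]

def unaligned(okrs, priorities):
    """Return team OKRs whose parent objective no longer exists at company level."""
    return [o for o in okrs if o["supports"] not in priorities]

for o in unaligned(team_okrs, company_objectives):
    print(f'{o["team"]}: "{o["objective"]}" no longer maps to a company priority')
```

The check is trivial, which is the point: because every commitment carries an explicit link upward, a strategy change propagates as a visible list of orphaned OKRs rather than as quiet, continued work on the old plan.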
Paranoia is a management asset when directed correctly. Grove's paranoia was not diffuse anxiety — it was directed attention to competitive threats. He maintained systematic processes for identifying potential strategic inflection points, sought out employees who had direct contact with customers and competitors, and treated bad news as valuable information rather than something to be suppressed. The paranoia that kills companies is the paranoia about internal politics and status. The paranoia that protects them is the paranoia about external threats.
What is a strategic inflection point in Andy Grove's definition? Grove defined a strategic inflection point as a moment when the fundamental forces governing a business change so profoundly that the old strategy no longer works. It is characterized by a tenfold change in at least one of the key competitive factors — the strength of a new entrant, the pace of technological change, or the preferences of customers. The DRAM crisis was a strategic inflection point for Intel because Japanese manufacturers had achieved a tenfold improvement in manufacturing cost efficiency relative to Intel.
Did Intel ever return to the memory business? Intel never returned to DRAM as a significant business. It has manufactured various forms of specialized memory over the years, including NAND flash and 3D XPoint memory (branded as Optane), but these have been niche products rather than core business lines. Intel's identity and competitive position have been defined by microprocessors since the mid-1980s.
What happened to Intel after Grove stepped down? Intel under subsequent CEOs Craig Barrett, Paul Otellini, Brian Krzanich, and Bob Swan struggled to maintain the dominance Grove had established. The company missed the mobile chip market, ceded significant share in data center processors to AMD, and fell behind TSMC in manufacturing process technology. In 2021, Pat Gelsinger returned to Intel as CEO (he had been an Intel engineer under Grove) and launched IDM 2.0, a strategy to rebuild Intel's manufacturing capabilities and compete with TSMC and Samsung as a contract chip manufacturer.
Where did OKRs come from before Google? OKRs were developed by Andy Grove at Intel in the 1970s, building on the Management By Objectives (MBO) framework developed by Peter Drucker. Grove simplified and operationalized Drucker's system, adding the Key Results component that made objectives measurable. John Doerr, who joined Intel in 1974 and worked directly with Grove, later introduced OKRs to Google in 1999 and wrote the definitive modern account of the system in his book Measure What Matters (2018).