This is the fourth entry in a multipart series. You may be interested to read about Shockley Semiconductor, Fairchild Semiconductor, and the start of Intel.
At the end of 1974, Intel had announced plans for a reorganization following layoffs, and that reorganization was carried out between the announcement and the first months of 1975. The company was starting the year with no debts, clear ownership of their buildings and equipment, a decent cash reserve, a large line of credit, and competent leadership. Robert Noyce was the chairman, Arthur Rock vice chairman, Gordon Moore president and CEO, and Andy Grove executive vice president. The man who had previously made Intel’s microprocessors a success, Federico Faggin, had left to found a competing company whose primary focus was microprocessors, with chips intended to be compatible with the popular Intel 8080. 1974 also saw the company open up its design center in Haifa, Israel, and its assembly center in Manila, Philippines.
For 1975, the company increased its R&D spend by 39%, decreased its unit costs with higher production volumes, and was barely able to sell enough to keep up revenues. The company also increased its marketing and administrative costs. The main driver of revenue increases came in the fourth quarter with 4K DRAM chips, particularly the 2104. It shipped in a standard 16-pin package with an access time of 350ns and a cycle time of 500ns.
Intel’s location in Penang had many attributes that Andy Grove found favorable when he chose it as Intel’s first international location in 1972. First, the geography played a role. The spot, a former rice field on an island in the Strait of Malacca, was easy to access, had a rather stable temperature all year, and had a low earthquake risk (for that region anyway); the biggest apparent risk was flooding during monsoons. Surely, the Malaysian government’s openness to outside investment didn’t hurt Grove’s decision at all, but the island had plenty of people so Grove didn’t foresee any staffing issues either. At about 09:00 on the 1st of May in 1975, a shorted lighting circuit started a fire in the assembly shop, and within an hour the entire factory was conflagrant. Only the cafeteria survived; of the rest, not even the steel framing remained. Yet, God was merciful. The day of the fire, the plant was closed, and no one was physically harmed. The fire did, however, hurt Intel. The cost of the fire to the company was $2.5 million, and without the location operating Intel struggled to fulfill all of its obligations. The site manager, Ken Thompson (not the UNIX Ken Thompson), and the production director, Gene Flath, worked overtime to try to restore the location. According to Flath, work at the site during the rebuild was underway 22 hours every day. In an effort to keep Intel’s overall production steady, five other locations began phasing in an overnight shift (as opposed to just day and evening shifts). To me, this is an absolutely amazing achievement. This sudden and massive logistics change would be incredibly difficult to manage for any company. The Penang location would resume normal operations in 1976 thanks to the dedication of the team there.
With the success of the Intel 8080 continuing, the company wanted to produce something that would compete with mainframe systems of the time. This task was initially undertaken by Fred Pollack, but it had considerable support from John Doerr, Dave Best, and Casey Powell. Initially called the 8800, this new architecture was meant to support data abstraction, typing, and capability-based addressing in silicon. The idea was to unify the hardware and software of the system around a single set of concepts, objects, with a language similar to ALGOL replacing the assembly languages with which we are more familiar today. From what I can tell, the development of the 8800 shifted to the management of Bill Lattin with Justin Rattner as the lead engineer before the project changed its name to Intel Advanced Performance Architecture 432, or iAPX 432. This was likely not long before the project moved from Santa Clara to Portland in March of 1977. If anyone has more clarifying information on this early part, please let me know. The physical structure of the 432 was realized in three separate chips: the 43201, 43202, and 43203. The 43201 handled fetch and decode while the 43202 handled execution, and the 43203 was the interface processor. These were all in 64-pin quad in-line packages, and they were manufactured with HMOS. Each managed to consume less than 2.5 watts from a single 5 volt supply. As the 32 in 432 would seem to suggest, these were 32bit parts. During these early days of development, the goal was 10MHz.
Intel’s revenues for 1975 stood at $136,788,000, an increase of 1.7%, but profits fell 17.7% to $16,275,000. Most of the profits really came in the last two quarters.
The year of 1976 was an interesting one at Intel. The company’s memory products were continuing to do well in the market, with the 16K DRAM chip, the 2116, being the company’s star at the time. For those needing speed over quantity, the 4K 2147 SRAM was available, offering access and cycle times of just 70ns. The company’s various controllers, ROMs, and PROMs were also continuing to sell. While Intel didn’t consider itself a processor company, the 8080 was an important product for them. Not only did it sell, but having the most popular CPU in their stable gave the company a certain prestige.
Intel was aware of Faggin’s new company, Zilog, and when the first Z80 samples came back on the 6th of March and the launch was set for May, Intel realized they had a bit of a problem. The iAPX 432 wasn’t ready. As the most complex CISC design yet conceived, it was seeing significant delays and roadblocks. Intel needed something that was compatible with the 8080 but capable of higher performance than either the 8080 or the reportedly faster Z80. This product was intended to keep Intel relevant in the CPU industry until such time as the 432 was ready. This new microprocessor was the 8086, and development began in May of 1976. When Steve Morse began the design, he had two primary goals. The first was to increase processing throughput, and the second was to maintain some compatibility with the 8080. For performance, the 8086 would be 16bit and support more memory (among other changes). To address the compatibility aim, he included the 8080’s register set and instruction set as logical subsets of the 8086’s registers and instructions. While a program would need to be translated and reassembled, this wasn’t as rough as a complete rewrite. Today, many people do not view compatibility at this level to be all too important. Modern machines are powerful enough that dynamic recompilers and translators are often sufficient for compatibility, and much software is delivered in languages that aren’t machine dependent. Even more, many of the programs relied upon today are web-based where the browser is more important than the underlying machine. In the 1970s, none of this was the case. Hardware and software were both expensive, and programmers’ time was quite expensive as well. The 8080 had created a standard that allowed for an economy of scale, and this was an advantage that Intel didn’t want to sacrifice. Even their primary competitor, Zilog, was capitalizing on that existing market, and their next major competitor, Motorola, wasn’t yet a major threat. Despite the heavy focus on compatibility, the 8086 wasn’t simply a 16bit version of the 8080. Conditional call and return instructions were removed, conditional jumps were expanded, and the RST instructions inherited from the 8008 were replaced with interrupt vectors. The 8086 then added support for signed integers, base plus offset addressing, memory segmentation allowing up to 1MB of memory, and self-repeating operations. With the initial architecture design ready, Morse was assisted with revisions by Bruce Ravenel. The hardware and logic design were handled by Jim McKevitt and John Bayliss. Bill Pohlman was the engineering manager, and the product manager was Jeff Katz.
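To make that 1MB figure concrete: the 8086 forms a 20bit physical address by shifting a 16bit segment register left four bits and adding a 16bit offset. The sketch below is my own illustration of the arithmetic, not Intel code; the function name and example values are invented.

```c
#include <stdint.h>
#include <stdio.h>

/* 8086-style address calculation: a 16bit segment value is shifted left
 * by 4 bits and added to a 16bit offset, producing a 20bit physical
 * address. With 20 address lines, the reachable space is 2^20 = 1MB. */
static uint32_t physical_address(uint16_t segment, uint16_t offset)
{
    return (((uint32_t)segment << 4) + offset) & 0xFFFFF; /* keep 20 bits */
}

int main(void)
{
    /* Example values chosen purely for illustration. */
    printf("%05X\n", (unsigned)physical_address(0x1234, 0x0010)); /* 12350 */
    printf("%05X\n", (unsigned)physical_address(0xFFFF, 0x0010)); /* wraps to 00000 */
    return 0;
}
```

Note how the second example wraps around: with only 20 address lines there is nowhere past the 1MB mark for the sum to go.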
Following on the Intellec series, Intel offered the industry’s first single-board computer (SBC), the iSBC 80/10, in 1976. The goal with this product was to offer OEMs the most cost-effective solution possible. The board included an 8080A CPU, 1K RAM, sockets for 4K ROM or PROM, 48 programmable I/O lines, an RS232 interface, and bus drivers for both memory and I/O expansion.
In March of 1976, Intel released the 8085. This was an 8bit microprocessor made of 6500 transistors on a 3 micron process using a single 5 volt supply. It was compatible with the 8080, ran at 3MHz, and came in a 40-pin DIP. It could address 64K RAM, and required just three ICs to make a complete system (8085 CPU, 8155 RAM, 8355 ROM) as it incorporated the clock generator and system controller.
Late in the year, Intel announced the MCS-48 series of microcontrollers, ranging from the 8020 at the low end to the 8749 at the high end. These were designed primarily by David Stamm and Henry Blume Jr with DTL and TTL components on a breadboard, and with pen and drafting paper. As a demo, the two used copper and zinc strips and some citrus fruits (lemon or orange) to create a battery to power an 8748, making the demonstration a bit more interesting than it would have been otherwise. The first three models were the 8035, 8048, and 8748. These chips provided 64 to 256 bytes of RAM, internal ROM (optional), and I/O with its own address space. With those modest specifications, it will be no surprise that at the chips’ maximum clock of 11MHz they achieved just 0.73 MIPS. That figure is also overly optimistic as about a third of the available instructions required two cycles, and common performance was thus closer to 500,000 instructions per second. Of the entire line, the 8048 (1K ROM, 64 bytes RAM, 27 lines I/O) and 8748 (1K PROM, 64 bytes RAM, 27 lines I/O) are the most famous with the 8048 showing up in the Odyssey2, Korg Trident, Korg Poly-61, Roland Jupiter-4, and Roland ProMars. The Sinclair QL made use of an 8049 (2K ROM, 128 bytes RAM, 27 lines I/O) for keyboard, joystick, RS-232, and audio output. Many more machines would use an MCS-48 for various functions. Microcontrollers such as these weren’t the most glamorous products in the industry, but they radically changed the world. These devices enabled automation in places that wouldn’t have been economically viable in the past, and they also brought down the cost of computers and electronics more generally. Instead of bespoke circuits with large boards, a single chip with a program in ROM could be used. Design times were shorter, material costs lower, and reliability far greater.
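For the curious, those throughput figures can be reproduced with some quick arithmetic. The sketch below assumes the 8048’s machine cycle of 15 clock periods; treating the share of two-cycle instructions as roughly one third is my simplification of the instruction mix, used only for illustration.

```c
#include <stdio.h>

/* Back-of-the-envelope check on the MCS-48 throughput figures,
 * assuming one machine cycle = 15 clock periods on the 8048. */
int main(void)
{
    const double clock_hz = 11e6;          /* maximum rated clock */
    const double clocks_per_cycle = 15.0;  /* one machine cycle = 15 clocks */
    const double peak_ips = clock_hz / clocks_per_cycle;   /* ~733,000, i.e. 0.73 MIPS */

    /* Assumption: roughly a third of executed instructions take two
     * machine cycles, so the average instruction costs 4/3 cycles. */
    const double two_cycle_fraction = 1.0 / 3.0;
    const double avg_cycles = 1.0 + two_cycle_fraction;
    const double typical_ips = peak_ips / avg_cycles;      /* ~550,000 */

    printf("peak:    %.0f instructions/s\n", peak_ips);
    printf("typical: %.0f instructions/s\n", typical_ips);
    return 0;
}
```

That lands right around the 0.73 MIPS peak and the roughly 500,000 instructions per second that real programs saw.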
The Intel annual report for 1976 starts with an interesting note: Intel shipped individual devices that year each containing more transistors than were in use globally during the twenty years before Intel’s founding. That is, were one to count every transistor in use anywhere in the world from 1948 to 1968, a single Intel product (such as the iSBC 80/10) contained more. This is quite the flex for their production prowess. The importance of this year for the industry couldn’t have been known at the time, but it was an amazing year for this particular Fairchild spin-off. The recession ended, the organization’s head count grew to 7,347, the company invested $32.1 million in buildings and equipment, and they still managed to increase their cash holdings. Revenues for the year were $225,979,000 (almost double their total assets) with profits of $25,214,000.
Intel’s 4K RAMs continued strong sales in 1977, though yields for their 16K RAMs couldn’t meet demand. EPROMs saw expanded demand and sales that year, and Intel introduced a 16K EPROM (2716). A major advancement for the industry came in the Intel 2910. This was a single chip codec that allowed for multiple simultaneous transmissions on a single phone line rather than a dedicated line for every call. On top of these products, the company invested even more money in their facilities and equipment at $45 million, and the company organized itself into five distinct and self-contained divisions, each of which had its own development, engineering, marketing, and specialized manufacturing capacities. Naturally, where common requirements existed (semiconductor fabrication, packaging) those were still shared at common fab and packaging sites. The company also expanded geographically once again. In the company’s first nine years of operation, manufacturing and sales were the only activities that took place outside the San Francisco Bay Area (with the exception of R&D at Haifa). This changed with the addition of engineering and marketing in Portland. Intel closed the year with income of $31,716,000 on revenues of $282,549,000.
The biggest event of 1978 at Intel was the launch of the 8086 microprocessor on the 8th of June. A little over two years after work on the chip began, it made its way into the world. As noted previously, this was Intel’s 16bit processor. Not mentioned previously, this was the first of theirs to use microcode. Then, there’s that 1MB of maximum RAM enabled by segmentation. To make this work efficiently, each memory access involved the addition of a segment register and an offset register by a dedicated adder to keep the ALU free. The ALU didn’t do hardware multiplication or division, but it did adds and subtracts, boolean logic, and shifts. The chip also introduced instruction prefetching, where instructions were fetched from memory before they were actually needed. This was implemented in the bus interface unit. The first 8086 CPUs comprised 29,000 transistors on a 3 micron HMOS process and ran at 5MHz. The chip shipped in a 40-pin DIP. While Intel certainly intended this product to be available as a standalone CPU, it was also intended to be used as an attached processor in the iAPX 432 as a part of the logical I/O processor.
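To illustrate the prefetching idea, here is a toy model rather than the actual 8086 microarchitecture: a bus interface unit keeps a small queue of upcoming instruction bytes topped up while the execution unit consumes them. The queue depth matches the 8086’s six bytes, but the pretend memory contents and the loop structure are invented for the demo.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define QUEUE_DEPTH 6          /* the 8086 used a 6-byte prefetch queue */

static uint8_t memory[64];     /* pretend code memory */
static uint8_t queue[QUEUE_DEPTH];
static int queue_len = 0;
static unsigned fetch_ptr = 0; /* next byte the BIU will fetch */

/* BIU: fill the queue while there is room and code left to fetch. */
static void biu_prefetch(void)
{
    while (queue_len < QUEUE_DEPTH && fetch_ptr < sizeof memory)
        queue[queue_len++] = memory[fetch_ptr++];
}

/* EU: consume one instruction byte from the front of the queue. */
static int eu_next_byte(uint8_t *out)
{
    if (queue_len == 0)
        return 0;                       /* the EU would stall here */
    *out = queue[0];
    memmove(queue, queue + 1, --queue_len);
    return 1;
}

int main(void)
{
    for (unsigned i = 0; i < sizeof memory; i++)
        memory[i] = (uint8_t)i;         /* fake instruction stream */

    uint8_t b;
    for (int step = 0; step < 10; step++) {
        biu_prefetch();                 /* bus otherwise idle: top up the queue */
        if (eu_next_byte(&b))
            printf("EU executes byte %02X (queue now %d deep)\n", b, queue_len);
    }
    return 0;
}
```

The point of the split is simply that the execution unit almost never has to wait on the bus for its next instruction byte, because the bus interface unit has already fetched it.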
Of course, releasing just a CPU wouldn’t help anyone. A CPU needs to have support chips to allow it to be truly useful. This was one area where Intel delivered quite well. The 8086 could be paired with the 8237 DMA controller, 8251 USART, 8253 programmable interval timer, 8255 PPI, 8259 PIC, 8279 keyboard/display controller, 8282 or 8283 8bit latch, 8284 clock generator, 8286 or 8287 bidirectional 8bit driver, 8288 bus controller, 8289 bus arbiter, and 8272A floppy controller.
With the 8086 on the market, Intel began work on the 80286. The design was large enough and complicated enough that Intel had an employee tasked solely with maintaining an indexed box of cards that tracked which plans were on what database. Over in Haifa, Intel engineers were at work to make a cost reduced version of the 8086 and its support chips.
Intel’s 4K and 16K DRAMs continued to gain market share, static RAMs continued to serve those requiring high speed memory, and ROMs continued to sell. While overshadowed by the 8086, Intel released a 32K EPROM in 1978. Naturally, Intel released the iSBC 86/12 with the 8086 as its processor, and the Intellec Series II which could be had with an 8086, 8085, 8080, 8048, or 8051. Intel closed 1978 with income of $44,314,000 on revenues of $400,620,000.
1979 saw Intel making still more investments in equipment and facilities, the Systems Division relocating to Phoenix, HMOS-II rolling out with speed improvements of about 30% over the original HMOS, Intel Magnetics producing a million-bit bubble memory chip, and Intel’s team in Haifa producing the 8088. The Intel 8088 was effectively identical to the 8086 but used an 8bit data bus. This allowed the 8088 to be used with much cheaper 8bit support ICs at the cost of performance. With an 8bit bus, the 8088 required two cycles to get 16bits of data from memory. The prefetch queue was also shortened to just four bytes from six. The Intel 8089 I/O coprocessor was also introduced in 1979. In executive leadership, Gordon Moore became chairman of the board while remaining CEO, Bob Noyce became vice chairman, and Andy Grove became president and COO. They closed the year with $662,996,000 in revenues and $77,804,000 in profits. For all of these early years, it is worth noting that taxes were effectively cutting the income numbers in half.
Up to this point, microprocessors were not Intel’s business. Intel was a memory company first, second, and last. For them, their CPUs were a way to sell more memory. After all, there would never be as many mainframes or minis as there could be micros, and all of these systems needed memory. Still, the 8080 had been making money whether or not people used Intel memories. Making money was something the 8086 wasn’t doing. Sales of the 8086 didn’t meet expectations, and both the Zilog Z8000 and the Motorola 68000 launched in 1979. These competing chips were more powerful than the 8086 in some of their variants, such as the Z8001, which could address more memory (8MB), clocked higher, and had more registers. The 68000 didn’t use memory segmentation, offered even more memory than the Z8001 (16MB), and had a 32bit design with a 16bit data bus. It was an easy transition for programmers who’d become accustomed to the VAX, and it clocked higher. While Motorola had never been a real threat to the 8080, the 68000 was objectively more powerful than the 8086 and could potentially take serious market share. Motorola was also a much larger company than Intel with more resources. Zilog, on the other hand, had handily taken the market with the Z80, and Intel had good reason to fear Faggin and Shima’s next chip. Both of these chips were getting design wins, and Intel was in third place for the year. The answer to these two serious and competent competitors was Operation Crush.
Operation Crush really began in November when Don Buckout sent an eight-page telex to Intel’s senior management informing them that Intel was consistently losing business and that there was no way to win with the 8086. Bill Davidow, who ran the microcomputer systems division, copied the telex, handed it out at the next executive meeting less than a week later, and added fire to the issue. He stated that the microprocessor team was killing his business unit as well. Andy Grove looked at him and told him to solve the problem. Shortly thereafter a group (Dave House, Jim Lally, Regis McKenna, Jeff Katz, Richie Bader, Casey Powell, Don Buckout, Bill Davidow) met to design the response. Jim Lally gave posterity quite the quote:
Are we doing this as an exercise to improve our position in the marketplace, and achieve the recognition that we deserve or are we doing this to fucking kill Motorola? Because the other guys are so stupid that they won’t make any difference at all. The Z8000 is nothing because Zilog doesn’t know what it’s doing. The 16016 is even more of a joke because National doesn’t know what it’s doing. There’s only one company competing with us, and that’s Motorola. The 68000 is the competition…. We have to kill Motorola, that’s the name of the game. We have to crush the fucking bastards. We’re gonna roll over Motorola and make sure they don’t come back again.
As Lally not so eloquently stated, Zilog had some serious issues. Without the Zilog 8010 memory management unit, the Z8000 was essentially limited to just 64K RAM, and the 8010 wasn’t available at the time of the CPU’s introduction. What he referred to as the National 16016 would be more recognizable today as the NS32000 series of CPUs, or the Swordfish microcontrollers. These chips were meant to be single chip implementations of a VAX-11. They were, however, severely delayed and not available at the time. This meant that Motorola was really the primary competitor as they saw it, and it was Motorola who had the most design wins. If you must compete, compete for first place. They were out to crush the competition.
A quick note about the name. The name for Operation Crush came from Lally’s love of the Denver Broncos, who wore orange and had the Orange Crush defense, his love for the Orange Crush soft drink, and the notion of crushing one’s competition. The legal department didn’t really like any of this.
The key strength that Intel had in this effort was their supporting chip ecosystem. If a company chose the 8086, they immediately got access to all of the support chips, and those chips were available in quantity even if they weren’t the best possible technology of the time. A chip that may be poor is better than a chip that doesn’t actually exist. If a company chose Motorola or Zilog, they’d need to invest a significant amount of time, money, and expertise in building out their system. Intel also had software to help. They had compilers, debuggers, and in-circuit emulators. In short, if a company chose the 8086 their time to market would be considerably shorter, and their design costs would be considerably lower. With this sober analysis of the company’s strengths and weaknesses, it became obvious to them that it was time to sell to CEOs and not to programmers. The resultant sales and ad campaign was the largest that Intel had ever run at the time, and it cost the company $2 million. To encourage their sales team, Intel offered a trip to Tahiti for the teams (and spouses of those team members) who met their respective quotas, and in the end, all of the sales force went to Tahiti, which cost about another $2 million. To enhance the pressure to win, Intel’s management sent Tahitian tourism brochures to the sales teams’ spouses.
The most critical part of the entire effort came down to people being willing to sacrifice their own departmental prestige and ego for the sake of the company-wide effort. This isn’t normal in any bureaucratic organization. Grove and Noyce addressed the company and said this was the mission, and the entire company pivoted. A company with revenues over $2.5 billion in 2025 dollars changed directions completely in under two weeks. The second major thing that allowed this switch was that several Japanese companies had entered the memory business and were driving prices down on an already low-margin product. While the company’s numbers weren’t yet showing the struggle, people in the business were acutely aware of it. Within the memory business, Intel was now the leader only in static RAMs and EPROMs.
As part of Crush, Intel held seminars on their products, and they had a futures catalog that provided some data on the next generations of Intel’s 8086-compatible CPUs, namely the 186 and 286. The idea was that should one choose Intel now, he or she would have a decade of products ready to go. Intel was no longer selling just a CPU; they were selling an entire architecture of CPUs and the associated support chips.
Further, the Crush team began to realize that Zilog couldn’t keep up in manufacturing, they couldn’t keep up in the number of products they could manage, and they couldn’t match the sales force or support that Intel could offer. Meanwhile, given Motorola’s size, they couldn’t pivot as quickly, and Motorola had a single sales force that couldn’t focus on microprocessors alone. All of this worked in Intel’s favor. Operation Crush had a goal of 2000 design wins; they achieved more than 2300.
Given that the 8086 was rather slow at floating point, Intel released the 8087 coprocessor in 1980. Depending upon the operation, an 8087 was able to perform arithmetic anywhere from 50% to 500% faster than an 8088 or 8086 alone. The release of the 8087 was the culmination of work that had begun in 1977 when Pohlman had initially conceived of the chip. John Palmer served as the mathematician for the project, Ravenel was the architect, and the project was later shifted over to Haifa where Rafi Nave led the implementation effort. Robert Koehler and John Bayliss were responsible for figuring out how to get instructions offloaded to the coprocessor from the central processor.
With increased competition in memories, a sluggish global economy, and all of Intel’s prior investments in production coming to fruition, Intel had a problem with oversupply. Despite these factors, Intel posted a profit of $96,741,000 on revenues of $854,561,000.
On the 12th of August in 1981, IBM launched the PC with the Intel 8088 at its heart. IBM’s choice of the 8088 came down to a combination of several factors. The PC needed to be cheap enough for small businesses and individuals to purchase, and this would easily explain the choice of the 8088 over the 8086. Not only was the 8088 cheaper, all of its support chips were too. Then, there’s the IBM directive that the PC be made with off-the-shelf components. For Zilog or Motorola, this wouldn’t have been as easy; they didn’t have the large number of support chips. Further importance was given to those support chips as the IBM project was only given a year, so design time was scarce. The value of Intel’s complete ecosystem of components is best demonstrated by the fact that the design for the PC’s motherboard was completed in just 40 days. Intel had a history of successful volume delivery, and I can only imagine this too played a large part in IBM’s decision making process. Intel’s success with the 8080, which had already created a standard around CP/M, VisiCalc, and WordStar, no doubt also helped, considering Intel’s automated translation tooling.
At launch, the IBM PC 5150 started at $1565. The base model included the 8088 clocked at 4.77MHz, 16K RAM, and the best keyboard ever made by humankind. With this base model, users of the computer were expected to use cassette tapes for data storage. In my life, I have never heard of, read about, or met a single person who ever used cassettes to load or save data on an IBM PC. For those with deeper pockets, the addition of a 5.25 inch floppy disk drive and a monochrome monitor would raise the price to $2880. For those who wished to part with as much of their money as possible, a 5150 with 256K RAM, a color monitor, and dual floppies would have cost around $6000. In all cases, the PC had 5 expansion slots, but the 5161 expansion chassis was also available providing space for two Winchester disks and 8 more expansion slots. For these first machines, those prices did not include any software other than BASIC in ROM, but IBM was more than willing to sell their customers some software. The PC is always associated with the rise of PC-DOS, and at launch this was the only OS available. CP/M-86 and UCSD p-System became available shortly after. All of these specifications changed rather quickly as IBM released upgraded versions of the PC.
Within four months, IBM sold 65,000 PCs. The PC platform, and with it the 8088/8086, swiftly became a standard around which many other companies would bloom. By the end of the year, around 5000 products had been designed around the 8086/8088. Intel had conquered the market, but they may not have known that in 1981. Intel’s revenues were down to $788,676,000 and profits down more significantly to $27,359,000. Much of this was due to downward pressure on memory prices and still more competition in that market, but some was also due to capital expenditures of $154 million.
I now have readers from many of the companies whose history I cover, and many of you were present for time periods I cover. A few of you are mentioned by name in my articles. All corrections to the record are welcome; feel free to leave a comment.