
The Story of Intel

The History of Computing

Release Date: 03/07/2023


We’ve talked about the history of microchips and transistors, and about other chip makers. Today we’re going to talk about Intel in a little more detail. 

Intel is short for Integrated Electronics. The company was founded in 1968 by Robert Noyce and Gordon Moore. Noyce was an Iowa kid who went off to MIT to get a PhD in physics in 1953. He then joined the Shockley Semiconductor Laboratory to work with William Shockley, who’d co-developed the transistor as a solid-state alternative to vacuum tubes in computers and amplifiers.

Shockley became erratic after he won the Nobel Prize, and 8 of the researchers left, a group now known as the “traitorous eight.” Between them they would go on to found over 60 companies, including Intel - but first they created a new company called Fairchild Semiconductor, where Noyce invented the monolithic integrated circuit in 1959: a single chip that contains multiple transistors. 

After 10 years at Fairchild, Noyce joined up with coworker and fellow traitor Gordon Moore. Moore had gotten his PhD in chemistry from Caltech and, while at Fairchild, had observed that the number of transistors, resistors, diodes, or capacitors in an integrated circuit was doubling every year. He coined Moore’s Law: that it would continue to do so. Together they wanted to make semiconductor memory cheaper and more practical.

They needed money to continue their research. Arthur Rock had helped them find a home at Fairchild when they left Shockley and helped them raise $2.5 million in backing in a couple of days. 

The first day of the company, Andy Grove joined them from Fairchild. He’d fled the Hungarian revolution in the 50s and gotten a PhD in chemical engineering at the University of California, Berkeley. Then came Leslie Vadász, another Hungarian émigré. Funding, and money coming in from sales, allowed them to hire some of the best in the business: people like Ted Hoff, Federico Faggin, and Stan Mazor.

That first year they released 64-bit static random-access memory in the 3101 chip, doubling what was on the market, as well as the 3301 read-only memory chip and the 1101. Then came DRAM, or dynamic random-access memory, with the 1103 in 1970, which became the best-selling chip within its first couple of years.

Armed with a lineup of chips and an explosion of companies that wanted to buy the chips, they went public within 2 years of being founded. 1971 saw Dov Frohman develop erasable programmable read-only memory, or EPROM, while working on a different problem. This meant they could reprogram chips using ultraviolet light and electricity.

In 1971 they also created the Intel 4004 chip, a project that started in 1969 when a calculator manufacturer out of Japan asked them to develop 12 different chips. Instead they made one that could do all of the tasks of the 12, outperforming the ENIAC from 1946, and so the era of the microprocessor was born. And instead of taking up the basement of a university lab, it took up an eighth of an inch by a sixth of an inch to hold a whopping 2,300 transistors. The chip didn’t contribute a ton to the bottom line of the company, but they’d built the first true microprocessor, which would eventually be what they were known for.

Instead, they were making DRAM chips. But then came the 8008 in 1972, ushering in the 8-bit CPU. Their memory chips were being used by other companies developing their own processors, but Intel knew how to build processors too, and Computer Terminal Corporation was looking to develop what was a trend for a hot minute: programmable terminals. And given the doubling of speeds, those gave way to microcomputers within just a few years.

The Intel 8080 was a 2 MHz chip that became the basis of the Altair 8800, SOL-20, and IMSAI 8080. By then Motorola, Zilog, and MOS Technology were hot on their heels, releasing the 6800, Z80, and 6502 processors. But Gary Kildall wrote CP/M, one of the first microcomputer operating systems, initially for the 8080 before porting it to other chips.

Sales had been good and Intel had been growing. By 1979 they saw that the future was in microprocessors and opened a new office in Haifa, Israel, where they designed the 8088, which clocked in at 4.77 MHz. IBM chose this chip for the original IBM Personal Computer. IBM was going to use an 8-bit chip, but the team at Microsoft talked them into going with the 16-bit 8088, and thus was born the foundation of what would become the Wintel architecture, or x86, which would dominate the personal computer market for the next 40 years.

One reason IBM trusted Intel is that they had proven to be innovators. They had effectively invented the integrated circuit, coined Moore’s Law, then built the first microprocessor, and by 1980 had grown into a 15,000-person company capable of shipping product in large quantities. They were intentional about culture, looking for openness and distributed decision making, and trading off bureaucracy for figuring out cool stuff.

That IBM decision to use that Intel chip is one of the most impactful in the entire history of personal computers. Because Microsoft DOS and then Windows could run on the architecture, nearly every laptop and desktop would run on that original 8088/x86 architecture. And because of the standards, Intel and Microsoft could both market that their products ran not only on those IBM PCs but also on any PC using the same architecture, and so IBM’s hold on the computing world would slowly wither.

On the back of all these chips, revenue shot past $1 billion for the first time in 1983. IBM bought 12 percent of the company in 1982 and thus gave them the Big Blue seal of approval, something important even today. And the hits kept on coming, with the 286 through the 486 arriving during the 1980s.

Intel brought the 80286 to market and it was used in the IBM PC AT in 1984. This new chip brought new ways to manage addresses: it was the first Intel chip that could do memory management and the first with protected mode, which opened the door to virtual memory and multitasking. All of this was made possible with over a hundred thousand transistors. At the time the original Mac used a Motorola 68000, but Mac sales were sluggish while they flourished at IBM, and slowly we saw the rise of companies cloning the IBM architecture, like Compaq - still using those Intel chips. 

Jerry Sanders had also left Fairchild, shortly after Noyce and Moore, to found AMD, which ended up cloning the instructions in the 80286 after entering into a technology exchange agreement with Intel. This let AMD make the chips at volume and sell them on the open market, and AMD would go on to fast-follow Intel for decades.

The 80386 would go on to simply be known as the Intel 386, with over 275,000 transistors. It launched in 1985, but we didn’t see a lot of companies use it until the early 1990s. The 486 came in 1989. Now we were up to a million transistors, along with a built-in math coprocessor. We were 50 times faster than the 4004 that had come out less than 20 years earlier. 

I don’t want to take anything away from the phenomenal run of research and development at Intel during this time, but the chips and cores and amazing developments almost seemed to be on autopilot. The 80s also saw them invest half a billion dollars in reinvigorating their manufacturing plants. With quality manufacturing allowing for a new era of printing chips, the 90s were just as good to Intel. I like to think of this as the Pentium decade, with the first Pentium arriving in 1993. 32-bit, here we come. Revenues jumped 50 percent that year, closing in on $9 billion.

Intel had been running an advertising campaign around Intel Inside. This represented a shift in branding from the IBM PC itself to the Intel chip inside it. The Pentium Pro came in 1995, and we’d crossed 5 million transistors in each chip. The brand equity was rising fast, and more importantly, so was revenue. 1996 saw revenues pass $20 billion. The personal computer was showing up in homes and on desks across the world, and most had Intel Inside - in fact, we’d gone from Intel Inside to Pentium Inside.

1997 brought us the Pentium II with over 7 million transistors, the Xeon came in 1998 for servers, and the Pentium III followed in 1999. In 2000 they introduced Intel’s first gigahertz processor and announced the next generation after Pentium: Itanium, meant to finally move the world to 64-bit processors. 

As gains in processor clock speed slowed, they were able to bring multi-core processors and massive parallelism out of the hallowed halls of research and onto the desktop computer in 2005.

2006 saw Intel chips power not just Windows machines but the Mac as well, as Apple moved from PowerPC to Intel. Then in 2007 came 45-nanometer logic technology, using hafnium-based high-k material for transistor gates - a shift from the silicon gates of the 60s that allowed them to pack hundreds of millions of transistors into a single chip. i3, i5, i7, and on. The chips now have over a couple hundred million transistors per core, and with 8 cores on a chip that potentially puts us over 1.7 or 1.8 billion transistors per chip.

Microsoft, IBM, Apple, and so many others went through huge growth and sales jumps, then retreated as they figured out how to run companies of the size they had suddenly become. That led each of them to invest heavily in R&D to pull out of what was effectively a lost decade - like when IBM built the S/360 or Apple developed the iMac and then the iPod.

Intel’s strategy had been research and development. Build amazing products and they sold. Bigger, faster, better. The focus had been on power. But mobile devices were starting to take the market by storm, and ARM chips were more popular there because, with a reduced instruction set, they could use less power and be a bit more versatile. 

Intel coined Moore’s Law. They know that if they don’t find ways to pack more and more transistors into smaller and smaller spaces, then someone else will. And while they haven’t been huge in the RISC-based System on a Chip space, they do continue to release new products and look for the right product-market fit, just like they did when they went from DRAM and SRAM to producing the types of chips that made them into a powerhouse. And on the back of a steadily rising revenue stream that’s now over $77 billion, they seem poised to be able to weather any storm - not only on the strength of R&D, but also some of the best manufacturing in the industry. 

Chips today are so powerful and small that they contain what would have been a whole computer in the era of those Pentiums, just as that 4004 chip contained a whole ENIAC. This gives us a nearly limitless canvas to design software. Machine learning on a SoC expands the reach of what that software can process. Technology is moving so fast in part because of the amazing work done at places like Intel, AMD, and ARM. Maybe that positronic brain Asimov promised us isn’t as far off as it seems. But then, I thought that in the 90s as well, so I guess we’ll see.