Taiwan, TSMC, NVIDIA, and Foundries

The History of Computing

Release Date: 09/30/2022

Taiwan is a country about half the size of Maine with about 17 times the population of that state. Taiwan sits just over a hundred miles off the coast of mainland China. It’s home to some 23 and a half million humans, roughly halfway between the populations of Texas and Florida, or a few more than live in Romania, for the Europeans. Taiwan was connected to mainland China by a land bridge in the Late Pleistocene, and human remains have been found there dating back 20,000 to 30,000 years. About half a million people on the island nation are aboriginal, descended from those early inhabitants, but the population has become more and more Chinese in recent centuries.

Taiwan had not been part of China during the earlier dynastic ages but had been used by dynasties in exile to attack one another, and so it became a part of the Chinese empire in the 1600s. Taiwan was won by Japan in the late 1800s and held by the Japanese until World War II. During that time, a civil war raged on the mainland of China, with the Republic of China eventually formed as the replacement government for the Qing dynasty following a bloody period of turf battles between warlords and then civil war.

Taiwan was under martial law from the time the pre-communist government of China retreated there during the exit of the Nationalists from mainland China in the 1940s until the late 1980s. During that time, just like the exiled dynasties before them, they orchestrated war from afar. They stopped fighting, much like the Koreans, but have still never signed a peace treaty. And so large parts of the world remained in stalemate.

As the years became decades, Taiwan, or the Republic of China as it still calls itself, has always had an unsteady relationship with the People’s Republic of China, or China as most in the US call it. The Western world recognized the Republic of China, and the Soviet-aligned countries recognized the mainland government. US President Richard Nixon visited mainland China in 1972 to re-open relations with the communist government there, and relations slowly improved.

The early 1970s were a time when much of the world still recognized the ruling government of Taiwan as the official Chinese government, and there were proxy wars the two continued to fight. The Taiwanese and Chinese still aren’t besties. There are deep scars and propaganda that keep relations from being repaired.

Before World War II, the Japanese also invaded Hong Kong. During the occupation there, Morris Chang’s family became displaced and moved between a few cities during his teens before he moved to Boston to go to Harvard and then MIT, where he did everything to get his PhD except defend his thesis. He then went to work for Sylvania Semiconductor and then Texas Instruments, finally getting his PhD from Stanford in 1964. He became a Vice President at TI and helped build an early semiconductor designer and foundry relationship when TI designed a chip and IBM manufactured it.

The Premier of Taiwan at the time, Sun Yun-suan, played a central role in Taiwan’s transformation from an agrarian economy to a large exporter. His biggest win was recruiting Chang to move to Taiwan and found TSMC, or Taiwan Semiconductor Manufacturing Company. Some of this might sound familiar as it mirrors stories from companies like Samsung in South Korea. In short: Japanese imperialism, democracies versus communists, then rapid economic development as a massive manufacturing powerhouse, in large part due to the fact that semiconductor designers were split from semiconductor foundries, where chips are actually created.

In this case, a former Chinese national was recruited to return as founder, and he led TSMC for 31 years before he retired in 2018. Chang could see from his time with TI that more and more companies would design chips for their needs and outsource manufacturing. They worked with Texas Instruments, Intel, AMD, NXP, Marvell, MediaTek, and ARM, and then came the big success when they started to make the Apple chips. The company started down that path in 2011 with trial runs of the A5 and A6 SoCs for the iPhone and iPad, but picked up steam with the A8 and A9 through the A14 and the Intel replacement for the Mac, the M1. They now sit on a half-trillion US dollar market cap and are the largest company in Taiwan. For perspective, their market cap is not far behind the GDP of the whole country.

Nvidia
TSMC is also a foundry Nvidia uses. As of the time of this writing, Nvidia is the 8th largest semiconductor company in the world. We’ve already covered Broadcom, Qualcomm, Micron, Samsung, and Intel. Nvidia is a fabless semiconductor company, and so designs chips that vendors like TSMC manufacture.

Nvidia was founded by Jensen Huang, Chris Malachowsky, and Curtis Priem in 1993 in Santa Clara, California (although it is now incorporated in Delaware). Not everyone who leaves the country they were born in due to war, or during times of war, returns. Huang was born in Taiwan, and his family moved to the US right around the time Nixon re-established relations with mainland China. Huang then went to grad school at Stanford before he became a CPU designer at AMD and a director at LSI Logic, so he had experience as a do-er, a manager, and a manager’s manager.

He was joined by Chris Malachowsky and Curtis Priem, who had designed the IBM Professional Graphics Adapter and then the GX graphics chip at Sun. They saw the graphical interfaces of the Mac, Windows, and Amiga OS, they saw the games one could play on those machines, and they thought graphics cards would be the next wave of computing. And so for a long time, Nvidia managed to avoid competition with other chip makers with a focus on graphics. That initially meant gaming and higher-end video production, but it has expanded into much more, like parallel programming and even cryptocurrency mining.

They were more concerned about the next version of the idea or chip or company, and used NV in the naming convention for their files. When it came time to name the company, they looked up words that started with those letters, which of course don’t exist in English - so they instead chose invidia, or Nvidia for short, as it’s Latin for envy - what everyone who saw those sweet graphics the cards rendered would feel.

They raised $20 million in funding and got to work, first with SGS-Thomson Microelectronics in 1994 to manufacture what they were calling a graphical-user interface accelerator that they packaged on a single chip. They worked with Diamond Multimedia Systems to install the chips onto the boards. In 1995 they released the NV1. The PCI card was sold as the Diamond Edge 3D and came with a 2D/3D graphics core with quadratic texture mapping. It was screaming fast, and Sega’s Virtua Fighter was ported to the platform.

DirectX had come in 1995, so Nvidia released DirectX drivers that supported Direct3D, the API that Microsoft developed to render 3D graphics. This was a time when 3D was on the rise for consoles and desktops. Nvidia timed it perfectly and reaped the rewards when they hit a million sold in the first four months for the RIVA, a 128-bit 3D processor that OEMs adopted in 1997. Then came the RIVA ZX and RIVA TNT in 1998, with multi-texture 3D processing. They also needed more manufacturing support at this point and entered into a strategic partnership with TSMC to manufacture their boards.

A lot of vendors had a good amount of success in their niches. By the late 1990s there were companies that made memory - the survivors of the DRAM industry after ongoing price dumping issues - and there were companies that made central processors, like Intel. Nvidia led the charge for a new type of chip, the GPU. They invented the GPU in 1999 when they released the GeForce 256, the first single-chip GPU. That meant integrated transform and lighting, triangle setup, and rendering on one chip, like the old math coprocessor but for video. Millions of polygons could be drawn on screens every second. They also released the Quadro Pro GPU for professional graphics and went public in 1999 at an IPO price of $12 per share.

Nvidia used some of the funds from the IPO to scale operations, organically and inorganically. In 2000 they released the GeForce2 Go for laptops and acquired 3dfx, closing deals to get their 3D chips into devices from OEM manufacturers who made PCs and into the new Microsoft Xbox. By 2001 they hit $1 billion in revenues and released the GeForce 3 with a programmable GPU, using APIs to make their GPU a platform. They also released the nForce integrated graphics, and so by 2002 had 100 million processors out on the market.

They acquired MediaQ in 2003 and partnered with game developer Blizzard on Warcraft. They continued their success in the console market when the GeForce platform was used in the PlayStation 3 in 2005, and by 2006 they had sold half a billion processors. They also added the CUDA architecture that year to put a general-purpose GPU on the market, and acquired Hybrid Graphics, which developed 2D and 3D embedded software for mobile devices.
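To give a sense of what “general purpose” means here, below is a minimal, hypothetical CUDA sketch - not Nvidia’s own sample code - that adds two arrays on the GPU. Thousands of lightweight threads each handle one element, the same kind of parallelism the hardware once reserved for drawing triangles.

```cuda
// Minimal sketch of general-purpose GPU computing with CUDA (illustrative only).
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// __global__ marks a function that runs on the GPU; each thread adds one element.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique index for this thread
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) memory.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Allocate device (GPU) memory and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```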

In 2008 they went beyond the consoles and PCs when Tesla used their GPUs in cars. They also acquired PortalPlayer, which supplied semiconductors and software for personal media players, and launched the Tegra mobile processor to get into the exploding mobile market. There were more acquisitions in 2008, but the huge win was when the GeForce 9400M was put into Apple MacBooks. Then came more small chips in 2009, when Tegra processors were used in Android devices.

They also continued to expand how GPUs were used. They showed up in ultrasound machines and, in 2010, in the Audi. By then their chips were in the Tianhe-1A supercomputer, and they had released Optimus. All these types of devices that could use a GPU meant they hit a billion processors sold in 2011, which is when they went dual-core with the Tegra 2 mobile processor and entered into cross-licensing deals with Intel.

At this point TSMC was able to pack more and more transistors into smaller and smaller spaces, and these were big years for larger jobs on the platform. By 2012, Nvidia had the Kepler-based GPUs out, and their chips were used in the Titan supercomputer. They also released GRID, a virtualized GPU for cloud processing.

It wasn’t all about large-scale computing efforts. The Tegra 3 and GTX 600 came out in 2012 as well. Then 2013 brought the Tegra 4, a quad-core mobile processor; a 4G LTE mobile processor; the Nvidia Shield for portable gaming; the GTX Titan; and a GRID appliance. 2014 brought the 192-core Tegra K1, the Shield tablet, and Maxwell. In 2015 came the Tegra X1 with 256 cores for deep learning, the Titan X and Jetson TX1 for smart machines, and Nvidia Drive for autonomous vehicles. They continued that deep learning work with an appliance in 2016, the DGX-1, and the Drive got an update in the form of the PX 2 for in-vehicle AI. By then, they were a 20-year-old company working on the 11th generation of the GPU, and most CPU architectures had dedicated cores for machine learning options of various types.

2017 brought the Volta GPU architecture and the Jetson TX2, and the SHIELD got Google Assistant support. 2018 brought the Turing GPU architecture, the DGX-2, the AGX Xavier, and Clara. 2019 brought the AGX Orin for robots and autonomous or semi-autonomous piloting of various types of vehicles. They also made the Jetson Nano and Xavier, and EGX for edge computing. At this point there were plenty of people using the GPUs to mine hashes for various blockchains, as with cryptocurrencies, and ARM had finally given Intel a run for its money, with designs from the ARM alliance showing up in everything but Windows devices (so Apple and Android). So Nvidia tried to buy ARM from SoftBank in 2020. That deal eventually fell through, but it would have been an $8 billion windfall for SoftBank, since they paid $32 billion for ARM in 2016.

We probably don’t need more consolidation in the CPU sector. Standardization, yes. Some of Nvidia’s top competitors include Samsung, AMD, Intel, Qualcomm, and even companies like Apple, who make their own CPUs (but not their own GPUs as of the time of this writing). In their niche, Nvidia can still make well over $15 billion a year.

The invention of the MOSFET came from immigrants Mohamed Atalla, originally from Egypt, and Dawon Kahng, originally from Seoul, South Korea. Kahng was born in Korea in 1931 but immigrated to the US in 1955 to get his PhD at The Ohio State University and then went to work for Bell Labs, where he and Atalla invented the MOSFET, and where Kahng eventually retired. The MOSFET was an important step on the way to the microchip.

That microchip market, with companies like Fairchild Semiconductor, Intel, IBM, Control Data, and Digital Equipment, saw a lot of chip designers have their chips knocked off, either legally in a clean room or illegally outside of one. Some of those cases ended in legal action, some didn’t. But the fact that factories overseas could reproduce chips was a huge part of the movement that came next, which was that companies started to think about whether they could just design chips and let someone else make them. That was in an era of increasing labor outsourcing, when factories could build cars offshore, and so the foundry movement was born - companies that just make chips for those who design them.

As we have covered in this section and many others, many of the people who worked on these kinds of projects moved to the United States from foreign lands in search of a better life. That might have been to flee European or Asian theaters of Cold War jackassery, or it might have been a civil war like in Korea or Taiwan. They had contacts and were able to work with places to outsource to, and this happened at the same time that Hong Kong, Singapore, South Korea, and Taiwan became safe and free of violence. And so the Four Asian Tiger economies exploded, fueled by exports and a rapid period of industrialization that began in the 1960s and continues through to today with companies like TSMC, a pure-play foundry, or Samsung, a mixed foundry - aided by companies like Nvidia who continue to effectively outsource their manufacturing operations to companies in those areas. At least, while it’s safe to do so.

We certainly hope the entire world becomes safe. But it currently is not. There are currently nearly a million Rohingya refugees fleeing war in Myanmar. Over 3.5 million have fled the violence in Ukraine. 6.7 million have fled Syria. 2.7 million have left Afghanistan. Over 3 million are displaced between Sudan and South Sudan. Over 900,000 have fled Somalia. Before Ukrainian refugees fled, mostly to Eastern European countries, the world’s refugees had mainly settled in Turkey, Jordan, Lebanon, Pakistan, Uganda, Germany, Iran, and Ethiopia. Very few, comparatively, settled in the most populous countries in the world: China, India, or the United States.

It took decades for the children of those who moved abroad, or who were sent abroad in search of a better life, to actually find that better life. But we hope that history teaches us to get there faster, for the benefit of all.