Integrated Circuits (Part 2)

Resources

Video Script

The device that Jack Kilby pioneered is shown here: the first integrated circuit. What we have here is a piece of germanium with several components printed directly onto the semiconductor material and a few wires coming off of it. And while it may not look like much now, if you connect the wires up properly, you can actually see a sine wave on an oscilloscope produced by this device. They call this device an integrated circuit because the entire circuit, all of the components and the wiring between them, is built directly onto the piece of germanium.

Now, of course, you might realize that we don’t make computer circuits out of germanium today, so there was a little more work that needed to be done. That work came from another engineer named Robert Noyce. Robert Noyce worked at Fairchild Semiconductor and was pursuing a similar idea to Jack Kilby’s. However, he decided that instead of using germanium he would build an integrated circuit using silicon, a very similar element. It turns out that silicon was a much better choice, and working independently at about the same time, he was able to overcome some of the flaws in Kilby’s design.

A few years later, Robert Noyce left Fairchild Semiconductor along with Gordon Moore, another engineer, to create their own company focused on developing and building these integrated circuits. Do you want to guess what that company is? That company is Intel. It was founded by Robert Noyce, Gordon Moore, and Andy Grove, pictured here in 1968. And unlike Fairchild Semiconductor, which didn’t really see as much value in the integrated circuit, Intel very quickly realized that integrated circuits would be the future. So they focused on designing, creating, and manufacturing these new integrated circuits using semiconductors.

Gordon Moore is another important engineer in the history of computer science. You’ve probably heard of Gordon Moore because of his namesake, Moore’s Law. In 1965, Gordon Moore wrote a paper called “Cramming More Components onto Integrated Circuits,” and in that paper he discussed what the future of these integrated circuits might look like. He predicted that the number of components on a chip could double about every year to eighteen months, and he expected that trend to hold for only another ten years or so. Of course, it turns out he was much more correct than even he thought, as shown in the graph here. We have the number of transistors on a particular chip versus its date of introduction, and you’ll notice that throughout the lifetime of computers, all the way through about 2010, Moore’s Law held very fast. It was a very good way of measuring the rapid growth in the power of computer chips.
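To get a feel for how quickly that doubling compounds, here is a minimal sketch in Python. The starting count of 2,300 transistors (the Intel 4004) and the 18-month doubling period are illustrative assumptions for this example, not figures taken from Moore’s paper.

```python
# Rough Moore's Law sketch: transistor count doubling at a fixed interval.
# start=2300 (the Intel 4004) and an 18-month doubling period are assumed
# values used only to illustrate how exponential doubling compounds.

def transistors(years: float, start: int = 2300, doubling_years: float = 1.5) -> int:
    """Estimate transistor count after `years`, doubling every `doubling_years`."""
    return round(start * 2 ** (years / doubling_years))

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001):
        print(year, transistors(year - 1971))
```

Even with these rough assumptions, the count grows by roughly a factor of 100 per decade, which is why the growth looks so dramatic on the chart.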

So in 1971, Intel finally released their first central processing unit, or CPU, the Intel 4004, which is pictured here. This chip may look very small, but it had 2,300 transistors built onto it, all in a package the size of a small finger, and the circuits inside it are ten times smaller than a human hair. This was a really revolutionary chip, and it was used in a variety of places. In fact, one of the first devices to use this chip was an adding machine, such as the ones you see at banks or accounting offices today. This chip is a good example of a microcontroller: a chip that includes everything a device needs to think and function, all on a single little chip.

And so this is a chip similar to an Intel 4004 that has had its top shaved off. If that chip is the size of a finger, this little interior part is the size of the fingernail on that finger, and inside of there is where the transistors actually lie. Here you can see those filaments connecting to the wires, smaller than a human hair, that make this device work. These microcontrollers form the basis of today’s modern computers, but that’s only part of the story. The other part of the story is how we make today’s modern computers actually react to input and operate in the real world. We’ll take a look at that in the rest of this module.