Indistinguishable from Magic: Manufacturing Modern Computer Chips (2012) [video] (youtube.com)
183 points by taspeotis on Jan 18, 2018 | hide | past | favorite | 52 comments


Even old computer chips were magic. Really, photolithography is magic—it jumped us way ahead in our ability to miniaturize. It really seems like the kind of thing that, in fiction, would be introduced to a culture by aliens.

I've always wondered: could we have discovered/invented photolithography earlier, and thereby started the integrated-circuit revolution earlier? As far as I can tell, there was nothing holding someone back from inventing it in the 1800s, even: they had access to all the relevant chemicals, and (in some countries) lenses were already precision-ground enough to serve the purpose down to some pretty tiny process-nodes. We wouldn't have had digital logic (no transistors), but we could have been making "electronic" watches, tiny AC-to-DC power converters, and other analogue ICs even back then.

But heck: suppose we had known the principles behind semiconductive materials back then, too. We could have passed right by the vacuum-tube era and started in on inventing digital ICs (and boolean algebra.) What could the 1800s have looked like then?

(Is there any well-known science-fiction story exploring this premise? I might give a shot to writing it, if not...)


Well, in the 1800s steam locomotives were considered to be awesome high-tech. The famous HMS Rattler vs. Alecto trial which proved the superiority of propellers over paddles for ship propulsion was in 1845.

Even on the basic-science level, Maxwell published his seminal treatise on classical electromagnetism in 1873, and it wasn't until 1881 that Heaviside recast the equations in the form we know today, which more or less made them comprehensible to everyone else. Semiconductors were sort-of known already in the early 1800s, but it wasn't until, say, the first half of the 20th century that we really started to understand them, with the development of solid-state physics (both theory and experimental methods such as x-ray diffraction).

So yes, while some aspects of lithography could have been developed in the 1800s, I think there was quite a huge gap between that and a whole lot of other things needed.


We didn't have the semiconductor science needed to make effective ICs until we started building them.

The logic is relatively easy, but the materials science needed to decide what to lithograph, using which materials, and in what order, is post-WWII, and probably couldn't have been invented much earlier.

It takes a lot of work to work out ideal doping factors and diffusion rates, estimate charge distributions across junctions, calculate propagation speeds, and eliminate leakage.
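As a rough sketch of the kind of calculation involved (my illustration, with assumed order-of-magnitude numbers, not figures from the thread): Fick's law gives a characteristic dopant diffusion depth of about 2*sqrt(D*t), which is the sort of thing process engineers had to work out before useful junctions could be made:

```python
import math

def diffusion_length_cm(D_cm2_per_s: float, t_s: float) -> float:
    """Characteristic dopant diffusion length from Fick's law: L = 2*sqrt(D*t)."""
    return 2.0 * math.sqrt(D_cm2_per_s * t_s)

# Illustrative numbers (assumed, order-of-magnitude only): boron in silicon
# at ~1100 C has a diffusivity on the order of 1e-14 cm^2/s.
D = 1e-14          # cm^2/s
t = 3600.0         # a one-hour drive-in step
L_um = diffusion_length_cm(D, t) * 1e4  # convert cm -> micrometres
print(f"junction depth scale: {L_um:.2f} um")  # → roughly 0.12 um
```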

But more - the chemistry of semiconductor doping and deposition is often rather nasty, and it would have been a huge challenge to industrialise it before WWII.


I think this is a pretty reasonable answer...summed up, you could have all the steps but you were missing two things:

1) the theoretical knowledge of how the devices worked, which enabled the creation of steps in the process that resulted in fully working devices

2) the ability to do each step repeatably, precisely, and reliably.


Well, most chips these days use FETs, which don't really need semiconductors in the way we usually think about them... and FETs were invented (and largely forgotten) before bipolar transistors... but doped semiconductors do make it easy to build FETs.


We didn't even have a practical vacuum tube AND gate until 1930, and a formal treatment of logic design until 1936-1937. It's doubtful the need for ICs was well understood. And they would be expensive until there was a big enough market to produce them at scale.


IMO there were enough critical precursor technologies to start making discrete semiconductor devices from silicon by about the late 1920s, had a large industrial backer known the useful properties of such devices. They had most of the pieces for "how" by then but didn't discover the "why" until a generation later. And of course people spent a lot of time pursuing dead ends before discovering which industrial techniques would successfully combine to make silicon suitable for semiconductor devices, with properties surprisingly different from silicon of only 99.5% purity.

- The electric arc furnace, from the late 19th century, enabled the large scale production of crude silicon.

- The crystal bar process, from 1925, is similar to the Siemens process developed in the 1950s for refining purified volatile silicon compounds to electronic-purity elemental silicon.

- The Czochralski process, from 1915, permitted the growth of large single crystals of purified silicon.

There's a great review article from 1981, available through sci-hub, if you want more details about the history of purified silicon electronic devices:

"Twenty Five Years of Semiconductor-Grade Silicon"

DOI: 10.1002/pssa.2210640102


I don't think there was much incentive to make things smaller before the transistor was invented. Also, transistors are used in most analog electronics. You can't do much without them since they let you amplify signals and apply negative feedback.
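As a small illustration (mine, not the commenter's): the standard negative-feedback result is that closed-loop gain G = A / (1 + A*β) depends almost entirely on the feedback network β once the open-loop gain A is large, which is why amplifying devices made analog design practical. The numbers below are illustrative:

```python
def closed_loop_gain(A: float, beta: float) -> float:
    """Closed-loop gain of an amplifier with open-loop gain A and
    negative-feedback fraction beta: G = A / (1 + A*beta)."""
    return A / (1.0 + A * beta)

# Halving the open-loop gain barely changes the closed-loop gain;
# this insensitivity to device variation is the point of feedback.
g_high = closed_loop_gain(100_000, 0.01)  # ~99.90
g_low = closed_loop_gain(50_000, 0.01)    # ~99.80
print(g_high, g_low)
```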

But to answer your second question. I don't think we could have skipped the vacuum-tube era. It's because we already had applications for those sorts of electronics (and unreliable mechanical relays that were replaced with transistor switches) that researchers realized the potential for transistors in the first place. Even after the transistor was invented there was still a lot of research necessary to make them feasible in practice. The first one was germanium if I recall correctly and it took a while before silicon transistors were viable.

Edit: Also, sort of unrelated, non silicon semiconductors are still used a lot because they can switch faster, and different color LEDs use different semiconductors because they emit different colors of light, which is also related to why different color LEDs have different voltage drops.
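The LED point can be made quantitative (my sketch, with assumed typical wavelengths): a photon's energy is E = h*c/λ, and an LED's forward voltage drop is roughly its band-gap energy expressed in volts, so shorter-wavelength (bluer) LEDs need a larger drop:

```python
HC_EV_NM = 1239.84  # h*c expressed in eV*nm

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy E = h*c / lambda, in electron-volts."""
    return HC_EV_NM / wavelength_nm

# Illustrative wavelengths (assumed typical values for red and blue LEDs).
energies = {name: photon_energy_ev(nm)
            for name, nm in [("red ~650 nm", 650), ("blue ~450 nm", 450)]}
for name, ev in energies.items():
    # The forward drop is roughly the band-gap energy in volts.
    print(f"{name}: photon energy ~{ev:.2f} eV")
```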


One thing that everybody seems to gloss over is that photolithography made manufacturing discrete transistors practical, and the step from there to integrated circuits is not that big.

Edit: what you buy as a typical discrete transistor is in fact an integrated circuit containing tens to thousands of parallel-connected transistors.

Also, I believe photolithography-like processes were in use before semiconductors (and I would not be too surprised if the production of at least some mass-manufactured valves involved photolithography).


This is true for FETs but not for BJTs


I originally intended to use the IRF640 as an example, since for a HEXFET it is painfully obvious that there are separate parallel-connected transistors.

For typical BJTs it depends on how you define "separate transistor". The structure in a typical BJT has one contiguous junction, but its geometry is non-trivial enough that you can well think of it as multiple paralleled transistors.


I can highly recommend this documentary from 1943, which shows the manufacturing of piezoelectric quartz crystals. The amount of skilled manual labor involved in the process is quite staggering.

https://www.youtube.com/watch?v=b--FKHCFjOM

Photolithography in general substantially pre-dates the development of the IC. We were using it to produce printing plates in the 19th century. Breakthroughs in chemistry made photolithographic PCB manufacture possible by the early 1940s. It took us another 20 years and a huge array of discoveries and inventions to finally start producing useful ICs.

Development of IC manufacturing required a very broad tech tree - better chemistry, better optics, better materials science, better process engineering and, most crucially, the invention of p-n junctions. Without the p-n junction, any effort to etch a practical integrated circuit is pretty much futile.


Don't know what you mean by analogue ICs without semiconductors.

Anyway, perhaps the problem was more that (almost) nobody back then saw the potential of this technology.


Think of a PCB, but at microchip scale. If you don't have semiconductor technology you can still lay down layers of metal using photolithography, and thus make tiny circuits. I'm not sure what the limits of such technology are, but it seems interesting.


> What could the 1800s have looked like then?

If it happened before the Battle of Jena, we might have found ourselves in a world of technologically-ensconced monarchies.


Hey now, we still might.


It's not exactly what you're looking for but in that vein The End of the World has a similar theme.


Hi...presenter here. This has been posted a couple times and it's always flattering and a bit nerve wracking.

Happy to answer questions; there are a couple of errors and misspeaks. Note I'm not in that field anymore (and, thankfully, I'm in much better physical shape).


I’d be very interested in hearing a quick summary of the last six years. Particularly how they compare to your predictions in the future slide at the end of your presentation (about 51 minutes in).


Great! I'm the uploader (to YouTube, not HN). I was present at the time and also gave a talk at that HOPE. I loved your talk and I'm happy to see it surface again here by surprise.

I uploaded it at the time as I wanted to subtitle this for my deaf (and also non English speaker) teacher, but I lost the password for that YouTube account mid progress. Maybe I should try to reset it.

Thank you!


REALLY!?

1) thank you for the effort to make cons more inclusive

2) would love to help...would having the slides help?


Could you send the subtitles to me too? That would be really interesting! (and if your teacher is non-English-speaker, a translation to Chinese would be amazing if you have that).


I'm trying to find the access to that old YouTube account. I can't remember how much I got to subtitle, maybe around 20-30%, but I'd like to finish that work :)


I'm happy to send you the actual slides if that helps as well.


I'm just curious as to what you're doing now if you're no longer in that field.


sorry, wasn't great about checking this.

Right now, finishing up a PhD in engineering education where I study the development of engineering students' beliefs about knowledge and how those interact with their development of entrepreneurial competence.


What's the single best way to get involved in the industry? Should I bother getting a degree or just go straight for hands on manufacturing experience? Are there any universities known for good programs?


It depends on what you want to do. Do you want to just monitor tools running production? Do you want to actively work on the tools in the fab? Want to be a process engineer? Be a device design engineer? Work in process development?

Each is different...a college degree will never hurt and a master's is pretty normal for fab engineers and a PhD for process developers.

As for schools, it definitely depends in a similar way, but most top technical schools will get you recruited. A standout is Rochester Institute of Technology, which has an undergrad degree in microelectronics.


I studied Electronic Systems Engineering at Lancaster University in the UK, and graduated with a Masters in 2011.

Now I'm working for a tech company in Taiwan that does the next process step: dicing the wafers into individual dies, and packaging the dies into chips. There's a trolley downstairs with over 1 petabyte on it - stacked high with trays of 16GB memory cards from the testing machine.

Unfortunately I only managed to get a job writing monitoring software, equipment control drivers, and business planning software, so I don't get to handle the wafers myself.

In my opinion, you'll need to learn Chinese if you want to work for the lithography companies, or work for the German/Japanese/Swiss companies making the machines that take in the FOUPs. If you want to simply be a technician, most of the migrant workers who operate the machines here come from the Philippines and get paid very little while working very long hours with no path to promotion (all the line managers are Taiwanese). On the bright side though, the technicians speak better English, so it's easier to talk to them directly!

I enjoyed the circuit design/FPGA programming courses in university, but the projects to design ASICs usually take several years, so I couldn't get work experience through summer jobs (which is how I ended up in software).


I was in that industry about 20 years ago, helping design one of the major machines still in production. Since then I've moved on to many different things. The best way to get into this field is to get a materials science/engineering degree. Sadly there has been a lot of consolidation among fabs, plus most of them moved overseas, so it's hard to get any hands-on experience. There are also lots of technician roles in the fabs, but with automation and improved reliability, those may not be so easy to get either. Your best shot for hands-on exposure would be an equipment manufacturer like Applied Materials or Lam Research.


Back when I was majoring in ECE at UIUC, we got to play around with fab labs :). Yeah, you'd pretty much need an advanced degree in EE or physics to be a semiconductor engineer, I'd reckon.


Surely there are entry level technician jobs though?


For those you are typically looking at an AS or even a BS. They look for strong reliability, precision and attention to detail, and hands-on skills. Strong evidence of an ability to learn is also gold.

Plus most of those guys work three or four 12-hour shifts a week... great schedule. No work goes home with you.


Tell us more about clean rooms!

What I remember from class is that it's a fan on the roof, blowing air through a very fine filter.

The most magic part for me is that this still works even though the PM2.5 pollution outside is terrible (the AQI is over 150 today, which causes me to sneeze and get red eyes, and is known to increase the risk of cancer, so everyone wears dust masks outside).


Enormous spaces with laminar flow. So cool! Most relaxing place in the world - wonderful background white noise for getting work done.

Most cleanrooms are now class 10 (meaning at most 10 particles half a micron or larger per cubic foot) compared to a class 1,000 operating room. The wafers are all sealed in the 'cleanroom within the cleanroom' spaces of tools and FOUPs.

I have heard of fabs having problems from sources as strange as nearby farms and earthquakes halfway around the world. The filtering (ultra-HEPA filters) will absolutely clear up your allergies almost overnight.
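For a sense of scale (my conversion, using the old FED-STD-209 convention where class N allows at most N particles 0.5 µm or larger per cubic foot of air):

```python
FT3_PER_M3 = 1.0 / 0.0283168  # cubic feet per cubic metre (~35.31)

def particles_per_m3(fed_std_class: float) -> float:
    """FED-STD-209 class N allows at most N particles >= 0.5 um per ft^3;
    convert that limit to particles per cubic metre."""
    return fed_std_class * FT3_PER_M3

print(f"class 10:   <= {particles_per_m3(10):.0f} particles/m^3")    # ~353
print(f"class 1000: <= {particles_per_m3(1000):.0f} particles/m^3")  # ~35315
```

Compare that with typical urban air, which carries tens of millions of such particles per cubic metre.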


You have a bunch of nice images. Are those taken with "standard" microscopes?

Also you mentioned a background in metrology. How do you inspect something a few atoms thick?


The device level images are a mix of Scanning Electron Microscope (SEM) and visual microscope images.

A bunch of different ways, depending on the material being measured, but sheet resistance (effectively, you can back-calculate the thickness by measuring the resistance of a square of known size) and light scattering off the surface are the most common.
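The sheet-resistance trick rests on Rs = ρ/t: any square of a uniform film has the same resistance Rs, so one measurement plus a known resistivity gives the thickness. A minimal sketch with assumed illustrative numbers (not figures from the talk):

```python
def film_thickness_nm(resistivity_ohm_m: float, sheet_res_ohm_sq: float) -> float:
    """Sheet resistance Rs = rho / t, so t = rho / Rs.
    A square of any size measures Rs, giving thickness directly."""
    return resistivity_ohm_m / sheet_res_ohm_sq * 1e9  # metres -> nm

# Assumed values: bulk-aluminium resistivity (~2.7e-8 ohm*m) and a
# measured sheet resistance of 0.27 ohm per square.
t = film_thickness_nm(2.7e-8, 0.27)
print(f"film thickness ~{t:.0f} nm")  # → ~100 nm
```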


Jeez, it really is like magic. Awesome presentation–thanks! (Also, congrats on getting in better shape.)


Thanks :) but working 14hrs a day 6 days a week helps.


Any thoughts on germanium (not GeSi, but Ge without or with little Si) or carbon based chips?


Unfortunately, not really. I'm not super familiar with germanium. I have some experience with gallium arsenide.


Great presentation!


"We're making things smaller than the wavelength of light [that] we're using to make the features"

Incredible.
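The quoted point can be made concrete with the Rayleigh resolution criterion, CD = k1 * λ / NA. With illustrative values for a 193 nm immersion scanner (assumed typical numbers, not from the talk), the printable feature size comes out well below the wavelength:

```python
def min_feature_nm(k1: float, wavelength_nm: float, num_aperture: float) -> float:
    """Rayleigh criterion for the smallest printable feature: CD = k1 * lambda / NA."""
    return k1 * wavelength_nm / num_aperture

# 193 nm ArF light, water-immersion NA of 1.35, aggressive k1 of 0.25.
cd = min_feature_nm(0.25, 193.0, 1.35)
print(f"~{cd:.0f} nm features from 193 nm light")  # → ~36 nm
```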


Without wanting to diminish the technological achievement behind that in any way: it is really not that uncommon to create features smaller than the tools used to make them. An obvious example is the detail on a sculpture, which can be much finer than the chisel used to create it. Maybe not a perfect analogy, because you only really use the edge and don't just throw the chisel in the rough direction of your marble block. Anyway.


A better analogy might be reading regular-sized braille writing by poking it with a broom handle.

Or writing an essay with the pencil tool in MS Paint while using a 50-pixel-radius brush.


Awesome presentation.

To me the most magical part is not the manufacturing process, although that is very special, but how a team of engineers designed something which is made up of hundreds of millions of transistors, each in a specific location for a specific reason. How is all of that done? How do they keep track of things? Is there special software?


There are a number of disciplines involved.

Engineers at the foundry range from those focused on keeping the equipment aligned to produce functional structures on/in silicon over time, to engineers who design standard libraries of components for a given 'process' (the steps for going from wafer to chip with a given set of parameters/attributes).

The structures in the silicon (or other semiconductor material) are mainly the transistors. These are created by embedding impurities at specific locations and in specific patterns in the silicon, which changes its conductive properties. Embedding these impurities is called doping.

The structures on top of the silicon are conductors and insulators used for interconnecting all the transistors.

There will be engineers at the foundry (or at a company that works with that foundry) who take a "netlist" from the customer and convert it into files a mask company can use to make masks - both for the layers of conductors and insulators that lie on top of the transistors, and for the patterns for doping the silicon.

Engineers at "tool vendors" will build software that takes a high level description of the desired functionality of the chip and turn it into the netlist that foundry engineers can use. The netlist is a decomposition of the high level description (such as x <= a * b) into the components in the "standard library" for that foundry. This library will contain basic components like AND gates and OR gates, but also more complicated things like IO pads and RAMs, and things like multipliers.
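A toy sketch of what a netlist is (mine, not the commenter's): named nets connected by instances of standard-library cells, here simulated directly in Python. The cell names AND2/XOR2 are hypothetical, made up for illustration:

```python
# Hypothetical standard-library cells (names invented for illustration).
CELLS = {"AND2": lambda a, b: a & b,
         "XOR2": lambda a, b: a ^ b}

# A half adder decomposed into library cells: sum = a XOR b, carry = a AND b.
# Each netlist entry is (output_net, cell_type, input_nets).
NETLIST = [("sum",   "XOR2", ("a", "b")),
           ("carry", "AND2", ("a", "b"))]

def evaluate(netlist, inputs):
    """Propagate values through the netlist (assumes topological order)."""
    nets = dict(inputs)
    for out, cell, ins in netlist:
        nets[out] = CELLS[cell](*(nets[n] for n in ins))
    return nets

print(evaluate(NETLIST, {"a": 1, "b": 1}))  # carry=1, sum=0
```

A real netlist also carries timing, drive strength, and physical placement information, but the gate-level decomposition is the core idea.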

Then you have design and verification engineers at a given company that wants to build a chip. These tend to stay at the higher level, working in VHDL, Verilog, and SystemVerilog, which are all pretty archaic by software-language standards. And what is considered high level in this arena is about the C-language level or below - so not so high from a software point of view.

I just spat this out off the top of my head, so apologies if I made mistakes.


I've watched this video several times (probably 5+ or maybe more) and it does not cease to amaze me.

Both the presentation style and the content itself is really engaging.


I know there is very high-tech development going on around the world at huge organizations, and it blows my mind how advanced it is.

But watching this made me feel like the computer I'm typing this on is some space age alien tech. It's a little mind blowing how inexpensive it is to purchase stupidly complex hardware.


I can't find a long video animation depicting the process. It was a bit old, but it was so great at displaying the process of coating a layer with masks and etching them with acid. If I remember correctly it was repeated 5 or 7 times.

After watching it, I concluded we are gods for manipulating matter at this level :)


Love this video and highly recommend. Love the idea also of the slides via a barcode.


thanks!

It's something I'm a big fan of. A talk like this is inherently a ton of information in a short time, and I try to be inclusive and accessible by being conscious of how others process information.



