
Friday, August 24, 2018

Calibration errors in fab equipment cost chipmakers millions of dollars

By Nick Flaherty www.flaherty.co.uk

A study from the US National Institute of Standards and Technology (NIST) has uncovered a source of error in an industry-standard calibration method that could lead chipmakers to lose a million dollars or more in a single wafer run.

The error occurs when measuring very small flows of exotic gas mixtures during chemical vapour deposition (CVD) and plasma etching. The exact amount of gas injected into the chamber is critically important to these processes and is regulated by a mass flow controller (MFC).

"Flow inaccuracies cause nonuniformities in critical features in wafers, directly causing yield reduction," said Mohamed Saleem, Chief Technology Officer at Brooks Instrument, a US MFC maker. "Factoring in the cost of running cleanrooms, the loss on a batch of wafers scrapped due to flow irregularities can run around $500,000 to $1,000,000. Add to that cost the process tool downtime required for troubleshooting, and it becomes prohibitively expensive."

Modern fabs rely on accurate gas flows controlled by MFCs, which are typically calibrated using the "rate of rise" (RoR) method. This makes a series of pressure and temperature measurements over time as gas fills a collection tank of known volume through the MFC, and infers the flow rate from how quickly the pressure rises.
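
For illustration, the idealised RoR arithmetic can be sketched in a few lines of Python. This is a hypothetical simplification, not NIST's or any instrument maker's procedure: it assumes ideal-gas behaviour and a uniform, well-mixed gas temperature in the tank, and the tank volume and sample readings are invented for the example.

# Idealised rate-of-rise (RoR) flow estimate: a minimal sketch only.
# Assumes ideal-gas behaviour and a uniform gas temperature in the
# collection tank -- simplifications a real calibration must correct for.

R = 8.314462618  # universal gas constant, J/(mol*K)

def ror_molar_flow(times_s, pressures_pa, temps_k, tank_volume_m3):
    """Estimate molar flow (mol/s) from pressure/temperature samples
    taken while gas fills a tank of known volume."""
    # Moles in the tank at each sample, from the ideal gas law n = PV/RT.
    moles = [p * tank_volume_m3 / (R * t)
             for p, t in zip(pressures_pa, temps_k)]
    # Least-squares slope of moles versus time gives the average fill rate.
    n = len(times_s)
    t_mean = sum(times_s) / n
    m_mean = sum(moles) / n
    num = sum((t - t_mean) * (m - m_mean) for t, m in zip(times_s, moles))
    den = sum((t - t_mean) ** 2 for t in times_s)
    return num / den  # mol/s

# Hypothetical example: 1-litre tank, pressure rising ~100 Pa/s at room
# temperature.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
pressures = [100000.0, 100100.0, 100200.0, 100300.0, 100400.0]
temps = [293.15] * 5
flow_mol_s = ror_molar_flow(times, pressures, temps, 0.001)
sccm = flow_mol_s * 22414.0 * 60.0  # mol/s -> standard cm^3/min, approx.
print(f"{flow_mol_s:.3e} mol/s (~{sccm:.1f} sccm)")

The temperature term in the sketch hints at where errors can creep in: if the gas in the tank is not actually at the temperature the calculation assumes, every derived flow value shifts with it.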

"Concerns about the accuracy of that technique came to our attention recently when a major manufacturer of chip-fabrication equipment found that they were getting inconsistent results for flow rate from their instruments when they were calibrated on different RoR systems," said John Wright of NIST's Fluid Metrology Group which conducted the error analysis.
