Failure Analysis Laboratory Management Principles and Practices

Author(s):  
Richard J. Ross ◽  
Zhiyong Wang
2011 ◽  
Vol 51 (9-11) ◽  
pp. 1658-1661 ◽  
Author(s):  
R. Llido ◽  
J. Gomez ◽  
V. Goubier ◽  
N. Froidevaux ◽  
L. Dufayard ◽  
...  

Author(s):  
A. Firiti ◽  
D. Lewis ◽  
F. Beaudoin ◽  
P. Perdu ◽  
G. Haller ◽  
...  

Author(s):  
Jason M Benz ◽  
Kevin A Distelhurst ◽  
Douglas B Hunt ◽  
Rick Kontra

Abstract Many articles and books have been written that discuss and study the techniques of lean thinking and methodologies. These methodologies have been applied in industries such as manufacturing, health care, and information technology. Application to analytical laboratories has been rare or non-existent, due both to the difficulty of applying lean methodologies to a process with 'unique' analytical work flows and to the lack of a direct connection to the manufacturing value stream. The following paper describes the work done in a semiconductor failure analysis laboratory to visualize work flow, design a forecasting model, and create a management system. The result has been sustained improvement in quality, resource utilization, and delivery of actionable root cause failure analysis.


1999 ◽  
Vol 5 (S2) ◽  
pp. 1346-1347
Author(s):  
Bruce L. Wong, P.E.

Metallurgical failure analysis is generally straightforward in many cases. However, there are times when it is difficult to discern the fracture characteristics due to surface degradation, improper handling, or subsequent mechanical damage from improper planning. Dr. Alan Tettleman, a founder of Failure Analysis and Associates in San Jose, CA, once told me that you cannot properly conduct any product failure analysis without metallography, and especially without scanning electron microscopy. In many ways, I agree with him. There are obvious times, however, when various physical properties and deformations are apparent; those are generally the cases in which a consultant is not required. As in your field, there are many references that assist in the use and implementation of the scanning electron microscope and the energy dispersive x-ray analysis system. Many of these references introduce the procedure and some of the cautionary steps that should be taken in preparing samples and in interpreting the fracture.


2003 ◽  
Vol 9 (S02) ◽  
pp. 774-775
Author(s):  
K.W. Lee ◽  
H.J. Park ◽  
Y.C. Wang ◽  
B.K. Park ◽  
Sean Da ◽  
...  

Author(s):  
William E. Vanderlinde ◽  
David A. Stoney

Abstract Optical microscopy techniques used by forensic analysts are shown to have application to failure analysis problems. Proper setup of the optical microscope is reviewed, including the correct use of the field diaphragm and the aperture diaphragm. Polarized light microscopy, bright- and dark-field methods, refractive index liquids, and a particle reference atlas are used to identify contamination found on semiconductor products.


Author(s):  
John R. Devaney

Occasionally in history, an event occurs which has a profound influence on a technology. Such an event occurred when the scanning electron microscope became commercially available to industry in the mid-1960s. Semiconductors were being increasingly used in high-reliability space and military applications, both because of their small volume and because of their inherent reliability. However, they did fail, both early in life and sometimes in middle or old age. Why they failed, and how to prevent failure or prolong "useful life," was a worry that resulted in a blossoming of sophisticated failure analysis laboratories across the country. By 1966, the ability to build small-structure integrated circuits was forging well ahead of the techniques available to dissect and analyze these same failures. The arrival of the scanning electron microscope gave these analysts a new insight into failure mechanisms.


Author(s):  
Evelyn R. Ackerman ◽  
Gary D. Burnett

Advancements in state-of-the-art high-density head/disk retrieval systems have increased the demand for sophisticated failure analysis methods. From 1968 to 1974 the emphasis was on the number of tracks per inch (TPI), ranging from 100 to 400, as summarized in Table 1. With increasing densities, this emphasis shifted to include the number of bits per inch (BPI). A bit is formed by magnetizing the Fe2O3 particles of the media in one direction and allowing magnetic heads to recognize specific data patterns. From 1977 to 1986 the tracks per inch increased from 470 to 1400, corresponding to an increase from 6,300 to 10,800 bits per inch, respectively. Due to the reduction in bit and track sizes, the build and operating environments of systems have become critical factors in media reliability.

Using the Ferrofluid pattern-developing technique, the scanning electron microscope can be a valuable diagnostic tool in the examination of failure sites on disks.
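As a rough back-of-the-envelope check on the density figures quoted above, areal density can be approximated as the product of TPI and BPI (a simplification that ignores formatting overhead; the function name is illustrative, not from the paper):

```python
def areal_density(tpi, bpi):
    """Approximate areal density in bits per square inch as TPI * BPI."""
    return tpi * bpi

# Figures quoted in the abstract for 1977 and 1986:
d_1977 = areal_density(470, 6300)    # 2,961,000 bits per square inch
d_1986 = areal_density(1400, 10800)  # 15,120,000 bits per square inch

growth = d_1986 / d_1977             # roughly a 5x increase over nine years
```

This simple product makes the point of the abstract concrete: a roughly 3x gain in track density combined with a less-than-2x gain in linear bit density still multiplies to about a fivefold increase in areal density, which is why bit and track shrinkage made environmental factors so much more critical.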

