Diagnostic resolution improvement through learning-guided physical failure analysis

Author(s):  
Carlston Lim ◽  
Yang Xue ◽  
Xin Li ◽  
Ronald D. Blanton ◽  
M. Enamul Amyeen


2021 ◽
Author(s):
Kun Young Chung ◽  
Shaun Nicholson ◽  
Soumya Mittal ◽  
Martin Parley ◽  
Gaurav Veda ◽  
...  

Abstract In this paper, we present a diagnosis resolution improvement methodology for scan-based tests. We achieve an 89% reduction in the number of suspect diagnosis locations and a 2.4X increase in the number of highly resolved diagnosis results, at the cost of only a 1.5% loss in accuracy. These results were obtained from an extensive silicon study using data from pilot wafers and 11 other wafers at a leading-edge technology node, checked against failure analysis results from 203 cases. The resolution improvement is achieved by considering the diagnosis problem at the level of a population of failing die (e.g., a wafer) instead of analyzing each failing die independently, as has traditionally been done. Higher diagnosis resolution is critical for accelerating yield learning from manufacturing test and failure analysis flows.
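
As a rough illustration of the population-level idea, the Python sketch below uses a hypothetical Suspect record and a made-up population_prune helper (the paper's actual learning-guided algorithm is not shown here): for each failing die, it keeps only the suspects whose implicated layout feature has the strongest support across the wafer.

```python
from collections import Counter, namedtuple

# Hypothetical suspect record: a candidate defect location plus the layout
# feature (e.g., net segment or pattern class) it implicates.
Suspect = namedtuple("Suspect", ["location", "feature"])

def population_prune(per_die_suspects):
    """Keep, per die, only the suspects whose feature has the strongest
    support (number of implicating failing die) across the wafer."""
    # Count, for each feature, how many failing die implicate it at least once.
    support = Counter(
        feature
        for suspects in per_die_suspects.values()
        for feature in {s.feature for s in suspects}
    )
    pruned = {}
    for die, suspects in per_die_suspects.items():
        best = max(support[s.feature] for s in suspects)
        pruned[die] = [s for s in suspects if support[s.feature] == best]
    return pruned

# Example: "M2_via_pattern" recurs on all three failing die, so each die's
# suspect list collapses toward it, improving per-die resolution.
wafer = {
    "die_01": [Suspect("net_a", "M2_via_pattern"), Suspect("net_b", "isolated")],
    "die_02": [Suspect("net_c", "M2_via_pattern")],
    "die_03": [Suspect("net_d", "M2_via_pattern"), Suspect("net_e", "isolated")],
}
print(population_prune(wafer))
```

The point of the sketch is only the shift in granularity: a suspect that looks equally plausible within one die can be down-ranked once the whole wafer's failing population is taken into account.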


2018 ◽  
Author(s):  
Rommel Estores ◽  
Eric Barbian

Abstract ATPG diagnosis is an essential part of failure analysis and has proven to be an effective technique for isolating faults in the digital core. In many single-failure cases, however, ATPG diagnosis can yield incorrect candidates or include a large amount of equivalency, which limits diagnostic resolution. While iterative ATPG diagnosis improves diagnostic resolution, there are many cases where the resolution is still insufficient. This paper discusses a methodology that helps the analyst understand and complement ATPG diagnosis using an approach called “single shot logic patterns”. New patterns, each targeting a single fault in the area of interest, provide the failure analyst with simplified analytical data. This process is repeated for each suspect candidate, and the number of times each target fault is detected is increased for better resolution. Aggregating this analytical data with the layout and the fan-out of the net instances can provide greater resolution into the likely defective area. Furthermore, constraints can be added to further simplify the test and/or control the fan-out of failures. Only equivalencies with observable fan-out can achieve greater diagnostic resolution, and ATPG tools have been observed to not always maximize this fan-out.
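
The per-suspect tallying step lends itself to a simple score. The Python sketch below (a hypothetical data model and a made-up rank_suspects helper, not the authors' tooling) scores each suspect by the fraction of the single-shot patterns targeting it that actually fail on the tester.

```python
from collections import Counter

def rank_suspects(single_shot_results):
    """Rank suspects from single-shot logic pattern runs: each pattern
    targets exactly one suspect fault, so a suspect that fails every
    pattern targeting it is the most likely defect site."""
    detections = Counter()
    applied = Counter()
    for suspect, failed in single_shot_results:   # one record per pattern run
        applied[suspect] += 1
        if failed:                                # tester observed a mismatch
            detections[suspect] += 1
    # Rank by detection ratio: repeated targeting raises confidence.
    return sorted(
        ((s, detections[s] / applied[s]) for s in applied),
        key=lambda t: t[1],
        reverse=True,
    )

# Example: suspect "net_u12_y" fails all three patterns targeting it, while
# the equivalent candidate "net_u12_a" fails only one of three.
runs = [("net_u12_y", True), ("net_u12_y", True), ("net_u12_y", True),
        ("net_u12_a", True), ("net_u12_a", False), ("net_u12_a", False)]
print(rank_suspects(runs))
```

In practice the ranking would be combined with layout and fan-out information, as the abstract describes, before committing to a physical failure analysis site.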


Author(s):  
John R. Devaney

Occasionally in history, an event may occur which has a profound influence on a technology. Such an event occurred when the scanning electron microscope became commercially available to industry in the mid-1960s. Semiconductors were being used increasingly in high-reliability space and military applications, not only because of their small volume but also because of their inherent reliability. However, they did fail, both early in life and sometimes in middle or old age. Why they failed, and how to prevent failure or prolong “useful life”, was a worry that resulted in a blossoming of sophisticated failure analysis laboratories across the country. By 1966, the ability to build small-structure integrated circuits was forging well ahead of the techniques available to dissect and analyze those same circuits when they failed. The arrival of the scanning electron microscope gave these analysts new insight into failure mechanisms.


Author(s):  
Evelyn R. Ackerman ◽  
Gary D. Burnett

Advancements in state-of-the-art, high-density head/disk retrieval systems have increased the demand for sophisticated failure analysis methods. From 1968 to 1974 the emphasis was on the number of tracks per inch (TPI), ranging from 100 to 400, as summarized in Table 1. With the increase in densities, this emphasis shifted to include the number of bits per inch (BPI). A bit is formed by magnetizing the Fe2O3 particles of the media in one direction and allowing magnetic heads to recognize specific data patterns. From 1977 to 1986 the tracks per inch increased from 470 to 1400, corresponding to an increase from 6300 to 10,800 bits per inch. Due to the reduction in bit and track sizes, the build and operating environments of systems have become critical factors in media reliability. Using the Ferrofluid pattern-developing technique, the scanning electron microscope can be a valuable diagnostic tool in examining failure sites on disks.
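
For a sense of scale, areal density is the product of track density and linear density (a standard relation, not stated in the abstract itself); a quick back-of-the-envelope computation from the quoted figures:

```python
# Areal density (bits per square inch) = TPI * BPI; figures from the abstract.
eras = {
    "1977": (470, 6_300),      # (tracks per inch, bits per inch)
    "1986": (1_400, 10_800),
}
density = {year: tpi * bpi for year, (tpi, bpi) in eras.items()}
for year, bits_per_sq_in in density.items():
    print(f"{year}: {bits_per_sq_in / 1e6:.1f} Mbit per square inch")
# Roughly a 5x areal-density increase over the decade, which is why bit and
# track geometries shrank and reliability margins tightened.
print(f"growth: {density['1986'] / density['1977']:.1f}x")
```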


2010 ◽  
Vol 20 (2) ◽  
pp. 37-46
Author(s):  
Nicole M. Etter

Traditionally, speech-language pathologists (SLPs) have been trained to develop interventions based on a select number of perceptual characteristics of speech, with minimal or no use of objective instrumental and physiologic assessment measures of the underlying articulatory subsystems. While indirect physiological assumptions can be made from perceptual assessment measures, the validity and reliability of those assumptions are tenuous at best. Considering that neurological damage results in varying degrees of aberrant speech physiology, the need for physiologic assessment appears highly warranted. In this context, do existing physiological measures found in the research literature have sufficient diagnostic resolution to provide distinct and differential data within and between etiological classifications of speech disorders, and versus healthy controls? The goals of this paper are (a) to describe various physiological and movement-related techniques available to objectively study various dysarthrias and speech production disorders, and (b) to develop an appreciation of the need for increased systematic research to better define the physiologic features of dysarthria and speech production disorders and their relation to known perceptual characteristics.

