Data analytics and machine learning: root-cause problem-solving approach to prevent yield loss and quality issues in semiconductor industry for automotive applications

Author(s):  
Corinne Berges ◽  
Jim Bird ◽  
Mehul D. Shroff ◽  
Rene Rongen ◽  
Chris Smith


Author(s):  
Satish Kodali ◽  
Chen Zhe ◽  
Chong Khiam Oh

Abstract Nanoprobing is one of the key characterization techniques for soft defect localization in SRAM, and DC transistor performance metrics can be used to identify the root cause of a fail mode. This paper presents a case report in which nanoprobing was applied to a wafer impacted by significant SRAM yield loss after standard FIB cross-sections on hard fail sites and top-down delayered inspection did not reveal any obvious defects. The authors performed nanoprobing DC characterization measurements followed by capacitance-voltage (CV) measurements. A two-probe CV measurement was then performed between the gate and drain of the device with the source and bulk floating. The authors identified a process marginality at the gate to lightly doped drain (LDD) overlap region. Physical characterization on an inline split wafer identified residual deposits on the bit-line (BL) contacts that potentially blocked the implant. An enhanced clean for resist removal was implemented as a fix for the fail mode.
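As a rough illustration of the CV comparison described above, the following Python sketch contrasts gate-to-drain CV sweeps from a failing device and a known-good reference to flag a reduced gate-to-LDD overlap capacitance. The file names, the accumulation-bias cutoff, and the 20% marginality threshold are hypothetical assumptions, not values from the paper.

```python
# Illustrative sketch only: compares two-probe gate-to-drain CV sweeps from a
# failing device against a known-good reference to flag reduced gate-to-LDD
# overlap capacitance. File names and thresholds are hypothetical.
import numpy as np

def load_cv(path):
    """Load a CV sweep as (bias_V, capacitance_F) columns from a CSV file."""
    data = np.loadtxt(path, delimiter=",", skiprows=1)
    return data[:, 0], data[:, 1]

def overlap_capacitance(bias, cap, accumulation_bias=-1.0):
    """Estimate overlap capacitance from the accumulation side of the sweep,
    where the measured C is assumed dominated by the gate/LDD overlap."""
    mask = bias <= accumulation_bias
    return cap[mask].mean()

v_ref, c_ref = load_cv("reference_device_cv.csv")   # hypothetical file
v_fail, c_fail = load_cv("failing_device_cv.csv")   # hypothetical file

c_ov_ref = overlap_capacitance(v_ref, c_ref)
c_ov_fail = overlap_capacitance(v_fail, c_fail)
ratio = c_ov_fail / c_ov_ref

print(f"overlap C (ref):  {c_ov_ref:.3e} F")
print(f"overlap C (fail): {c_ov_fail:.3e} F")
if ratio < 0.8:  # hypothetical 20% marginality threshold
    print("Possible gate-to-LDD overlap marginality (blocked implant?)")
```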


Author(s):  
J. N. C. de Luna ◽  
M. O. del Fierro ◽  
J. L. Muñoz

Abstract An advanced flash bootblock device was exceeding leakage-current specifications on certain pins. Physical analysis showed pinholes in the gate oxide of the n-channel transistor in the input buffer circuit of the affected pins. The fallout contributed ~1% to factory yield loss and was suspected to be caused by electrostatic discharge (ESD) somewhere in the assembly and test process. Root cause investigation narrowed the source down to a charged core picker inside the automated test equipment handlers. Using an electromagnetic interference (EMI) locator, we were able to observe in real time the high-amplitude electromagnetic pulse created by this ESD event. Installing air ionizers inside the testers solved the problem.
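A minimal sketch of the kind of transient detection an EMI locator performs: threshold a sampled field-strength trace and report candidate pulse events. The sample rate, threshold, and synthetic data here are assumptions for illustration, not details from the paper.

```python
# Illustrative sketch: flag high-amplitude transients (e.g., ESD events) in a
# sampled EMI field-strength trace. Sample rate and threshold are hypothetical.
import numpy as np

def detect_pulses(trace, sample_rate_hz, threshold):
    """Return (time_s, amplitude) for samples whose absolute amplitude
    exceeds the threshold, marking candidate ESD pulse events."""
    idx = np.flatnonzero(np.abs(trace) > threshold)
    return idx / sample_rate_hz, trace[idx]

# Synthetic trace: background noise plus one injected ESD-like spike.
rng = np.random.default_rng(0)
fs = 1e6                                  # 1 MS/s, hypothetical
trace = rng.normal(0.0, 0.05, 100_000)    # background EMI noise
trace[42_000] += 5.0                      # injected high-amplitude pulse

times, amps = detect_pulses(trace, fs, threshold=1.0)
for t, a in zip(times, amps):
    print(f"pulse at t={t * 1e3:.3f} ms, amplitude={a:.2f} (arb. units)")
```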


Author(s):  
Sadaf Qazi ◽  
Muhammad Usman

Background: Immunization is a significant public health intervention to reduce child mortality and morbidity. However, its coverage, despite free accessibility, is still very low in developing countries. One of the primary reasons for this low coverage is the lack of analysis and proper utilization of immunization data at various healthcare facilities. Purpose: In this paper, existing machine learning based data analytics techniques are reviewed critically to highlight the gaps where this high-potential data could be exploited in a meaningful manner. Results: Our review reveals that the existing approaches apply data analytics techniques without considering the full complexity of the Expanded Program on Immunization, which includes the maintenance of cold chain systems, proper distribution of vaccines, and the quality of data captured at various healthcare facilities. Moreover, in developing countries there is no centralized data repository where all immunization-related data are gathered to perform analytics at various levels of granularity. Conclusion: We believe that the existing non-centralized immunization data, combined with the right set of machine learning and artificial intelligence based techniques, will not only improve vaccination coverage but will also help in predicting future trends and patterns of coverage at different geographical locations.
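As a hedged illustration of the per-region coverage forecasting the conclusion envisions, the sketch below fits a simple linear trend to historical coverage rates for each region. The region names, coverage values, and choice of a linear model are assumptions for illustration, not data or methods from the review.

```python
# Illustrative sketch: per-region immunization-coverage trend forecasting.
# Region names, coverage values, and the linear model are assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical annual coverage rates (%) per region.
history = {
    "Region A": [(2015, 61), (2016, 63), (2017, 66), (2018, 70), (2019, 72)],
    "Region B": [(2015, 48), (2016, 47), (2017, 51), (2018, 50), (2019, 53)],
}

for region, records in history.items():
    years = np.array([[y] for y, _ in records], dtype=float)
    coverage = np.array([c for _, c in records], dtype=float)
    model = LinearRegression().fit(years, coverage)
    forecast = model.predict([[2020.0], [2021.0]])
    print(f"{region}: 2020 -> {forecast[0]:.1f}%, 2021 -> {forecast[1]:.1f}%")
```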


Author(s):  
William B. Rouse

This book discusses the use of models and interactive visualizations to explore designs of systems and policies in determining whether such designs would be effective. Executives and senior managers are very interested in what “data analytics” can do for them and, quite recently, what the prospects are for artificial intelligence and machine learning. They want to understand and then invest wisely. They are reasonably skeptical, having experienced overselling and under-delivery. They ask about reasonable and realistic expectations. Their concern is with the futurity of decisions they are currently entertaining. They cannot fully address this concern empirically. Thus, they need some way to make predictions. The problem is that one rarely can predict exactly what will happen, only what might happen. To overcome this limitation, executives can be provided predictions of possible futures and the conditions under which each scenario is likely to emerge. Models can help them to understand these possible futures. Most executives find such candor refreshing, perhaps even liberating. Their job becomes one of imagining and designing a portfolio of possible futures, assisted by interactive computational models. Understanding and managing uncertainty is central to their job. Indeed, doing this better than competitors is a hallmark of success. This book is intended to help them understand what fundamentally needs to be done, why it needs to be done, and how to do it. The hope is that readers will discuss this book and develop a “shared mental model” of computational modeling in the process, which will greatly enhance their chances of success.
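A minimal sketch of the "possible futures" idea discussed above: rather than a single point forecast, a Monte Carlo model simulates many scenario outcomes and reports their spread, so decision-makers see what might happen rather than a false certainty about what will. The growth and volatility parameters below are hypothetical, not from the book.

```python
# Minimal Monte Carlo sketch of "possible futures": instead of one point
# forecast, simulate many scenario outcomes and summarize their spread.
# Growth and volatility parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n_scenarios, horizon_years = 10_000, 5
growth_mean, growth_sd = 0.04, 0.10      # assumed annual growth distribution

value = np.full(n_scenarios, 100.0)      # starting value, arbitrary units
for _ in range(horizon_years):
    value *= 1.0 + rng.normal(growth_mean, growth_sd, n_scenarios)

lo, median, hi = np.percentile(value, [10, 50, 90])
print(f"after {horizon_years} years: 10th pct {lo:.0f}, "
      f"median {median:.0f}, 90th pct {hi:.0f}")
print(f"share of futures below the starting value: "
      f"{(value < 100.0).mean():.1%}")
```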


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 831
Author(s):  
Vaneet Aggarwal

Due to the proliferation of applications and services that run over communication networks, ranging from video streaming and data analytics to robotics and augmented reality, tomorrow’s networks will be faced with increasing challenges resulting from the explosive growth of data traffic demand with significantly varying performance requirements [...]


Author(s):  
G. Arunakranthi ◽  
B. Rajkumar ◽  
V. Chandra Shekhar Rao ◽  
A. Harshavardhan
