An integrated CAD system for algorithm-specific IC design

Author(s):  
C.B. Shung ◽  
R. Jain ◽  
K. Rimey ◽  
E. Wang ◽  
M.B. Srivastava ◽  
...  
1987 ◽  
Vol 70 (3) ◽  
pp. 106-114
Keyword(s):  
Author(s):  
Hideki Koide ◽  
Chol Kim ◽  
Katsufusa Shono
Keyword(s):  

Author(s):  
V. A. Martynyuk ◽  
V. A. Trudonoshin ◽  
V. G. Fedoruk

The article considers the use of foreign CAD systems for demanding projects at domestic enterprises and design bureaus. As the article states, "... presently, there is no domestic CAD-system that could completely replace such foreign products as NX, CATIA, Credo". Moreover, international cooperation on major projects (for example, the modern wide-body aircraft proposed jointly with China) makes it sensible to use well-known, widely adopted CAD systems (the aforementioned NX, CATIA, Credo). Therefore, for the foreseeable future, foreign software products will remain in use.

Of course, the reliability of the results obtained always remains in question, regardless of whether the software product is domestic or foreign; this question has haunted both developers and users of CAD systems for the last 30 to 40 years. With domestic systems, however, it is much easier to identify the cause of inaccurate results and to correct the mathematical models used, the numerical integration methods applied, and the solution of systems of nonlinear algebraic equations. Everything is far more complicated with a foreign software product. Marketing claims that a tool exists for reporting detected errors to the developers remain, in the real world, only claims; domestic users, and especially domestic developers of similar software products, understand this well. The current pace of development and the competition for potential buyers dictate rigid deadlines for releasing each new version of the product and introducing the latest developments into the commercial product. As a result, known errors migrate from version to version, and many users accepted this long ago. This applies especially to the less popular tools rather than to the most popular applications (modules) of a CAD system.

For example, the "Modeling" module, where geometric models of designed parts and assembly units are created, has been cross-checked repeatedly. Most errors instead hide in the applications for designing parts from sheet material and for pipeline design, as well as in the applications for analyzing moving mechanisms and for strength or gas-dynamic analysis by the finite element method. The article gives a concrete example of a moving mechanism in whose analysis an error was detected in the mathematical model of an external influence (a velocity source) in the NX 10.0 system of Siemens.
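To make the "velocity source" idea concrete: in mechanism analysis, such a source is a driver that prescribes a joint's velocity directly, and the solver recovers the joint coordinate by numerical integration. The following minimal sketch (not the NX model; the step profile and explicit Euler scheme are illustrative assumptions) shows how a prescribed angular-velocity source is integrated into a joint angle, which is exactly the kind of calculation where an error in the source's mathematical model would corrupt the simulated motion.

```python
def velocity_source(t):
    """Prescribed angular velocity in rad/s (hypothetical step profile):
    spin at 2 rad/s for the first second, then hold still."""
    return 2.0 if t < 1.0 else 0.0

def integrate_angle(t_end, dt=1e-3):
    """Recover the joint angle from the velocity source by integrating
    d(theta)/dt = w(t) with explicit Euler steps of size dt."""
    theta, t = 0.0, 0.0
    while t < t_end - 1e-12:
        theta += velocity_source(t) * dt  # angle accumulates prescribed velocity
        t += dt
    return theta
```

With this profile the angle should reach about 2 rad after one second and then stay there, since the source output drops to zero; a defective velocity-source model would show up as a systematic deviation from such an expected trajectory.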


Author(s):  
H.H. Yap ◽  
P.K. Tan ◽  
G.R. Low ◽  
M.K. Dawood ◽  
H. Feng ◽  
...  

Abstract With the technology scaling of semiconductor devices and the further growth of integrated circuit (IC) design and functional complexity, it is necessary to increase the number of transistors, layer stacks, and process steps in an IC chip. The last few metal layers of the Back End Of Line (BEOL) are usually very thick metal lines (>4 μm thickness), protected by hard silicon dioxide (SiO2) formed from TetraEthyl OrthoSilicate (TEOS) as the Inter-Metal Dielectric (IMD). In order to perform physical failure analysis (PFA) on the logic or memory, the top thick metal layers must be removed, and deprocessing these thick metal and IMD layers with conventional PFA workflows is time-consuming. In this paper, the Fast Laser Deprocessing Technique (FLDT) is proposed to remove the thick and stubborn BEOL metal layers for memory PFA. The proposed FLDT is a cost-effective and quick way to deprocess a sample for defect identification in PFA.


Author(s):  
Steve Ferrier ◽  
Kevin D. Martin ◽  
Donald Schulte

Abstract Application of a formal failure analysis metaprocess to a stubborn yield-loss problem provided a framework that ultimately facilitated a solution. The absence of results from conventional failure analysis techniques such as PEM (Photon Emission Microscopy) and liquid crystal microthermography frustrated early attempts to analyze this low-level supply-leakage failure mode. Subsequently, a reorganized analysis team attacked the problem using a specific top-level metaprocess.(1,a) Using the metaprocess, analysts generated a specific, unique step-by-step analysis process in real time. Along the way, this approach encouraged the creative identification of secondary failure effects that provided repeated breakthroughs in the analysis flow. Analysis proceeded steadily toward the failure cause in spite of its character as a three-way interaction among factors in the IC design, mask generation, and wafer manufacturing processes. The metaprocess also provided the formal structure that, at the conclusion of the analysis, permitted a one-sheet summary of the failure's cause-effect relationships and of the analysis flow leading to discovery of the anomaly. As with every application of this metaprocess, the resulting analysis flow simply represented an effective version of good failure analysis. The formal and flexible codification of the analysis decision-making process, however, provided several specific benefits, not least the ability to proceed with high confidence that the problem could and would be solved. This paper describes the application of the metaprocess, along with the key measurements and cause-effect relationships in the analysis.


1998 ◽  
Author(s):  
S. M. Kang ◽  
E. Rosenbaum ◽  
Y. K. Cheng ◽  
L. P. Yuan ◽  
T. Li
