Yield/cost modeling for electronics wafer fabrication and evaluation of the impact of minimum acceptable die yield criteria on statistical wafer and die yields and costs

Author(s):  
C.A. D'Cruz
Author(s):  
Amy Lujan

In recent years, the possibility of panels replacing wafers in some fan-out applications has been a topic of interest. Questions of cost and yield continue to arise even as the industry moves full steam ahead. While large panels allow more packages to be produced at once, the cost does not scale simply with how many more packages a panel can yield over a wafer. This analysis begins by breaking down the types of cost and discusses how each type is affected (or not) by the shift from wafer to panel. Activity-based cost modeling is used; this is a detailed, bottom-up approach that accounts for each type of cost in each activity of a process flow. Two complete cost models were constructed for this analysis. A variety of package sizes are analyzed, and multiple panel sizes are included as well. For each set of activities in the fan-out process flow, there is an explanation of how the process changes with the move to panel, including assumptions related to throughput, equipment price, and materials. The cost reduction that may be achieved at each package and panel size is presented for each processing segment. The focus of this analysis is on the details of each segment of the process flow, but results for the total cost of various packages are also presented. A final section analyzes the impact of yield on the competitiveness of panel processing.
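The panel-versus-wafer scaling argument can be sketched with a toy activity-based model: per-run (fixed) costs do not scale with substrate area, while material costs roughly do, so cost per package falls with panel size, but by less than the raw package-count ratio would suggest. All numbers below are hypothetical and are not taken from the paper's models.

```python
import math

def dies_per_substrate(area_mm2, die_mm, edge_loss=0.05):
    """Rough die count: usable area divided by die area, minus an assumed
    edge-loss fraction. Illustrative only; real layouts use placement tools."""
    return math.floor(area_mm2 * (1 - edge_loss) / (die_mm * die_mm))

def cost_per_package(fixed_cost_per_run, variable_cost_per_mm2, area_mm2, die_mm):
    """Fixed cost is paid once per substrate run; material cost scales with area."""
    n = dies_per_substrate(area_mm2, die_mm)
    total = fixed_cost_per_run + variable_cost_per_mm2 * area_mm2
    return total / n

wafer_area = math.pi * (300 / 2) ** 2   # 300 mm wafer, in mm^2
panel_area = 510 * 515                  # 510 x 515 mm panel, in mm^2

# Hypothetical cost drivers: $400 fixed per run, $0.002/mm^2 materials, 10 mm package
w = cost_per_package(400.0, 0.002, wafer_area, 10.0)
p = cost_per_package(400.0, 0.002, panel_area, 10.0)
print(f"wafer: ${w:.4f}/pkg  panel: ${p:.4f}/pkg")
```

The panel wins because the fixed per-run cost is amortized over more packages, yet the per-package saving is smaller than the ~3.7x area ratio alone would imply.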


2012, Vol 2012 (1), pp. 000012-000017
Author(s):
Chet Palesko, Alan Palesko

Demands on the electronics industry for smaller, better, and cheaper packages have made the supply chain more complex. Outsourcing, new technologies, and increasing performance requirements make designing and building the right product for the right price more difficult than ever. We present a framework for understanding and managing the supply chain through cost modeling. Cost models that accurately reflect the cost impact of technology and design decisions enable a more precise understanding of supply chain behavior. Cost models can show the extra cost of adding a layer, the expected savings from relaxing design rules, or the cost of package-on-package assembly compared to 3D packaging with through-silicon vias (TSVs). The models also provide context for understanding the "should cost" of a product and the path to achieving it. Because the guidance from cost models is based on actual supplier cost drivers and pricing behavior, designers' cost reduction efforts yield higher savings than they would without the models. Without cost models, designers risk missing their suppliers' real cost drivers and, therefore, the opportunity to decrease cost. This cost modeling framework allows designers to realize the lowest cost product by matching the right design with the right supplier. It is a method for understanding a design decision's cost impact: a design change, a supplier change, or even the introduction of a new technology.
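As a minimal illustration of the idea (not the authors' actual models), a should-cost can be sketched as supplier-specific cost drivers applied to a design. The hypothetical setup and per-layer charges below show how the extra cost of a layer falls directly out of a driver, and how the cheapest supplier can flip with volume:

```python
# Hypothetical supplier cost drivers: a one-time setup charge plus a per-layer charge.
suppliers = {
    "A": {"setup": 1200.0, "per_layer": 90.0},
    "B": {"setup": 800.0,  "per_layer": 130.0},
}

def should_cost(supplier, layers, units):
    """Per-unit should-cost: setup amortized over the order, layers priced per unit."""
    d = suppliers[supplier]
    return (d["setup"] + d["per_layer"] * layers * units) / units

# The marginal cost of adding a layer is just that supplier's per-layer driver,
# while the best supplier choice depends on volume (setup amortization).
for s in suppliers:
    print(s, round(should_cost(s, layers=6, units=100), 2))
```

At 100 units supplier A's low per-layer rate wins; at a single prototype unit supplier B's lower setup charge wins, which is exactly the supplier-matching behavior the framework is meant to expose.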


2011, Vol 473, pp. 452-459
Author(s):
M.S. Aydin, Aykut Canpolat, Jörg Gerlach, Lutz Kessler, A. Erman Tekkaya

Recently, an alternative inverse-analysis approach was proposed to obtain the material parameters of advanced yield criteria from tensile and cup drawing tests [1]. In this paper, the applicability of this strategy is investigated for a mild steel grade by means of cruciform, plane-strain tension, and hydraulic bulge tests. In addition, the impact of strain rate on the hydraulic bulge tests is another aspect of this work.
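For context, the classical (non-inverse) identification of a yield criterion from tensile data can be sketched for Hill's 1948 criterion, whose parameters follow in closed form from the Lankford coefficients r0, r45, r90; the inverse-analysis strategy discussed here replaces this with fitting to full test responses. The mild-steel-like r-values below are illustrative, not measured data from the paper.

```python
def hill48_from_r(r0, r45, r90):
    """Hill (1948) anisotropy parameters from Lankford coefficients,
    normalized so that G + H = 1 (uniaxial yield stress along the rolling
    direction as reference). This is the classical identification."""
    F = r0 / (r90 * (1.0 + r0))
    G = 1.0 / (1.0 + r0)
    H = r0 / (1.0 + r0)
    N = (r0 + r45) and (r0 + r90) * (1.0 + 2.0 * r45) / (2.0 * r90 * (1.0 + r0))
    return F, G, H, N

# Isotropy check: r = 1 everywhere recovers von Mises (F = G = H = 0.5, N = 1.5)
print(hill48_from_r(1.0, 1.0, 1.0))
# Mild-steel-like values (illustrative): r0 = 1.8, r45 = 1.4, r90 = 2.2
print(hill48_from_r(1.8, 1.4, 2.2))
```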


2020, Vol 245, pp. 03014
Author(s):
Catherine Biscarat, Tommaso Boccali, Daniele Bonacorsi, Concezio Bozzi, Davide Costanzo, ...

The increase in the scale of LHC computing during Run 3 and Run 4 (HL-LHC) will certainly require radical changes to the computing models and the data processing of the LHC experiments. The working group established by WLCG and the HEP Software Foundation to investigate all aspects of the cost of computing and how to optimise them has continued producing results and improving our understanding of this process. In particular, the experiments have developed more sophisticated ways to calculate their resource needs, and we have a much more detailed process for calculating infrastructure costs. This includes studies on the impact of HPC- and GPU-based resources on meeting the computing demands. We have also developed and refined tools to quantitatively study the performance of experiment workloads, and we are actively collaborating with other activities related to data access, benchmarking, and technology cost evolution. In this contribution we present our recent developments and results and outline the directions of future work.


Author(s):
Paul P. Mehta, Jerry W. Evans, Arthur L. Ludwig

The Integrated High Performance Turbine Engine Technology (IHPTET) program is a joint Air Force, Navy, Army, DARPA, NASA, and industry initiative focused on developing higher-performance turbine engines. The goal of IHPTET is to develop and demonstrate propulsion systems that would, by the turn of the century, double propulsion capability (1987 base year). For this reason, IHPTET engines are now test beds for a large number of advanced composites, intermetallics, and single-crystal alloys. While satisfying the performance requirements, the program has another salient objective: cost reduction. This paper discusses an approach for cost estimating and modeling of components and sub-components, and demonstrates the benefits of employing simulations. Traditional approaches have relied on comparative techniques utilizing complexity factors or Cost Estimating Relationships (CERs). For advanced materials, these approaches are inadequate, primarily because historical data do not exist. A further weakness is their inability to identify cost drivers and quantify cost avoidance potential. Process-oriented cost estimating, albeit cumbersome during build-up and requiring detailed knowledge of manufacturing and process technology, provides a stable foundation for development of a comprehensive cost modeling system. Manufacturing Process Flow Simulation (MPFS) aids in evaluating manufacturing processes still in their infancy, studying the impact of alternate manufacturing processes, and conducting what-if studies. MPFS can then be incorporated selectively into a cost modeling architecture capable of evaluating production cost for sub-components, components, and complete engines.
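The core mechanism of process-oriented cost estimating, rolling accumulated cost through each step's yield so that scrap at late, expensive steps dominates, can be sketched as follows; the step names, costs, and yields are invented for illustration and do not come from the paper.

```python
def rolled_up_cost(steps):
    """Process-oriented cost build-up: at each step the accumulated cost is
    divided by that step's yield, so losses at later (more expensive) steps
    hurt most. Steps are (name, step_cost, yield) tuples."""
    cost = 0.0
    for name, step_cost, step_yield in steps:
        cost = (cost + step_cost) / step_yield
    return cost

# Hypothetical turbine-component flow (all numbers illustrative)
flow = [
    ("forge blank",    120.0, 0.98),
    ("rough machine",   80.0, 0.95),
    ("heat treat",      40.0, 0.99),
    ("finish machine", 150.0, 0.97),
    ("inspect",         25.0, 0.995),
]
print(round(rolled_up_cost(flow), 2))
```

Unlike a complexity-factor CER, every cost driver is explicit here, so a what-if study is just an edit to one tuple in the flow.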


1962, Vol 14, pp. 415-418
Author(s):
K. P. Stanyukovich, V. A. Bronshten

The phenomena accompanying the impact of large meteorites on the surface of the Moon or of the Earth can be examined on the basis of the theory of explosive phenomena if we assume that, instead of an exploding meteorite moving inside the rock, we have an explosive charge (equivalent in energy), situated at a certain distance under the surface.
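The energy-equivalence step can be illustrated numerically: the impactor's kinetic energy is expressed as an equivalent TNT charge, which is then treated as buried at some depth. The impactor mass and speed below are illustrative values, not figures from the paper.

```python
TNT_J_PER_KT = 4.184e12  # joules per kiloton of TNT

def tnt_equivalent_kt(mass_kg, speed_m_s):
    """Kinetic energy of the impactor, 0.5*m*v^2, expressed as an equivalent
    TNT charge in kilotons -- the starting point for treating an impact as
    an explosion under the surface."""
    return 0.5 * mass_kg * speed_m_s ** 2 / TNT_J_PER_KT

# Illustrative: a 1e9 kg stony body (order 100 m across) arriving at 20 km/s
print(f"{tnt_equivalent_kt(1e9, 20e3):.0f} kt")
```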


1962, Vol 14, pp. 169-257
Author(s):
J. Green

The term geo-sciences has been used here to include the disciplines geology, geophysics, and geochemistry. However, in order to apply geophysics and geochemistry effectively, one must begin with a geological model. Therefore, the science of geology should be used as the basis for lunar exploration. From an astronomical point of view, a lunar terrain heavily impacted by meteorites appears the more reasonable, although from a geological standpoint, volcanism seems the more probable mechanism. A surface liberally marked with volcanic features has been advocated by such geologists as Bülow, Dana, Suess, von Wolff, Shaler, Spurr, and Kuno. In this paper, both the impact and volcanic hypotheses are considered in the application of the geo-sciences to manned lunar exploration. However, more emphasis is placed on the volcanic, or more correctly the defluidization, hypothesis to account for lunar surface features.


1997, Vol 161, pp. 197-201
Author(s):
Duncan Steel

Abstract: Whilst lithopanspermia depends upon massive impacts occurring at a speed above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit, such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distribution are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of the cometary impacts are at speeds above 20 km/sec, but at most 5 percent of the asteroidal impacts); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.
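The tail-fraction calculation can be sketched on synthetic speed samples. The actual distributions in the paper come from known asteroid orbits and near-parabolic cometary orbits; the Gaussian stand-ins and the 20 km/s and 12 km/s thresholds below are illustrative assumptions only.

```python
import random

def tail_fractions(speeds_km_s, eject_min=20.0, survive_max=12.0):
    """Fractions of an impact-speed sample fast enough to eject rock
    (above eject_min) and slow enough for organics to plausibly survive
    (below survive_max) -- the two tails of interest."""
    n = len(speeds_km_s)
    fast = sum(1 for v in speeds_km_s if v > eject_min) / n
    slow = sum(1 for v in speeds_km_s if v < survive_max) / n
    return fast, slow

random.seed(0)
# Synthetic stand-ins: asteroidal impacts clustered near ~10 km/s,
# near-parabolic cometary impacts near ~35 km/s
asteroids = [max(5.0, random.gauss(10.0, 2.5)) for _ in range(10_000)]
comets    = [max(20.0, random.gauss(35.0, 6.0)) for _ in range(10_000)]
print("asteroids (fast, slow):", tail_fractions(asteroids))
print("comets    (fast, slow):", tail_fractions(comets))
```

Even with these crude stand-ins, the paper's qualitative paradox appears: nearly all cometary impacts sit in the rock-ejecting tail, while only the asteroidal population occupies the organic-survival tail.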


1997, Vol 161, pp. 189-195
Author(s):
Cesare Guaita, Roberto Crippa, Federico Manzini

Abstract: A large amount of CO has been detected above many SL9/Jupiter impact sites. This gas was never detected before the collision, so, in our opinion, CO was released from a parent compound during the collision. We identify this compound as POM (polyoxymethylene), a formaldehyde (HCHO) polymer that, when suddenly heated, re-forms monomeric HCHO. At temperatures higher than 1200 K, HCHO cannot exist in molecular form, and the most probable result of its decomposition is the formation of CO. At lower temperatures, HCHO can react with NH3 and/or HCN to form highly UV-absorbing polymeric material. In our opinion, this kind of material also has to be taken into account to explain the complex evolution of some SL9 impacts that we observed in CCD images taken with a blue filter.

