Process characterization and statistical analysis of oxide CMP on a silicon wafer with sparse data

2007 ◽  
Vol 88 (4) ◽  
pp. 785-792 ◽  
Author(s):  
S.T.S. Bukkapatnam ◽  
P.K. Rao ◽  
W.-C. Lih ◽  
N. Chandrasekaran ◽  
R. Komanduri

2014 ◽
Vol 2014 (DPC) ◽  
pp. 000815-000829 ◽  
Author(s):  
Tom Strothmann ◽  
Damien Pricolo ◽  
Seung Wook Yoon ◽  
Yaojian Lin

The demand for Wafer Level Chip Scale Packages (WLCSP) has experienced tremendous growth due to the surge in demand for advanced mobile products. The increased demand is seen for both 200mm and 300mm wafers; however, a significant segment of the market continues to be driven by 200mm designs. The infrastructure capacity supporting 200mm WLCSP has been stressed as a result of the mature status of 200mm technology and the rate of conversion of alternative package formats to WLCSP. This creates a dilemma for WLP service providers, because adding 200mm capacity continues to require significant capital. Since 200mm volumes will most likely decline within the next five years, it is difficult to justify the expenditure when the depreciation term is longer than the anticipated life cycle of the product. Conventional methods of manufacturing wafer level packages require equipment specifically sized to a given silicon wafer diameter, although there is no technical requirement to maintain the round silicon format. The conventional method has been beneficial because it leveraged equipment and processes developed for the IC industry; however, the equipment is very expensive at larger wafer diameters, and the fine geometries required in advanced-node IC products are not needed for wafer level packaging. The problem is serious for 200mm and 300mm wafer bump lines, and the capital equipment cost for a future 450mm bumping line may well be prohibitive for wafer level packaging. A new manufacturing method has been developed to produce a wafer level package that severs the link between wafer diameter and wafer level packaging methods. The new manufacturing method is wafer-size agnostic, so one manufacturing module can produce fan-in, fan-out, and 3D fan-out products regardless of the incoming wafer size. The same bill of materials, manufacturing methods, and manufacturing location can produce wafer level packages from any size of silicon wafer.
In this method the wafer is diced prior to processing, and the die are then recombined into a uniform panel size. Once in panel format, the product is processed with conventional wafer level packaging techniques, including dielectric deposition, metal plating, and solder ball drop. Since the manufacturing module is wafer-size agnostic, there is no capital risk in the investment in the manufacturing infrastructure: a change in loading between 200mm, 300mm, and 450mm wafers does not adversely affect the utilization of the manufacturing module. The process also enables new advanced wafer level packages otherwise unattainable with conventional manufacturing methods. This presentation will describe the new manufacturing module approach and the results of process characterization for products produced in the module.


Author(s):  
Jerome J. Dinglasan

Silicon wafer as a direct material is one of the vital parts of a semiconductor product. Wastage in manufacturing plants that pulls yield down should be addressed innovatively and accurately. This paper focuses on the phenomenon of broken wafers at the wafer taping process during wafer preparation. Using a wafer taper machine, silicon wafers are covered with an industrial tape in preparation for the next process. While processed wafers were being placed on the wafer boat, an unexpected phenomenon of broken wafers caused by unwanted falling was encountered. The cause was found to be unintentional dragging by the machine’s robot arm after wafer processing. The problem was resolved through simulation and experiments using statistical analysis. As a result, an optimized machine parameter setting was defined that eliminates this rejection. Statistical analysis was of great help in resolving the phenomenon and improved the process yield of the manufacturing line.


Author(s):  
I. I. Miroshnichenko ◽  
A. N. Simonov ◽  
I. I. Kuzmin ◽  
A. I. Platova

Performing pharmacokinetic and statistical analysis on sparse data presents significant difficulties. Using the example of a pharmacokinetic (PK) study of resveratrol in mice, the resampling method allowed us to obtain individual PK parameters and perform full-fledged statistical tests.
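The abstract does not give implementation details, but the general idea of resampling sparse PK data can be sketched as follows. This is a minimal illustration of a pseudo-profile bootstrap for a destructive-sampling design (one observation per animal per time point), computing the area under the curve (AUC) by the linear trapezoidal rule. All data values, the window sizes, and the function names are hypothetical, not from the cited study.

```python
import random

# Hypothetical sparse concentration-time data: time (h) -> plasma
# concentrations (ng/mL), one value per animal. Illustrative only.
data = {
    0.25: [120.0, 95.0, 110.0],
    1.0:  [80.0, 70.0, 90.0],
    4.0:  [30.0, 25.0, 35.0],
    8.0:  [10.0, 12.0, 8.0],
}

def auc_trapezoid(times, concs):
    """Linear trapezoidal AUC over the sampled interval."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def bootstrap_auc(data, n_boot=2000, seed=42):
    """Pseudo-profile bootstrap: draw one animal at random per time
    point to form a synthetic individual profile, compute its AUC,
    and repeat; returns the resampled AUC distribution."""
    rng = random.Random(seed)
    times = sorted(data)
    return [auc_trapezoid(times, [rng.choice(data[t]) for t in times])
            for _ in range(n_boot)]

aucs = sorted(bootstrap_auc(data))
mean_auc = sum(aucs) / len(aucs)
lo, hi = aucs[int(0.025 * len(aucs))], aucs[int(0.975 * len(aucs))]
print(f"AUC mean = {mean_auc:.1f} h*ng/mL, 95% CI = ({lo:.1f}, {hi:.1f})")
```

The resampled distribution yields per-profile PK parameters and percentile confidence intervals, which is what makes standard statistical tests possible despite each animal contributing only a single sample.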


1966 ◽  
Vol 24 ◽  
pp. 188-189
Author(s):  
T. J. Deeming

If we make a set of measurements, such as narrow-band or multicolour photo-electric measurements, which are designed to improve a scheme of classification, and in particular if they are designed to extend the number of dimensions of classification, i.e. the number of classification parameters, then some important problems of analytical procedure arise. First, it is important not to reproduce the errors of the classification scheme which we are trying to improve. Second, when trying to extend the number of dimensions of classification we have little or nothing with which to test the validity of the new parameters.

Problems similar to these have occurred in other areas of scientific research (notably psychology and education) and the branch of Statistics called Multivariate Analysis has been developed to deal with them. The techniques of this subject are largely unknown to astronomers, but, if carefully applied, they should at the very least ensure that the astronomer gets the maximum amount of information out of his data and does not waste his time looking for information which is not there. More optimistically, these techniques are potentially capable of indicating the number of classification parameters necessary and giving specific formulas for computing them, as well as pinpointing those particular measurements which are most crucial for determining the classification parameters.
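One standard multivariate technique that does exactly what the abstract describes — indicating how many classification parameters the data support and giving formulas for computing them — is principal component analysis. The sketch below is illustrative, not from the cited paper: it simulates multicolour photometry driven by two hypothetical underlying physical parameters plus noise, then recovers the effective dimensionality from the eigenvalues of the covariance matrix.

```python
import numpy as np

# Hypothetical multicolour photometric measurements: rows = stars,
# columns = passbands (e.g. four colours). Values are simulated, with
# two underlying physical parameters driving all bands, plus noise.
rng = np.random.default_rng(0)
n_stars = 200
params = rng.normal(size=(n_stars, 2))          # two hidden parameters
loadings = np.array([[1.0, 0.8, 0.5, 0.2],      # how each parameter
                     [0.1, 0.4, 0.9, 1.1]])     # affects each band
X = params @ loadings + 0.05 * rng.normal(size=(n_stars, 4))

# Principal component analysis via the covariance eigendecomposition.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]               # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("fraction of variance per component:", np.round(explained, 3))
# Two dominant components suggest two classification parameters suffice;
# the columns of eigvecs[:, :2] are the specific linear combinations of
# the measured bands that compute them.
```

The same eigenvector weights also pinpoint which measurements carry the most weight in each classification parameter, the third capability the abstract mentions.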


Author(s):  
Gianluigi Botton ◽  
Gilles L'espérance

As interest in parallel EELS spectrum imaging grows in laboratories equipped with commercial spectrometers, different approaches have been used in recent years by several research groups developing the technique, as reported in the literature. Spectrum images can now be obtained either by controlling both the microscope and the spectrometer with a personal computer, or by using more powerful workstations interfaced to conventional multichannel analysers, with commercially available programs controlling the microscope and the spectrometer. The limits of the technique in terms of quantitative performance were reported previously by the present author, in a systematic study of artifacts, detection limits, and statistical errors as a function of the desired spatial resolution and the range of chemical elements to be studied in a map. The aim of the present paper is to show an application of quantitative parallel EELS spectrum imaging in which statistical analysis is performed at each pixel, interpretation is carried out using criteria established from that analysis, and variations in composition are analyzed with the help of information retrieved from t/γ maps so that artifacts are avoided.
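The abstract does not specify which per-pixel criterion is used, but the general pattern of pixel-wise statistical analysis in a spectrum image can be sketched as follows. This is a hypothetical illustration on synthetic data: the background is estimated from a pre-edge window at each pixel, and a simple k-sigma detection criterion flags pixels where the post-edge signal is statistically significant. Window positions, counts, and the threshold are all assumptions for the example.

```python
import numpy as np

# Hypothetical EELS spectrum image: (ny, nx, n_channels) counts.
# Synthetic data: Poisson background everywhere, plus an edge signal
# present only in the central block of pixels.
rng = np.random.default_rng(1)
ny, nx, nch = 8, 8, 100
si = rng.poisson(50.0, size=(ny, nx, nch)).astype(float)
si[2:6, 2:6, 60:] += 30.0   # element present only in the centre

pre = si[:, :, 40:60]        # pre-edge window: background estimate
post = si[:, :, 60:80]       # post-edge window: candidate signal

# Per-pixel statistics: background mean and spread from the pre-edge
# window, then a k-sigma criterion on the background-subtracted signal.
bkg_mean = pre.mean(axis=2)
bkg_sem = pre.std(axis=2, ddof=1) / np.sqrt(pre.shape[2])
signal = post.mean(axis=2) - bkg_mean
detected = signal > 3.0 * bkg_sem   # boolean elemental map

print("pixels flagged as containing the element:", int(detected.sum()))
```

Applying the criterion pixel by pixel, rather than thresholding a raw intensity map by eye, is what keeps statistical artifacts out of the final elemental map.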

