A Comparative Study of Tolerance Analysis Methods

2005 ◽  
Vol 5 (3) ◽  
pp. 247-256 ◽  
Author(s):  
Zhengshu Shen ◽  
Gaurav Ameta ◽  
Jami J. Shah ◽  
Joseph K. Davidson

This paper reviews four major methods for tolerance analysis and compares them. The methods discussed are: (1) one-dimensional tolerance charts; (2) parametric tolerance analysis, especially parametric analysis based on Monte Carlo simulation; (3) vector-loop (or kinematic) based tolerance analysis; and (4) ASU Tolerance-Map® (T-Map®) (patent pending; nonprovisional patent application number 09/507,542 (2002)) based tolerance analysis. Tolerance charts deal with worst-case tolerance analysis in one direction at a time and ignore possible contributions from the other directions. Manual charting is tedious and error-prone; hence, attempts have been made at automation. The parametric approach to tolerance analysis is based on parametric constraint solving; its inherent drawbacks are that the accuracy of the simulation results depends on the user-defined modeling scheme, and that it cannot incorporate all Y14.5 rules. The vector-loop method uses kinematic joints to model assembly constraints. It is also not fully consistent with the Y14.5 standard. The ASU T-Map®-based tolerance analysis method can model geometric tolerances and their interactions in a truly three-dimensional context. It is completely consistent with the Y14.5 standard, but its use by designers may be quite challenging. The T-Map®-based tolerance analysis method is still under development. Despite the shortcomings of each of these tolerance analysis methods, each may be used to provide reasonable results under certain circumstances; no guidelines exist for making such a choice. Through a comprehensive comparison of these methods, this paper offers recommendations for selecting the best method to use for a given tolerance accumulation problem.
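The abstract does not give the simulation details, but the Monte Carlo approach it describes can be illustrated with a minimal sketch: sample each dimension in a one-dimensional stack from its tolerance band and observe the statistical spread of the resulting gap. The three-part stack, the normal distributions, and the ±3σ-equals-tolerance convention below are illustrative assumptions, not taken from the paper.

```python
import random
import statistics

def monte_carlo_stackup(nominals, tolerances, n_trials=100_000):
    """Estimate the statistical spread of a 1D assembly gap.

    Each dimension is sampled from a normal distribution whose
    +/-3-sigma spread equals its tolerance band (a common convention).
    """
    gaps = []
    for _ in range(n_trials):
        gap = sum(random.gauss(nom, tol / 3.0)
                  for nom, tol in zip(nominals, tolerances))
        gaps.append(gap)
    return statistics.mean(gaps), statistics.stdev(gaps)

# Hypothetical three-part stack: two parts add to the gap, one subtracts.
mean_gap, sigma_gap = monte_carlo_stackup(
    nominals=[10.0, 5.0, -14.8],   # signed nominal contributions (mm)
    tolerances=[0.1, 0.05, 0.1],   # symmetric tolerance bands (mm)
)
```

Unlike a worst-case tolerance chart, which would simply sum the tolerance bands, the statistical gap spread grows as the root sum of squares of the contributors.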


Author(s):  
Jinsong Gao ◽  
Kenneth W. Chase ◽  
Spencer P. Magleby

Two methods for performing statistical tolerance analysis of mechanical assemblies are compared: the Direct Linearization Method (DLM) and Monte Carlo simulation. A selection of 2-D and 3-D vector models of assemblies was analyzed, including problems with closed-loop assembly constraints. Closed vector loops describe the small kinematic adjustments that occur at assembly time. Open loops describe critical clearances or other assembly features. The DLM uses linearized assembly constraints and matrix algebra to estimate the variations of the assembly or kinematic variables, and to predict assembly rejects. A modified Monte Carlo simulation, employing an iterative technique for closed-loop assemblies, was applied to the same problem set. The results of the comparison show that the DLM is accurate if the tolerances are relatively small compared to the nominal dimensions of the components and the assembly functions are not highly nonlinear. Sample size is shown to have a great influence on the accuracy of Monte Carlo simulation.
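The comparison the abstract describes can be sketched for a toy case: linearize a nonlinear assembly function via its first-order sensitivities (the essence of direct linearization) and check the predicted variation against a Monte Carlo estimate. The diagonal-clearance function and the dimension values below are hypothetical, chosen only so the two estimates can be compared.

```python
import math
import random
import statistics

# Hypothetical assembly function: a diagonal clearance that depends
# nonlinearly on two component dimensions.
def assembly(a, b):
    return math.hypot(a, b)

a0, b0 = 30.0, 40.0   # nominal dimensions
sa, sb = 0.05, 0.05   # standard deviations of each dimension

# Direct linearization: first-order sensitivities at the nominal point,
# combined by root sum of squares.
da = a0 / math.hypot(a0, b0)   # d(gap)/da
db = b0 / math.hypot(a0, b0)   # d(gap)/db
sigma_dlm = math.sqrt((da * sa) ** 2 + (db * sb) ** 2)

# Monte Carlo estimate of the same variation.
samples = [assembly(random.gauss(a0, sa), random.gauss(b0, sb))
           for _ in range(200_000)]
sigma_mc = statistics.stdev(samples)
```

With small tolerances and a mildly nonlinear assembly function, the two estimates agree closely, which matches the paper's conclusion about when the DLM is accurate.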



2007 ◽  
Vol 129 ◽  
pp. 83-87
Author(s):  
Hua Long Li ◽  
Jong Tae Park ◽  
Jerzy A. Szpunar

Controlling texture and microstructure evolution during annealing processes is very important for optimizing the properties of steels. Theories used to explain annealing processes are complicated and always case-dependent. A recently developed Monte Carlo simulation-based model offers an effective tool for studying the annealing process and can be used to verify the theories assumed to govern such processes. The computer model takes Orientation Imaging Microscopy (OIM) measurements as input. The abundant information contained in OIM measurements allows the computer model to incorporate many structural characteristics of polycrystalline materials, such as texture, grain boundary character, grain shape and size, phase composition, chemical composition, stored elastic energy, and residual stress. The outputs include various texture functions and grain boundary and grain size statistics that can be verified against experimental results. Graphical representation allows us to perform virtual experiments to monitor each step of the structural transformation. An example of applying this simulation to Si steel is given.
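The abstract does not specify the model's internals; a standard way to simulate grain growth during annealing with Monte Carlo methods is the Potts model, sketched below as an illustrative assumption. Each lattice site holds a grain orientation, and sites locally adopt a neighbour's orientation when doing so does not increase the number of unlike-neighbour bonds, so grains coarsen over repeated sweeps.

```python
import random

def potts_sweep(grid, n, trials):
    """One sweep of a simple 2D Potts-model grain-growth simulation.

    Each trial picks a site and tries to adopt a random neighbour's
    orientation; the move is accepted if it does not raise the number
    of unlike-neighbour bonds (zero-temperature Metropolis rule).
    """
    for _ in range(trials):
        i, j = random.randrange(n), random.randrange(n)
        neighbours = [grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                      grid[i][(j - 1) % n], grid[i][(j + 1) % n]]
        new = random.choice(neighbours)
        old = grid[i][j]
        # Local energy = number of unlike neighbours at this site.
        e_old = sum(s != old for s in neighbours)
        e_new = sum(s != new for s in neighbours)
        if e_new <= e_old:
            grid[i][j] = new

n, q = 32, 16   # lattice size and number of orientations (arbitrary)
grid = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
for _ in range(50):           # 50 Monte Carlo sweeps
    potts_sweep(grid, n, n * n)
```

In a model like the one the paper describes, the initial grid would be seeded from OIM measurements rather than at random, and the energy term would include the additional contributions listed in the abstract (stored elastic energy, boundary character, and so on).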


Electronics ◽  
2021 ◽  
Vol 10 (22) ◽  
pp. 2881
Author(s):  
Muath Alrammal ◽  
Munir Naveed ◽  
Georgios Tsaramirsis

The use of innovative and sophisticated malware poses a serious threat to computer-based information systems. Such malware adapts to existing security solutions and often operates without detection. Once the malware completes its malicious activity, it self-destructs, leaving no obvious signature for detection and forensic purposes. Detecting such sophisticated malware is very challenging and non-trivial because of the malware's new patterns of exploiting vulnerabilities. Any security solution requires an equal level of sophistication to counter such attacks. In this paper, a novel reinforcement model based on Monte Carlo simulation, called eRBCM, is explored to develop a security solution that can detect new and sophisticated network malware. The new model is trained on several kinds of malware and can generalize the malware-detection functionality. The model is evaluated using a benchmark set of malware. The results show that eRBCM can identify a variety of malware with high accuracy.
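The abstract does not describe eRBCM's internals. Purely as a hypothetical illustration of combining reinforcement with Monte Carlo sampling, the sketch below learns action values from simulated rollouts: each "action" could stand for selecting one detection policy, and its value estimate is the running mean of sampled rewards. The two policies and their hit rates are invented for the example.

```python
import random

def monte_carlo_values(reward_fns, episodes=5000, epsilon=0.1):
    """Minimal Monte Carlo action-value learner (epsilon-greedy bandit).

    Value estimates are incremental means of simulated rewards; with
    probability epsilon a random action is explored, otherwise the
    current best estimate is exploited.
    """
    q = [0.0] * len(reward_fns)   # value estimates per action
    n = [0] * len(reward_fns)     # visit counts per action
    for _ in range(episodes):
        if random.random() < epsilon:
            a = random.randrange(len(reward_fns))
        else:
            a = max(range(len(reward_fns)), key=q.__getitem__)
        r = reward_fns[a]()       # one Monte Carlo rollout
        n[a] += 1
        q[a] += (r - q[a]) / n[a]  # incremental mean update
    return q

# Two hypothetical detection policies with different true success rates.
values = monte_carlo_values([
    lambda: 1.0 if random.random() < 0.6 else 0.0,  # policy A: 60%
    lambda: 1.0 if random.random() < 0.8 else 0.0,  # policy B: 80%
])
```

After enough episodes the learner's value estimates rank the better policy first, which is the basic mechanism any Monte Carlo-based reinforcement scheme relies on.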

