Volume 6: 33rd Design Automation Conference, Parts A and B
Latest Publications


TOTAL DOCUMENTS: 125 (FIVE YEARS: 0)
H-INDEX: 7 (FIVE YEARS: 0)
Published by ASMEDC (ISBN 0791848078)

Author(s): Chaitanya Vempati, Matthew I. Campbell

Neural networks are an increasingly useful and popular choice for process modeling. The success of a neural network in effectively modeling a given problem depends on its topology. Generating topologies manually relies on previous neural network experience and is tedious and difficult. Hence there is a growing need for a method that automatically generates neural network topologies for different problems. Current methods such as growing, pruning, and genetic algorithms are very complicated and do not explore all the possible topologies. This paper presents a novel method of automatically generating neural networks using a graph grammar. The approach involves representing the neural network as a graph and defining graph transformation rules to generate the topologies. The approach is simple, efficient, and able to create topologies of varying complexity. Two example problems are presented to demonstrate the power of our approach.
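As a rough illustration of the idea, the sketch below represents a feedforward network as a graph and grows hidden structure by repeatedly applying a single edge-rewrite rule. The rule set, node naming, and growth schedule are illustrative assumptions, not the grammar defined in the paper.

```python
# Minimal sketch: grow a feedforward topology by applying a graph-rewrite rule.
# The rule set and node naming here are illustrative, not the paper's grammar.
import random

def initial_graph(n_in, n_out):
    """Start from a fully connected input->output graph (no hidden nodes)."""
    nodes = [f"in{i}" for i in range(n_in)] + [f"out{j}" for j in range(n_out)]
    edges = {(f"in{i}", f"out{j}") for i in range(n_in) for j in range(n_out)}
    return nodes, edges

def rule_insert_hidden(nodes, edges, rng):
    """Rewrite rule: replace one edge (u, v) with u -> h -> v, where h is a new hidden node."""
    u, v = rng.choice(sorted(edges))
    h = f"h{sum(n.startswith('h') for n in nodes)}"
    nodes.append(h)
    edges.remove((u, v))
    edges.update({(u, h), (h, v)})
    return nodes, edges

def generate_topology(n_in=2, n_out=1, n_rule_applications=5, seed=0):
    rng = random.Random(seed)
    nodes, edges = initial_graph(n_in, n_out)
    for _ in range(n_rule_applications):
        nodes, edges = rule_insert_hidden(nodes, edges, rng)
    return nodes, edges

if __name__ == "__main__":
    nodes, edges = generate_topology()
    print("nodes:", nodes)
    print("edges:", sorted(edges))
```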


Author(s): Hiroshi Masuda, Kenta Ogawa

Mesh deformation, which is sometimes referred to as mesh morphing in CAE, is useful for providing various shapes of meshes for CAE tools. This paper proposes a new framework for interactively and consistently deforming assembly models of sheet structure for mechanical parts. This framework is based on a surface-based deformation, which calculates the vertex positions so that the mean curvature normal is preserved at each vertex in a least squares sense. While existing surface-based deformation techniques cannot simultaneously deform assembly mesh models, our method allows us to smoothly deform disconnected meshes by propagating the rotations and translations through disconnected vertices. In addition, we extend our deformation technique to handle non-manifold conditions, because shell structure models may include non-manifold edges. We have applied our method to assembly mesh models of automobile parts. Our experimental results have shown that our method requires almost the same pre-processing time as existing methods and can deform practical assembly models interactively.
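The following sketch conveys the flavor of surface-based deformation: differential coordinates are preserved in a least-squares sense while constrained vertices are moved. The paper preserves mean curvature normals and propagates rotations and translations across disconnected meshes; this simplified stand-in uses a plain uniform Laplacian on a tiny example and is only meant to show the least-squares setup.

```python
# Simplified sketch of surface-based deformation: preserve (uniform) Laplacian
# coordinates in a least-squares sense while a handle vertex is moved.
import numpy as np

def deform(V, neighbors, constraints, w_pos=10.0):
    """V: (n,3) vertices; neighbors: list of index lists; constraints: {index: target position}."""
    n = len(V)
    L = np.eye(n)
    for i, nbrs in enumerate(neighbors):
        L[i, nbrs] = -1.0 / len(nbrs)
    delta = L @ V                      # differential (Laplacian) coordinates to preserve
    A = [L]
    b = [delta]
    for idx, target in constraints.items():
        row = np.zeros((1, n)); row[0, idx] = w_pos
        A.append(row); b.append(w_pos * np.asarray(target, float)[None, :])
    A = np.vstack(A); b = np.vstack(b)
    V_new, *_ = np.linalg.lstsq(A, b, rcond=None)
    return V_new

if __name__ == "__main__":
    # A small strip of 5 vertices along the x-axis; anchor the first and lift the last.
    V = np.array([[i, 0.0, 0.0] for i in range(5)], dtype=float)
    neighbors = [[1], [0, 2], [1, 3], [2, 4], [3]]
    V_new = deform(V, neighbors, {0: V[0], 4: [4.0, 0.0, 1.0]})
    print(np.round(V_new, 3))
```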


Author(s): Santosh Tiwari, Joshua Summers, Georges Fadel

A novel approach using a genetic algorithm is presented for extracting globally satisficing (Pareto optimal) solutions from a morphological chart, where the evaluation and combination of “means to sub-functions” is modeled as a combinatorial multi-objective optimization problem. A fast and robust genetic algorithm is developed to solve the resulting optimization problem. Customized crossover and mutation operators specifically tailored to this combinatorial problem are discussed. A proof-of-concept simulation on a practical design problem is presented. The described genetic algorithm incorporates features to prevent redundant evaluation of identical solutions, a method for handling the compatibility matrix (feasible/infeasible combinations), and a way to address desirable/undesirable combinations. The proposed approach is limited by its reliance on quantifiable metrics for evaluating the objectives and on the existence of a mathematical representation of the combined solutions. The optimization framework is designed to be a scalable and flexible procedure that can be easily modified to accommodate a wide variety of design methods based on the morphological chart.
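A minimal sketch of a genetic algorithm over a morphological chart is given below. The chart sizes, merit scores, compatibility data, and the single weighted objective are made-up stand-ins for the paper's multi-objective formulation and customized operators.

```python
# Sketch of a GA over a morphological chart: each chromosome picks one means per
# sub-function; infeasible pairs from a compatibility matrix are penalized.
# All data below are invented for illustration.
import random

N_MEANS = [3, 4, 2, 3]                       # number of means per sub-function
rng = random.Random(1)
SCORE = [[rng.random() for _ in range(k)] for k in N_MEANS]   # per-means merit
INCOMPATIBLE = {((0, 2), (1, 1)), ((2, 0), (3, 2))}           # forbidden (function, means) pairs

def fitness(chrom):
    value = sum(SCORE[f][m] for f, m in enumerate(chrom))
    penalty = sum(1.0 for (a, b) in INCOMPATIBLE
                  if chrom[a[0]] == a[1] and chrom[b[0]] == b[1])
    return value - 10.0 * penalty            # heavy penalty for infeasible combinations

def crossover(p1, p2):
    cut = rng.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.2):
    return [rng.randrange(N_MEANS[f]) if rng.random() < rate else m
            for f, m in enumerate(chrom)]

def run_ga(pop_size=30, generations=50):
    pop = [[rng.randrange(k) for k in N_MEANS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = run_ga()
    print("best combination:", best, "fitness:", round(fitness(best), 3))
```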


Author(s): Seung Ki Moon, Timothy W. Simpson, Soundar R. T. Kumara

Product family design is a cost-effective way to achieve mass customization by allowing highly differentiated products to be developed from a common platform while targeting individual products to distinct market segments. Recent trends seek to apply and extend principles from product family design to new service development. In this paper, we extend concepts from platform-based product family design to create a novel methodology for module-based service family design. The new methodology helps identify a service platform along with variant and unique modules in a service family by integrating service-based process analysis, ontologies, and data mining. A function-process matrix and a service process model are investigated to define the relationships between the service functions and the service processes offered as part of a service. An ontology is used to represent the relationships between functional hierarchies in a service. Fuzzy clustering is employed to partition service processes into subsets for identifying modules in a given service family. The clustering result identifies the platform and its modules using a platform level membership function. We apply the proposed methodology to determine a new platform using a case study involving a family of banking services.
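The sketch below shows the fuzzy clustering step on made-up process feature vectors; the cluster count and the membership threshold used to flag platform candidates are illustrative assumptions rather than values from the methodology.

```python
# Minimal fuzzy c-means sketch for partitioning service processes. The feature
# vectors, cluster count, and the 0.8 membership threshold used to call a process
# a "platform candidate" are illustrative assumptions, not values from the paper.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / (dist ** (2 / (m - 1)))    # standard fuzzy c-means membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

if __name__ == "__main__":
    # Each row: a service process described by two illustrative features.
    X = np.array([[0.1, 0.2], [0.2, 0.1], [0.15, 0.25],
                  [0.9, 0.8], [0.85, 0.9], [0.8, 0.95]])
    centers, U = fuzzy_c_means(X, c=2)
    for i, memberships in enumerate(U):
        label = "platform candidate" if memberships.max() > 0.8 else "variant"
        print(f"process {i}: memberships={np.round(memberships, 2)} -> {label}")
```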


Author(s): Lee J. Wells, Byeng D. Youn, Zhimin Xi

This paper presents an innovative approach to quality engineering using the Eigenvector Dimension Reduction (EDR) method. Industry currently relies heavily upon the Taguchi method and signal-to-noise (S/N) ratios as quality indices. However, the Taguchi method has several disadvantages: it relies upon samples occurring at specified levels, its results are valid only at the current design point, and maintaining a certain level of confidence is expensive. Recently, it has been shown that the EDR method can accurately provide an analysis of variance similar to that of the Taguchi method without these drawbacks. This is because the EDR method is based upon fundamental statistics, where the statistical information for each design parameter is used to estimate the uncertainty propagation through engineering systems. The EDR method therefore provides much more extensive capabilities than the Taguchi method, such as the ability to estimate not only the mean and standard deviation of the response but also its skewness and kurtosis. The uniqueness of the EDR method is its ability to generate the probability density function (PDF) of system performances. This capability, known as the probabilistic “what-if” study, provides a visual representation of the effects of the design parameters (e.g., their means and variances) upon the response. In addition, the probabilistic “what-if” study can be applied across multiple design parameters, allowing the analysis of interactions among control factors. Furthermore, the probabilistic “what-if” study provides a basis for performing robust design optimization. Because of these advantages, the EDR method offers an alternative quality engineering platform to the Taguchi method. For easy execution by field engineers, the proposed platform, known as Quick Quality Quantification (Q3), will be developed as a Microsoft Excel add-in.
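For illustration only, the sketch below runs a probabilistic “what-if” study by brute-force Monte Carlo, reporting the mean, standard deviation, skewness, and kurtosis of a response for two settings of a design parameter's mean. The response function and input distributions are invented; the EDR method obtains such moments far more cheaply, and Monte Carlo is used here purely to show the kind of output such a study produces.

```python
# Sketch of a probabilistic "what-if" study: estimate mean, standard deviation,
# skewness, and kurtosis of a response for two settings of a design parameter.
# Plain Monte Carlo stands in for the EDR method's efficient moment estimation.
import numpy as np

def response(x1, x2):
    """Illustrative performance function (not from the paper)."""
    return x1 ** 2 + 3.0 * x2 + 0.5 * x1 * x2

def moments(samples):
    mu = samples.mean()
    sigma = samples.std()
    z = (samples - mu) / sigma
    return mu, sigma, (z ** 3).mean(), (z ** 4).mean()   # skewness, kurtosis

def what_if(mean_x1, rng, n=100_000):
    x1 = rng.normal(mean_x1, 0.1, n)
    x2 = rng.normal(1.0, 0.2, n)
    return moments(response(x1, x2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for mean_x1 in (1.0, 1.5):                 # shift one design parameter's mean
        mu, sigma, skew, kurt = what_if(mean_x1, rng)
        print(f"mean(x1)={mean_x1}: mean={mu:.3f}, std={sigma:.3f}, "
              f"skewness={skew:.3f}, kurtosis={kurt:.3f}")
```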


Author(s): Jing Han, Koetsu Yamazaki, Sadao Nishiyama, Ryoichi Itoh

This paper introduces finite element analysis (FEA) into ergonomic design to evaluate human feelings numerically and objectively, and then into the design optimization of beverage containers considering human factors. In the design of the can end (the lid of the can), experiments and FEA of vertically indenting the fingertip pulp with a probe and with the tab of the end have been carried out to observe force responses and to study feelings in the fingertip. A numerical simulation of a finger lifting the tab to open the can has also been performed, and the discomfort in the fingertip has been evaluated numerically to represent the finger accessibility of the tab. A comparison of finger accessibility between two tab ring shape designs showed that a tab with a larger contact area with the finger is better. In the design of beverage bottles for hot drinks, FEA of the tactile sensation of heat has been performed to numerically evaluate the touch feeling of the finger when holding a hot bottle. Numerical simulations of the embossing process have also been performed to evaluate the formability of various rib shape designs. The optimum design has then been determined considering the hot touch feeling as well as the metal sheet formability.
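As a rough stand-in for the thermal part of the study, the sketch below uses a 1D explicit finite-difference model of transient conduction through a thin bottle wall to estimate the outer-surface temperature a finger would touch. The material properties, wall thickness, and boundary conditions are assumed values, not those used in the paper.

```python
# Rough 1D finite-difference stand-in for the thermal analysis: transient conduction
# through a thin bottle wall, giving the outer-surface (touched) temperature.
# Material data, wall thickness, and boundary conditions are illustrative assumptions.
import numpy as np

def wall_surface_temperature(thickness=0.5e-3, k=0.2, rho=1380.0, cp=1300.0,
                             t_hot=90.0, t_init=25.0, t_end=2.0, n=21):
    """Explicit FTCS scheme; inner surface held at the drink temperature."""
    alpha = k / (rho * cp)                   # thermal diffusivity
    dx = thickness / (n - 1)
    dt = 0.4 * dx ** 2 / alpha               # stable time step (FTCS limit is 0.5)
    T = np.full(n, t_init)
    T[0] = t_hot                             # inner surface in contact with the drink
    for _ in range(int(t_end / dt)):
        T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = T[-2]                        # crude insulated outer boundary
    return T[-1]

if __name__ == "__main__":
    print(f"outer-surface temperature after 2 s: {wall_surface_temperature():.1f} degC")
```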


Author(s): Nuogang Sun, Youyun Zhang, Xuesong Mei

Faithfully obtaining design specifications from customer requirements is essential for successful design. The natural-language, inexact, incomplete, and vague nature of customer requirements makes it very difficult to map them to design specifications. In the typical design process, design specifications are determined by designers based on their experience and intuition, and a specific target value is often set for each specification. However, fixing such a value early is both difficult and often unreasonable, so a suitable limit range rather than a single value is preferred at the beginning of design, especially during concept design. In this paper, a simplified systematic approach for transforming customer requirements into design specifications is proposed. First, a two-step clustering approach for grouping customer requirements and design specifications based on the house of quality (HOQ) matrix is presented, which limits the mapping to within each group. To further simplify the inference mapping rules between customer requirements and design specifications, the minimal-condition inference mapping rules for each design specification are extracted based on rough set theory. Finally, a suitable value range is determined for each specification by applying the fuzzy rule matrix.
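A toy sketch of the rough-set step is shown below: it finds a minimal subset of condition attributes (a reduct) that still discerns all decisions in a small, made-up decision table and prints the resulting rules. The attributes and values are purely illustrative and not taken from the paper.

```python
# Toy rough-set sketch: find a minimal subset of condition attributes (a reduct)
# that still discerns every object pair with different decisions, then print the
# resulting condition -> decision rules. The decision table is invented.
from itertools import combinations

# Rows: (condition attribute values, decision) -- illustrative requirement/spec levels.
TABLE = [
    (("high", "small", "low"),  "rangeA"),
    (("high", "large", "low"),  "rangeA"),
    (("low",  "small", "high"), "rangeB"),
    (("low",  "large", "high"), "rangeB"),
    (("high", "small", "high"), "rangeC"),
]
ATTRS = ["load", "size", "speed"]

def consistent(attr_idx):
    """True if the chosen attributes never map one condition pattern to two decisions."""
    seen = {}
    for cond, dec in TABLE:
        key = tuple(cond[i] for i in attr_idx)
        if seen.setdefault(key, dec) != dec:
            return False
    return True

def minimal_reduct():
    for size in range(1, len(ATTRS) + 1):
        for attr_idx in combinations(range(len(ATTRS)), size):
            if consistent(attr_idx):
                return attr_idx
    return tuple(range(len(ATTRS)))

if __name__ == "__main__":
    reduct = minimal_reduct()
    print("minimal attributes:", [ATTRS[i] for i in reduct])
    for cond, dec in TABLE:
        lhs = " AND ".join(f"{ATTRS[i]}={cond[i]}" for i in reduct)
        print(f"IF {lhs} THEN spec={dec}")
```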


Author(s): Marcus Pettersson, Johan Ölvander

Box’s Complex method for direct search has shown promise when applied to simulation-based optimization. In direct search methods like Box’s Complex method, the search starts with a set of points, where each point is a solution to the optimization problem. In the Complex method, the number of points must be at least one plus the number of variables. However, in order to avoid premature termination and increase the likelihood of finding the global optimum, more points are often used at the expense of an increased number of evaluations. The idea in this paper is to gradually remove points during the optimization in order to achieve an adaptive Complex method for more efficient design optimization. The proposed method shows encouraging results when compared to the Complex method with a fixed number of points and to a quasi-Newton method.
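The sketch below implements a basic Complex method with a simple point-removal schedule to convey the adaptive idea; the removal rule, reflection factor, and test function are illustrative assumptions, not the strategy evaluated in the paper.

```python
# Sketch of Box's Complex method with an illustrative point-removal schedule:
# every `remove_every` iterations one point is dropped, never below n + 1 points.
import numpy as np

def complex_method(f, lower, upper, n_points=12, iters=200, remove_every=40,
                   alpha=1.3, seed=0):
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    n = len(lower)
    pts = lower + rng.random((n_points, n)) * (upper - lower)
    vals = np.array([f(p) for p in pts])
    for it in range(1, iters + 1):
        worst = np.argmax(vals)
        centroid = np.mean(np.delete(pts, worst, axis=0), axis=0)
        trial = np.clip(centroid + alpha * (centroid - pts[worst]), lower, upper)
        # If the reflected point is still the worst, pull it toward the centroid.
        while f(trial) >= vals[worst]:
            trial = 0.5 * (trial + centroid)
            if np.linalg.norm(trial - centroid) < 1e-12:
                break
        pts[worst], vals[worst] = trial, f(trial)
        # Adaptive part: gradually shrink the complex, keeping at least n + 1 points.
        if it % remove_every == 0 and len(pts) > n + 1:
            drop = np.argmax(vals)
            pts, vals = np.delete(pts, drop, axis=0), np.delete(vals, drop)
    best = np.argmin(vals)
    return pts[best], vals[best]

if __name__ == "__main__":
    rosenbrock = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    x_best, f_best = complex_method(rosenbrock, [-2, -2], [2, 2])
    print("best point:", np.round(x_best, 3), "value:", round(float(f_best), 5))
```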


Author(s): Yuan Mao Huang, Kuo Juei Wang

In this study, a bicycle frame is optimized for minimum weight using genetic algorithms. The constraints require that the stresses in the five rods of the frame remain below the material yield strength, with consideration of a factor of safety. A two-dimensional model of the frame is created, equilibrium equations are derived, and the loads acting on the rods are determined. A known function is used to verify the feasibility of the generated program. The effects of the mutation rate, the crossover rate, and the number of generations on the mean and the standard deviation of the fitness value are studied. The recommended optimal solution has outer and inner diameters of 0.040 m and 0.038 m, respectively, for the front frame rods, outer and inner diameters of 0.024 m and 0.021 m, respectively, for the rear frame rods, and a frame weight of 0.896 kg.
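In the same spirit, the simplified GA sketch below sizes a single tube group (outer and inner diameter) for minimum weight under an axial stress limit with a factor of safety. The loads, lengths, material data, and GA settings are assumed for illustration and do not correspond to the five-rod frame model of the paper.

```python
# Simplified GA sketch: size one tube (outer and inner diameter) for minimum weight
# subject to an axial stress limit with a safety factor. All data are assumed.
import math
import random

RHO, LENGTH, FORCE = 7850.0, 0.55, 2500.0       # kg/m^3, m, N (assumed values)
YIELD, SAFETY = 250e6, 2.0                      # Pa, factor of safety
ALLOWABLE = YIELD / SAFETY
rng = random.Random(3)

def weight(do, di):
    area = math.pi / 4 * (do ** 2 - di ** 2)
    return RHO * area * LENGTH

def fitness(ind):
    do, di = ind
    if di >= do:                                # geometrically invalid
        return 1e6
    stress = FORCE / (math.pi / 4 * (do ** 2 - di ** 2))
    penalty = 1e3 * max(0.0, stress - ALLOWABLE) / ALLOWABLE
    return weight(do, di) + penalty             # minimize weight plus constraint penalty

def random_individual():
    do = rng.uniform(0.01, 0.05)
    return [do, rng.uniform(0.005, do)]

def run_ga(pop_size=40, generations=100, mutation=0.2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]           # blend crossover
            if rng.random() < mutation:
                child[rng.randrange(2)] *= rng.uniform(0.9, 1.1)  # small perturbation
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return best, weight(*best)

if __name__ == "__main__":
    (do, di), w = run_ga()
    print(f"outer={do*1000:.1f} mm, inner={di*1000:.1f} mm, weight={w:.3f} kg")
```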


Author(s): R. J. Yang, G. Li, Y. Fu

This research addresses the development of validation metrics for vehicle frontal impact simulation. Model validation metrics provide a quantified measurement of the difference between a CAE simulation and a physical test. They are useful for developing an objective model evaluation procedure, with the eventual goal of zero or near-zero prototyping. In this research, full frontal crash pulses are chosen as the key items to be compared in the vehicle frontal impact simulation. Both physics- and mathematics-based metrics are investigated. The physics-based metrics include a method using a simplified step-function representation, while the mathematics-based metrics include wavelet decomposition, corridor violation plus area, and the metrics used in the commercial code ADVISER. They are all correlated to subject matter experts’ ratings through optimal weightings. A new metric, considering variability from both experts and metrics for the frontal crash pulse, is proposed. One example is used to demonstrate its application.
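As an illustration of one of the mathematics-based metrics, the sketch below combines a corridor-violation fraction with a normalized area difference between a test pulse and a CAE pulse. The corridor width, weights, and synthetic pulses are assumptions; the paper calibrates its weightings against expert ratings.

```python
# Sketch of a corridor-violation-plus-area comparison between a physical-test crash
# pulse and a CAE pulse. Corridor width, weights, and the pulses are illustrative.
import numpy as np

def pulse_metric(t, test, cae, corridor=2.0, w_violation=0.5, w_area=0.5):
    diff = np.abs(cae - test)
    violation = (diff > corridor).mean()           # fraction of samples outside the corridor
    area = np.sum(0.5 * (diff[1:] + diff[:-1]) * np.diff(t))                 # area between curves
    area_ref = np.sum(0.5 * (np.abs(test[1:]) + np.abs(test[:-1])) * np.diff(t))
    return w_violation * violation + w_area * area / area_ref                # lower is better

if __name__ == "__main__":
    t = np.linspace(0.0, 0.1, 501)                 # 100 ms pulse
    test = -30.0 * np.sin(np.pi * t / 0.1)         # synthetic test deceleration (g)
    cae = -28.0 * np.sin(np.pi * (t - 0.003) / 0.1)  # slightly scaled and shifted CAE pulse
    print(f"combined discrepancy score: {pulse_metric(t, test, cae):.3f}")
```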

