Approach for the Automated Analysis of Geometrical Clinch Joint Characteristics

2021
Vol 883
pp. 105-110
Author(s):
Christoph Zirngibl
Benjamin Schleich

Due to their cost-efficiency and environmental friendliness, the demand for mechanical joining processes is constantly rising. However, the dimensioning and design of joints and of suitable processes are still based mainly on expert knowledge and limited experimental data. Performing numerical and experimental studies therefore enables the generation of optimized joining geometries. However, manually evaluating the results of such studies is often highly time-consuming. As a novel solution, image segmentation and machine learning algorithms provide methods to automate the analysis process. Motivated by this, the paper presents an approach for the automated analysis of geometrical characteristics, using clinching as an example.
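
A minimal sketch of what such an automated image-based evaluation could look like, assuming a grayscale cross-section image in which the sheet material is brighter than the background; the file name, the Otsu thresholding step, and the bounding-box measure are illustrative assumptions, not the authors' implementation:

# Sketch: segment a clinch joint cross-section and derive a simple geometric characteristic.
import numpy as np
from skimage import io, filters, measure

def measure_joint(path):
    image = io.imread(path, as_gray=True)        # cross-section micrograph or FE result plot
    threshold = filters.threshold_otsu(image)    # automatic global threshold
    mask = image > threshold                     # binary mask of the sheet material
    labels = measure.label(mask)                 # connected regions (e.g., upper/lower sheet)
    regions = measure.regionprops(labels)
    largest = max(regions, key=lambda r: r.area)
    min_row, min_col, max_row, max_col = largest.bbox
    # Illustrative outputs; real clinch characteristics would be, e.g., neck thickness and interlock.
    return {"region_area_px": largest.area, "bbox_height_px": max_row - min_row}

# Example call (hypothetical file name):
# print(measure_joint("clinch_cross_section.png"))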

2021
pp. 002224292199708
Author(s):
Raji Srinivasan
Gülen Sarial-Abi

Algorithms, which brands increasingly use, sometimes fail to perform as expected or, even worse, cause harm, resulting in brand harm crises. Unfortunately, such algorithm failures are increasing in frequency, yet we know little about how consumers respond to brands after such crises. Extending developments in the theory of mind perception, we hypothesize that following a brand harm crisis caused by an algorithm error (vs. a human error), consumers respond less negatively to the brand. We further hypothesize that this relationship is mediated by consumers' lower perception of the algorithm's (vs. human's) agency for the error, which in turn lowers their perception of the algorithm's responsibility for the harm caused. We also hypothesize four moderators: two algorithm characteristics (an anthropomorphized algorithm and a machine learning algorithm) and two characteristics of the task in which the algorithm is deployed (subjective vs. objective and interactive vs. non-interactive). We find support for the hypotheses in eight experimental studies, including two incentive-compatible studies. We also examine the effects of two managerial interventions to manage the aftermath of brand harm crises caused by algorithm errors. The findings advance the literature on brand harm crises, algorithm usage, and algorithmic marketing, and generate managerial guidelines for addressing the aftermath of such crises.


Author(s):  
Sahana Apparsamy
Kamalanand Krishnamurthy

Soft tissues are non-homogeneous, deformable structures with varied structural arrangements, constituents, and composition. This chapter explains the design of a capacitance sensor array for analyzing and imaging non-homogeneity in biological materials. Tissue-mimicking phantoms are developed using Agar-Agar and polyacrylamide gels to test the developed sensor. The sensor employs an unsupervised learning algorithm for automated analysis of non-homogeneity, and the reconstructed capacitance image is also sensitive to topographical and morphological variations in the sample. The proposed method is further validated using a fiber-optic laser imaging system and the Jaccard index. The design of the sensor array for smart analysis of non-homogeneity is presented in detail, along with significant results.
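
A minimal sketch of the Jaccard-index validation step, assuming two binary masks of equal size (one from the reconstructed capacitance image, one from the laser-imaging reference); the random masks below are placeholders, not the chapter's data:

# Sketch: compare a segmented capacitance image against a reference segmentation via the Jaccard index.
import numpy as np

def jaccard_index(mask_a, mask_b):
    """Jaccard index |A intersect B| / |A union B| of two boolean masks of equal shape."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return intersection / union if union > 0 else 1.0

# Illustrative usage with random masks standing in for the capacitance
# reconstruction and the fiber-optic laser reference image.
rng = np.random.default_rng(0)
capacitance_mask = rng.random((64, 64)) > 0.5
reference_mask = rng.random((64, 64)) > 0.5
print(f"Jaccard index: {jaccard_index(capacitance_mask, reference_mask):.3f}")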


Author(s):
H. Willems
K. Reber
M. Zöllner
M. Ziegenmeyer

Inline inspection of pipelines by means of intelligent pigs usually produces large amounts of data that are analyzed offline by human experts. To increase the reliability of the data analysis process and to speed up analysis, methods of artificial intelligence such as neural networks have been applied in the past with varying success. The basic requirement for any technique used in practice is that no relevant features may be overlooked while the false-call rate is kept as low as possible. For the task of automatically analyzing in-line inspection data from ultrasonic metal-loss inspections, we have developed a two-stage approach. In the first step (called boxing), any defect candidates exceeding the specified size limits are recognized and described by a surrounding box. In the second step, all boxes from step 1 are analyzed, yielding essentially a relevant/non-relevant decision. Each feature considered relevant is then classified according to a given set of feature classes. To perform step 2 efficiently, we have adapted the support vector machine (SVM) algorithm, which offers some important advantages over, for example, neural networks. We describe the approach and present examples obtained from in-line inspection data.
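
A minimal sketch of the relevant/non-relevant decision in step 2, assuming each box is summarized by a small feature vector (the feature names and values below are invented for illustration); it uses scikit-learn's standard SVM rather than the authors' adapted variant:

# Sketch: classify candidate boxes from the boxing step as relevant / non-relevant with an SVM.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Each row: illustrative box descriptors [length_mm, width_mm, depth_pct, echo_loss_db]
X_train = np.array([
    [12.0, 8.0, 35.0, 6.0],
    [40.0, 25.0, 60.0, 12.0],
    [5.0, 4.0, 8.0, 1.0],
    [6.0, 5.0, 10.0, 2.0],
])
y_train = np.array([1, 1, 0, 0])   # 1 = relevant feature, 0 = non-relevant

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

new_box = np.array([[15.0, 10.0, 40.0, 7.0]])
print("relevant" if clf.predict(new_box)[0] == 1 else "non-relevant")

A second classifier of the same form could then assign each relevant box to one of the given feature classes.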


2021
Vol 263
pp. 04063
Author(s):
Nadezhda Sevryugina
Pavel Kapyrin

A multidisciplinary approach to assessing the resource of individual machine components is proposed, combining database information with simulation techniques and functional strain-gauge (tensometric) measurements. Simulations determine the reference points for the tensometric sensors. A diagnostic model built on basic concepts of information theory has enabled the development of a synergistic model for recognizing the displacement of areas of uncertainty, which ensures identification of the defect (risk of failure). The formation of an electronic database of parametric load data, as a diagnostic indicator of changes in the accuracy of mating in machine systems, is justified. Experimental studies were conducted on a model of a quick coupler. Hierarchical structuring of the machine down to the level of mating parts, with digital monitoring of the criticality of external and internal loads, ensures reliability control throughout the entire service life of the machine. When machines are decommissioned, these data provide information about the residual resource of their elements, supporting decisions on reuse or the feasibility of restoration. This, in turn, ensures the environmental friendliness and economy of the process.


2020
Author(s):
Yutao Lu
Juan Wang
Miao Liu
Kaixuan Zhang
Guan Gui
...

The ever-increasing amount of data in cellular networks poses challenges for network operators in monitoring the quality of experience (QoE). Traditional hard-decision methods based on key quality indicators (KQIs) struggle with QoE anomaly detection at big-data scale. To solve this problem, this paper proposes a KQI-based QoE anomaly detection framework using a semi-supervised machine learning algorithm, the iterative positive sample aided one-class support vector machine (IPS-OCSVM). The proposed method consists of four steps, the key one being the combination of machine learning with the network operator's expert knowledge through the OCSVM. The proposed IPS-OCSVM framework realizes QoE anomaly detection through soft decisions and can easily fine-tune its anomaly detection ability on demand. Moreover, we prove that fluctuations of the expert-knowledge-based KQI thresholds have only a limited impact on the anomaly detection result. Finally, experimental results confirm the proposed IPS-OCSVM framework for QoE anomaly detection in cellular networks.
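
A minimal sketch of the one-class SVM core of such a framework; the KQI features, their distributions, and the threshold are assumptions, and the iterative positive-sample-aided refinement described in the paper is not reproduced here:

# Sketch: soft-decision QoE anomaly detection with a one-class SVM on assumed KQI features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)
# Rows: samples of assumed KQIs, e.g. [page_load_s, throughput_mbps, drop_rate_pct]
normal_kqis = np.column_stack([
    rng.normal(1.5, 0.2, 500),    # page load time
    rng.normal(20.0, 3.0, 500),   # throughput
    rng.normal(0.5, 0.1, 500),    # drop rate
])

scaler = StandardScaler().fit(normal_kqis)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
ocsvm.fit(scaler.transform(normal_kqis))

# Soft decision: decision_function yields a continuous score that can be
# re-thresholded on demand to tune the detection ability.
suspect = np.array([[4.0, 5.0, 3.0]])     # a clearly degraded sample
score = ocsvm.decision_function(scaler.transform(suspect))
print("anomaly" if score[0] < 0 else "normal", score[0])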


2011
Vol 374-377
pp. 339-345
Author(s):
Jun Mu
Tie Gang Zhou

After the 2008 earthquake in Sichuan, China, rising prices of conventional materials, poor transportation and resource conditions, and low levels of economic and technological development were the main challenges facing villagers rebuilding their homes. Against this background, a comprehensive demonstration village rebuilding project was launched. Based on locally available natural resources and materials recycled from collapsed houses, the local traditional earth-based building technology was substantially improved in anti-seismic performance through experimental studies. Through prototype-based training, the villagers quickly mastered these upgraded techniques and rebuilt their homes themselves with minimal manpower and material cost. Their new earth houses perform far better in cost-efficiency, sustainability, and comfort than local conventionally rebuilt dwellings. The project demonstrated and conveyed an affordable, ecological, healthy, and humane way of post-quake village rebuilding that local villagers can adopt, own, and pass on.


Author(s):  
Qingsong Xu

The extreme learning machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks. In theory, the algorithm provides good generalization capability at extremely fast learning speed. Comparative studies on benchmark function approximation problems have shown that ELM can learn thousands of times faster than conventional neural networks (NN) and produces good generalization performance in most cases. Unfortunately, research on damage localization using ELM remains limited in the literature. In this chapter, ELM is extended to the domain of damage localization in plate structures. Its effectiveness compared with typical neural networks such as the back-propagation neural network (BPNN) and the least squares support vector machine (LSSVM) is illustrated through experimental studies. Comparative investigations in terms of learning time and localization accuracy are carried out in detail. It is shown that ELM opens a new avenue in the domain of plate structural health monitoring. Both advantages and disadvantages of using ELM are discussed.
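
A minimal sketch of the standard ELM training procedure (not the chapter's damage-localization code): hidden-layer weights and biases are drawn at random and never trained, and only the output weights are solved in closed form via the Moore-Penrose pseudoinverse, which is what makes training so fast:

# Sketch: extreme learning machine for a single-hidden-layer feedforward network.
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=50):
    """Fit a single-hidden-layer feedforward net the ELM way."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # hidden-layer output (sigmoid)
    beta = np.linalg.pinv(H) @ Y                  # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: approximate y = sin(x) on [0, 2*pi]
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
Y = np.sin(X)
W, b, beta = elm_train(X, Y, n_hidden=30)
print("max abs error:", np.max(np.abs(elm_predict(X, W, b, beta) - Y)))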


Robotica
2019
Vol 38 (9)
pp. 1558-1575
Author(s):
Vahid Azimirad
Mohammad Fattahi Sani

In this paper, the behavioral learning of robots through spiking neural networks is studied, in which the architecture of the network is based on the thalamo-cortico-thalamic circuitry of the mammalian brain. To represent a variety of neuronal behaviors, the Izhikevich single-neuron model is used. One thousand and ninety spiking neurons are considered in the network. The spiking model of the proposed architecture is derived and prepared for the robot learning problem. The reinforcement learning algorithm is based on spike-timing-dependent plasticity with dopamine release as the reward, which strengthens the synaptic weights of the neurons involved in the robot's proper performance. Sensory and motor neurons are placed in the thalamus and the cortical module, respectively. The inputs of the thalamo-cortico-thalamic circuitry are signals related to the distance of the target from the robot, and the outputs are the actuator velocities. A target-attraction task, in which dopamine is released when the robot catches the target, is used to validate the proposed method. Simulation studies as well as an experimental implementation are carried out on a mobile robot named Tabrizbot. The experimental studies show that after successful learning, the mean time to catch the target decreases by about 36%. These results demonstrate that, through the proposed method, the thalamo-cortical structure can be trained successfully to perform various robotic tasks.
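
A minimal sketch of the Izhikevich single-neuron model underlying such a network, using the canonical regular-spiking parameters (a=0.02, b=0.2, c=-65, d=8) and an assumed constant input current; the paper's thalamo-cortico-thalamic wiring and dopamine-modulated STDP learning rule are not reproduced here:

# Sketch: simulate one Izhikevich neuron for 1 second of model time.
import numpy as np

a, b, c, d = 0.02, 0.2, -65.0, 8.0
dt = 0.5                      # time step in ms
T = int(1000 / dt)            # number of steps for 1 s
v, u = -65.0, b * -65.0       # membrane potential and recovery variable
I = 10.0                      # constant input current (assumed)

spike_times = []
for step in range(T):
    if v >= 30.0:                         # spike: record, then reset v and bump u
        spike_times.append(step * dt)
        v, u = c, u + d
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * (a * (b * v - u))

print(f"{len(spike_times)} spikes in 1 s of simulated time")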

