Automatic coverage measurement of shot peened panels

2021 ◽  
Author(s):  
Lubna Shahid

Shot peening is the process of treating metallic surfaces with a regulated blast of shot to increase material strength and durability. The coverage level of the shot indentations is an important parameter in assessing the quality of the treatment. Traditionally, coverage measurement is performed manually using a magnifying glass, which is inefficient. Although image segmentation techniques have been proposed for coverage measurement, the literature on this topic is not extensively developed. In this thesis, various relevant image segmentation techniques are investigated, including thresholding, edge detection, watershed segmentation, active contours, graph cuts and neural networks. The aim is to develop a generic coverage measurement algorithm that is accurate, robust to variations in illumination, shot type and coverage level, and capable of real-time operation with a simple experimental setup. The results obtained from each method are discussed and compared against a set of relevant performance criteria.
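As a concrete illustration of the thresholding family of methods investigated here, coverage can be estimated by binarising the panel image and taking the fraction of dark pixels. A minimal sketch, assuming an 8-bit grayscale image in which indentations appear dark; the image format and this assumption are illustrative, not taken from the thesis:

```python
def otsu_threshold(image):
    """Return the threshold maximising between-class variance (Otsu's method)."""
    hist = [0] * 256
    n = 0
    for row in image:
        for v in row:
            hist[v] += 1
            n += 1
    total_sum = sum(i * hist[i] for i in range(256))
    w_bg, sum_bg = 0, 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = n - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def coverage(image):
    """Fraction of pixels at or below the Otsu threshold (assumed indented)."""
    t = otsu_threshold(image)
    dark = sum(1 for row in image for v in row if v <= t)
    return dark / sum(len(row) for row in image)

# Synthetic 10x10 panel: 40 dark indentation pixels, 60 bright pixels.
panel = [[30] * 10 for _ in range(4)] + [[200] * 10 for _ in range(6)]
print(f"coverage: {coverage(panel):.0%}")  # coverage: 40%
```

A real pipeline would precede this with the illumination correction the thesis stresses, since a single global threshold is exactly what varying illumination defeats.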


Author(s):  
Megha Chhabra ◽  
Manoj Kumar Shukla ◽  
Kiran Kumar Ravulakollu

Latent fingerprints are unintentional finger-skin impressions left as ridge patterns at crime scenes. A major challenge in latent fingerprint forensics is the poor quality of the image lifted from the crime scene. Forensic investigators are in constant search of novel, effective technologies to capture and process low-quality images. The accuracy of the results depends upon the quality of the image captured at the outset, the metrics used to assess that quality, and the level of enhancement subsequently required. Low image quality from low-quality scanners, unstructured background noise, poor ridge quality and overlapping structured noise result in the detection of false minutiae and hence reduce the recognition rate. Traditionally, image segmentation and enhancement are partially done manually with the help of highly skilled experts. Using automated systems for this work, images of widely varying quality can be investigated faster. This survey provides a comparative study of the various segmentation techniques available for latent fingerprint forensics.


2013 ◽  
Vol 860-863 ◽  
pp. 2783-2786
Author(s):  
Yu Bing Dong ◽  
Hai Yan Wang ◽  
Ming Jing Li

Edge detection and thresholding segmentation algorithms are presented and tested on a variety of grayscale images from different fields. To analyze and evaluate the quality of image segmentation, the Root Mean Square Error (RMSE) is used: the smaller the error value, the better the segmentation. The experimental results show that no single segmentation method is suitable for all images.
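The RMSE criterion can be sketched directly: for two images of equal size it is the square root of the mean squared pixel-wise difference, and a lower value indicates a segmentation closer to its reference. A minimal sketch (the test images are illustrative, not from the paper):

```python
import math

def rmse(segmented, reference):
    """Root Mean Square Error between a segmentation and its ground truth.
    Lower values indicate a closer match to the reference."""
    flat_s = [v for row in segmented for v in row]
    flat_r = [v for row in reference for v in row]
    assert len(flat_s) == len(flat_r), "images must have the same size"
    return math.sqrt(sum((s - r) ** 2 for s, r in zip(flat_s, flat_r)) / len(flat_s))

reference = [[0, 0, 1], [0, 1, 1], [1, 1, 1]]
good      = [[0, 0, 1], [0, 1, 1], [1, 1, 0]]   # one mislabelled pixel
bad       = [[1, 1, 0], [1, 0, 0], [0, 0, 0]]   # fully inverted
print(rmse(good, reference) < rmse(bad, reference))  # True
```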


Author(s):  
Steven Wilcox ◽  
Richard Wilkins ◽  
Martin Lyons

Many organisations are currently dealing with long-standing legacy issues in clean-up, decommissioning and demolition projects. Industry is required to ensure that all bulk articles, substances and waste arisings are adequately characterised and assigned to the correct disposal routes in compliance with UK legislation and best practice. It is essential that the data used to support waste sentencing are of the correct type, quality and quantity, and that they are appropriately assessed in order to support defensible, confident decisions that account for inherent uncertainties. AMEC has adopted the Data Quality Objectives (DQO) based methodology and the software package Visual Sample Plan (VSP) to provide a better, faster and more cost-effective approach to meeting regulatory and client requirements, whilst minimising the time spent gathering data and assessing the information. The DQO methodology is based on a scientific approach that requires clear objectives to be established from the outset of a project and a demonstration of the acceptability of the results. Through systematic planning, the team develops acceptance or performance criteria for the quality of the data collected and for the confidence in the final decision. The systematic planning process promotes communication between all departments and individuals involved in the decision-making process; the planning phase thus gives an open and unambiguous method to support the decisions and enables the decision-makers (technical authorities on the materials of concern) to document all assumptions. The DQO process allows better planning, control and understanding of all the issues. All types of waste can be sentenced under one controllable system, providing a more defensible position. This paper will explain that the DQO process consists of seven main steps that lead to a detailed Sampling and Analysis Plan (SAP).
The process gives transparency to any assumptions made about the site or material being characterised and identifies individuals involved. The associated calculation effort is reduced using the statistically based sampling models produced with VSP. The first part of this paper explains the DQO based methodology and Visual Sample Plan and the second part shows how the DQO process has been applied in practice.
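As one illustration of the statistically based sampling models that VSP automates, the classic normal-theory formula estimates how many samples are needed to pin down a mean to a given precision. A hedged sketch: the formula and constants below are textbook statistics, not reproduced from the paper, and VSP offers many richer designs:

```python
import math

def sample_size(sigma, delta, z=1.96):
    """Minimum number of samples to estimate a mean to within +/- delta at
    the confidence level implied by z (1.96 ~ 95%), given an a priori
    standard deviation sigma: the classic normal-theory bound
    n >= (z * sigma / delta)^2."""
    return math.ceil((z * sigma / delta) ** 2)

# e.g. activity measurements with sigma = 2 Bq/g and a target
# precision of +/- 0.5 Bq/g on the estimated mean
print(sample_size(2.0, 0.5))  # 62
```

Tightening the precision or raising the confidence level grows the sample count quadratically, which is exactly the trade-off the DQO planning steps force the team to confront explicitly.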


1990 ◽  
Vol 16 (2) ◽  
Author(s):  
H. A. Labuschagne ◽  
M. L. Watkins

Identification of criteria for academic research performance. At South African universities, the achievement of objectives is usually measured in terms of so-called "process criteria" (e.g. pass rates) instead of performance criteria that reflect the quality of academic personnel. Stimulated by the need to identify valid indices of research performance, as a component of academic performance, this study investigated the dimensionality of several criteria identified from empirical and literature studies. It was found that the various valid criteria could be represented by six constructs, viz.: the stature of the researcher as scientist; scientific contributions; enhancement of one's own profession; community development; participation in research projects; and the giving of advice to persons or institutions outside the university.


2021 ◽  
Vol 05 (01) ◽  
pp. 04-10
Author(s):  
Sabir Babaev ◽  
Ibrahim Habibov ◽  
Zohra Abiyeva

Prospects for the further development of the oil and gas industry are mainly associated with the development and commissioning of high-rate fields. In this regard, the production of more economical and durable equipment by machine-building enterprises, an increase in its reliability and competitiveness, and the further improvement of technological production processes are of paramount importance. The evolution of technology, in a broad sense, is a representation of changes in designs and manufacturing technology, their direction and their patterns. Here, a given state of any class of technical systems (TS) is considered the result of long-term changes in its previous state: a transition from TS already existing and applied in practice to new models that differ from previous designs. These transitions are, as a rule, associated with the improvement of some performance criteria or quality indicators of the TS and are progressive in nature. The work is devoted to the study of the evolution of the quality of high-pressure valves during the period of their intensive development. Keywords: technical system, evolution of technology, high-pressure valves, shut-off devices, gate.


2018 ◽  
Vol 7 (4.33) ◽  
pp. 41
Author(s):  
Abdul K Jumaat ◽  
Ke Chen

Selective image segmentation models aim to separate a specific object from its surroundings. To solve such a model, the common practice for dealing with its non-differentiable term is to approximate the original functional. While this approach yields successful segmentation results, the segmentation process can be slow. In this paper, we show how to solve the model without approximation using Chambolle's projection algorithm. Numerical tests show that good visual segmentation quality is obtained in a fast computational time.
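Chambolle's projection algorithm is most easily seen on the total-variation (ROF) problem, whose non-differentiable TV term is of the same kind as the one in selective segmentation functionals. A generic NumPy sketch of the 2004 dual projection iteration, with illustrative parameters; it is not the paper's exact model:

```python
import numpy as np

def _grad(u):
    """Forward-difference gradient with Neumann (zero) boundary."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def _div(px, py):
    """Discrete divergence, the negative adjoint of _grad."""
    dx = np.zeros_like(px)
    dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]
    dx[:, 1:] = px[:, 1:] - px[:, :-1]
    dy[0, :] = py[0, :]
    dy[1:, :] = py[1:, :] - py[:-1, :]
    return dx + dy

def chambolle_tv(f, lam=0.25, tau=0.125, n_iter=60):
    """Chambolle's (2004) projection algorithm for the dual of the ROF
    problem min_u TV(u) + 1/(2*lam) * ||u - f||^2.
    tau <= 1/8 guarantees convergence of the dual iteration."""
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = _grad(_div(px, py) - f / lam)
        norm = np.sqrt(gx ** 2 + gy ** 2)
        px = (px + tau * gx) / (1.0 + tau * norm)
        py = (py + tau * gy) / (1.0 + tau * norm)
    return f - lam * _div(px, py)

rng = np.random.default_rng(0)
noisy = 0.5 + 0.2 * rng.standard_normal((32, 32))
smooth = chambolle_tv(noisy)
print(smooth.std() < noisy.std())  # True
```

The key point made in the abstract is visible in the update: the pointwise projection replaces any smoothed approximation of the TV term, so no regularisation parameter for the approximation needs tuning.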


1993 ◽  
Vol 30 (1) ◽  
pp. 51-64
Author(s):  
Ray Thomas ◽  
Fariborz Zahedi

Hybrid image segmentation within a computer vision hierarchy. A generic model of a computer vision system is presented which highlights the critical role of image segmentation. A hybrid segmentation approach, utilising both edge-based and region-based techniques, is proposed for improved segmentation quality. An image segmentation architecture is outlined, and test results are presented and discussed.
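The edge/region combination can be illustrated with a toy NumPy sketch: a region-based pass proposes labels, and an edge-based pass vetoes pixels lying on strong gradients, so no region straddles an edge. This is a generic illustration of the hybrid idea, not the architecture proposed in the paper:

```python
import numpy as np

def hybrid_segment(img, edge_frac=0.5):
    """Toy hybrid segmentation: a global-mean threshold (region-based)
    proposes a foreground mask, and gradient magnitude (edge-based)
    carves out a boundary band that the mask may not cross."""
    region = img > img.mean()                    # region-based proposal
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = np.abs(np.diff(img, axis=1))    # forward differences
    gy[:-1, :] = np.abs(np.diff(img, axis=0))
    edges = np.hypot(gx, gy)
    boundary = edges > edge_frac * edges.max()   # edge-based boundary band
    return region & ~boundary                    # region interiors only

# Synthetic image: bright 6x6 square on a dark background
img = np.zeros((12, 12))
img[3:9, 3:9] = 1.0
mask = hybrid_segment(img)
print(mask[5, 5], mask[0, 0])  # True False
```

In a full hierarchy the vetoed boundary band would then be resolved by a higher-level stage, which is where the hybrid approach earns its improved segmentation quality.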


2021 ◽  
Vol 11 (12) ◽  
pp. 3174-3180
Author(s):  
Guanghui Wang ◽  
Lihong Ma

At present, heart disease not only has a significant impact on quality of life but also poses a serious threat to people's health. It is therefore very important to diagnose heart disease as early as possible and give the corresponding treatment. Heart image segmentation is the first operation of intelligent heart disease diagnosis, and the quality of segmentation directly determines the effect of the diagnosis. Because image segmentation often has a long running time, and because of the characteristics of cardiac MR imaging and the structure of the cardiac target itself, fast segmentation of cardiac MRI images remains challenging. Aiming at the long running time and low segmentation accuracy of traditional methods, a medical image segmentation (MIS) method based on a support vector machine (SVM) optimized by particle swarm optimization (PSO) is proposed, referred to as PSO-SVM. First, the current iteration number and the population size in PSO are added to the control strategy of the inertia weight λ to improve its performance. The optimal penalty coefficient C and Gaussian kernel parameter γ are then found by PSO, and the SVM method is used to establish the best classification model and test the data. Compared with traditional methods, this method not only shortens the running time but also improves the segmentation accuracy. Compared with traditional inertia weights, the improved method also reduces the average number of iterations to convergence and shortens the optimization time.


Author(s):  
Georgios I. Tsiropoulos ◽  
Dimitrios G. Stratogiannis ◽  
John D. Kanellopoulos ◽  
Panayotis G. Cottis

Admission control is one of the key elements for ensuring quality of service (QoS) in modern mobile wireless networks. Since such networks are resource constrained, supporting multimedia traffic while guaranteeing its QoS levels is especially challenging for call admission control (CAC) design. CAC is the most important radio resource management (RRM) function in wireless networks, as its efficiency has a direct impact on network performance and on the QoS provided to end users. The goal of this chapter is to provide a thorough study of the basic concepts of CAC design and a comprehensive analysis of the fundamental CAC schemes employed in wireless networks. The basic performance criterion for CAC schemes, studied extensively in this chapter, is the probability of denying network access to an arriving call. Additional performance criteria are also presented and discussed, which may help to provide an overall efficiency estimate of the available CAC schemes.
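For the classic single-service M/M/c/c model, the call-denial (blocking) probability that the chapter treats as the basic criterion is given by the Erlang B formula, conveniently computed with its standard recurrence. This is the textbook baseline, not one of the chapter's specific CAC schemes:

```python
def erlang_b(traffic, channels):
    """Erlang B blocking probability: the probability that an arriving call
    is denied because all channels are busy, for offered traffic in erlangs.
    Uses the numerically stable recurrence
    B(0) = 1,  B(k) = a*B(k-1) / (k + a*B(k-1)),
    which avoids the factorials of the closed-form expression."""
    b = 1.0
    for k in range(1, channels + 1):
        b = traffic * b / (k + traffic * b)
    return b

# 2 erlangs of offered traffic on a 4-channel cell
print(round(erlang_b(2.0, 4), 4))  # 0.0952
```

Guard-channel and other prioritised CAC schemes modify this baseline by reserving capacity for handoff calls, trading a higher new-call blocking probability for a lower handoff dropping probability.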

