Development of an Automatic Image Cropping and Feature Extraction System for Real-Time Warping Monitoring in 3D Printing

Author(s):  
Jiarui Xie

Fused Filament Fabrication (FFF) is an additive manufacturing technology that can produce complicated structures in a simple-to-use and cost-effective manner. Although promising, the technology is prone to defects, e.g., warping, which compromise the quality of the manufactured component. To avoid the adverse effects caused by warping, this thesis uses deep-learning algorithms to develop a warping detection system based on Convolutional Neural Networks (CNNs). To create such a system, a real-time data acquisition and analysis pipeline is laid out. The system captures a snapshot of the print layer-by-layer and simultaneously extracts the corners of the component. Each extracted region of interest is then passed through a CNN that outputs the probability of the corner being warped. If a warp is detected, a signal is sent to pause the print, thereby creating a closed-loop monitoring system. The underlying model is tested in a real-time manufacturing environment, yielding a mean accuracy of 99.21%.
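The layer-wise monitoring loop described above can be sketched as follows. The corner coordinates, the decision threshold, the pixel heuristic standing in for the trained CNN, and the `pause_print` callback are all hypothetical placeholders, not the thesis's actual model or printer interface.

```python
WARP_THRESHOLD = 0.5  # assumed decision threshold, not the thesis's value

def crop_corners(image, corners, size=32):
    """Extract a size x size region of interest around each corner."""
    rois = []
    for (x, y) in corners:
        roi = [row[x:x + size] for row in image[y:y + size]]
        rois.append(roi)
    return rois

def cnn_warp_probability(roi):
    """Placeholder for the trained CNN; returns P(corner is warped)."""
    # Toy heuristic (fraction of bright pixels) stands in for inference.
    flat = [p for row in roi for p in row]
    return sum(1 for p in flat if p > 128) / len(flat)

def monitor_layer(image, corners, pause_print):
    """One pass of the closed loop: crop, classify, pause on detection."""
    for roi in crop_corners(image, corners):
        if cnn_warp_probability(roi) > WARP_THRESHOLD:
            pause_print()  # close the loop: halt the print job
            return True
    return False
```

In a real deployment the heuristic would be replaced by a forward pass through the trained network, with the camera supplying one frame per printed layer.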

Author(s):  
Deepak T. Mohan ◽  
Jeffrey Birt ◽  
Can Saygin ◽  
Jaganathan Sarangapani

Fastening operations are extensively used in the aerospace industry and account for more than a quarter of the total cost. Inspection of fasteners is another factor that adds cost and complexity to the overall process. Inspection is usually carried out on a sampling basis as a stand-alone process after fastening is completed. The lack of capability to inspect all fasteners in a cost-effective manner, and the need to remove non-value-added activities such as inspection in order to reduce manufacturing lead time, have been the motivation behind this study. This paper presents a novel diagnostics scheme based on the Mahalanobis-Taguchi System (MTS) for monitoring the quality of rotary-type fastening operations in real time. The approach encompasses (1) integrating a torque sensor, a pressure sensor, and an optical encoder on a hand-held rotary-type fastening tool; (2) obtaining process parameters via the embedded sensors and generating process signatures in real time; and (3) detecting anomalies on the tool using a wireless mote that communicates the decision to a base station. The anomalies investigated in this study are grip length variations (under grip versus normal grip) and the presence of re-used fasteners. The proposed scheme has been implemented on a prototype rotary tool for bolt-nut fasteners and tested under a variety of experimental settings. The experimental results show that the proposed approach is successful, with an accuracy of over 95% in detecting fastener grip lengths in real time during the process.
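At the core of the Mahalanobis-Taguchi System is the Mahalanobis distance of a new process signature from a reference group of "normal" signatures: large distances flag anomalies. A minimal two-feature sketch follows; the feature values and the threshold are illustrative, not the paper's calibration.

```python
def mean(xs):
    return sum(xs) / len(xs)

def mahalanobis_2d(sample, reference):
    """Squared Mahalanobis distance for 2-feature signatures,
    e.g. (torque, pressure) pairs from the instrumented tool."""
    xs = [r[0] for r in reference]
    ys = [r[1] for r in reference]
    mx, my = mean(xs), mean(ys)
    n = len(reference)
    # Sample covariance matrix entries of the reference group.
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in reference) / (n - 1)
    det = sxx * syy - sxy ** 2
    dx, dy = sample[0] - mx, sample[1] - my
    # d^2 = [dx dy] S^-1 [dx dy]^T, with S^-1 written out for the 2x2 case.
    return (syy * dx ** 2 - 2 * sxy * dx * dy + sxx * dy ** 2) / det

def is_anomalous(sample, reference, threshold=9.0):
    """Flag a signature whose distance exceeds an assumed threshold."""
    return mahalanobis_2d(sample, reference) > threshold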


Energies ◽  
2020 ◽  
Vol 14 (1) ◽  
pp. 89
Author(s):  
Khalid Haseeb ◽  
Naveed Islam ◽  
Yasir Javed ◽  
Usman Tariq

The Wireless Sensor Network (WSN) has seen rapid growth in the development of real-time applications due to its ease of management and cost-effective attributes. However, balancing network-lifetime optimization against load distribution between sensor nodes is a critical matter for the development of energy-efficient routing solutions. Recently, many solutions have been proposed for constraint-based networks using the cloud paradigm; however, they achieve network scalability at the additional cost of routing overheads and network latency. Moreover, the sensors' data is transmitted to application users over an uncertain medium, which can compromise data security and integrity. Therefore, this work proposes a lightweight, secure, and energy-efficient fog-based routing (SEFR) protocol to minimize data latency and improve energy management. It exploits Quality of Service (QoS) factors and facilitates time-sensitive applications at the network edge. Moreover, the proposed protocol protects real-time data with two levels of cryptographic security primitives. At the first level, a lightweight data confidentiality scheme is proposed between cluster heads and fog nodes; at the second level, a high-performance asymmetric encryption scheme is proposed between the fog and cloud layers. Simulation-based experiments demonstrate significant improvements over existing solutions in terms of routing, security, and network management.
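The two-level layering can be illustrated with toy stand-ins: a keystream XOR cipher for the lightweight cluster-head-to-fog layer and textbook RSA with tiny demo primes for the fog-to-cloud layer. Both primitives here are deliberately simplified teaching substitutes, not the ciphers SEFR actually specifies, and the tiny RSA modulus is insecure by construction.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a shared key (toy cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def level1_encrypt(key: bytes, data: bytes) -> bytes:
    """Cluster head -> fog node: lightweight symmetric layer."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

level1_decrypt = level1_encrypt  # XOR with the same keystream inverts it

# Fog -> cloud: textbook RSA with demo primes p=61, q=53 (n=3233).
N, E, D = 3233, 17, 413  # 17 * 413 = 1 (mod lcm(60, 52))

def level2_encrypt(m: int) -> int:
    return pow(m, E, N)

def level2_decrypt(c: int) -> int:
    return pow(c, D, N)
```

The design point the sketch captures is asymmetry of cost: the battery-powered cluster heads only ever run the cheap symmetric layer, while the expensive public-key operations are confined to mains-powered fog and cloud nodes.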


Author(s):  
Paul Oehlmann ◽  
Paul Osswald ◽  
Juan Camilo Blanco ◽  
Martin Friedrich ◽  
Dominik Rietzel ◽  
...  

Abstract With industries pushing towards digitalized production, adaptation to the expectations and increasing requirements of modern applications has brought additive manufacturing (AM) to the forefront of Industry 4.0. In fact, AM is a main accelerator for digital production, with possibilities in structural design such as topology optimization, production flexibility, customization, and product development, to name a few. Fused Filament Fabrication (FFF) is a widespread and practical tool for rapid prototyping that also demonstrates the importance of AM technologies through its accessibility to the general public via cost-effective desktop solutions. Increasing integration of systems in an intelligent production environment also enables the generation of large-scale data to be used for process monitoring and process control. Deep learning, a form of artificial intelligence (AI) and more specifically a method of machine learning (ML), is ideal for handling such big data. This study uses a trained artificial neural network (ANN) model as a digital shadow to predict the force within the nozzle of an FFF printer, using filament speed and nozzle temperature as input data. After the ANN model was tested using data from a theoretical model, it was implemented to predict the behavior using real-time printer data. For this purpose, an FFF printer was equipped with sensors that collect real-time printer data during the printing process. The ANN model reflected the kinematics of melting and flow predicted by currently available models for various printing speeds. The model allows for a deeper understanding of the influencing process parameters, which ultimately enables determination of the optimum combination of process speed and print quality.
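The digital-shadow mapping above (filament speed and nozzle temperature in, nozzle force out) amounts to a small feedforward network at inference time. The sketch below shows only that forward pass; the weights, normalization ranges, and output scale are arbitrary placeholders, since the real model's parameters come from training on printer data.

```python
import math

def tanh_layer(inputs, weights, biases):
    """One dense layer with tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def predict_force(speed_mm_s, temp_c):
    """Hypothetical ANN forward pass: (speed, temperature) -> force."""
    # Normalize inputs to roughly [-1, 1] (assumed operating ranges).
    x = [speed_mm_s / 100.0, (temp_c - 210.0) / 30.0]
    # Two hidden units with placeholder weights.
    hidden = tanh_layer(x, [[0.8, -0.5], [0.3, 0.9]], [0.1, -0.2])
    # Linear output layer: predicted nozzle force (placeholder scale).
    return 5.0 * hidden[0] + 3.0 * hidden[1] + 2.0
```

Because the forward pass is only a handful of multiply-adds, such a shadow model can comfortably run at the sensor sampling rate alongside the print.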


Author(s):  
Nicole Gailey ◽  
Noman Rasool

Canada and the United States have vast energy resources, supported by thousands of kilometers (miles) of pipeline infrastructure built and maintained each year. Whether the pipeline runs through remote territory or passes through local city centers, keeping commodities flowing safely is a critical part of day-to-day operation for any pipeline. Real-time leak detection systems have become critical systems that companies require to ensure safe operations, protect the environment, and comply with regulations. The function of a leak detection system is to identify and confirm a leak event in a timely and precise manner. Flow measurement devices are a critical input into many leak detection systems, and to ensure flow measurement accuracy, custody-transfer-grade liquid ultrasonic meters (as defined in API MPMS Chapter 5.8) can be utilized to provide superior accuracy, performance, and diagnostics. This paper presents a sample of real-time data collected from a field install base of over 245 custody-transfer-grade liquid ultrasonic meters currently being utilized in pipeline leak detection applications. The data helps identify upstream instrumentation anomalies and illustrates how diagnostics within the liquid ultrasonic meters can further improve current leak detection real-time transient models (RTTM) and pipeline operational procedures. The paper discusses considerations addressed while evaluating the data and the importance of accuracy in the metering equipment utilized. It also elaborates on the significant benefits of the ultrasonic meters' capabilities and the importance of diagnosing other pipeline issues and uncertainties beyond measurement errors.
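The link between meter accuracy and leak detection sensitivity can be made concrete with the simplest flow-balance style of check: compare window-averaged metered inflow and outflow and alarm when the imbalance exceeds a threshold tied to meter uncertainty. This is a stripped-down stand-in for an RTTM, and the uncertainty figure is illustrative, not from the paper.

```python
def leak_suspected(inflow_m3h, outflow_m3h, meter_uncertainty=0.001):
    """Flag a leak when the window-averaged line imbalance exceeds the
    combined relative uncertainty of the inlet and outlet meters."""
    avg_in = sum(inflow_m3h) / len(inflow_m3h)
    avg_out = sum(outflow_m3h) / len(outflow_m3h)
    # Tighter meter uncertainty directly lowers the detectable leak size,
    # which is why custody-transfer-grade meters matter here.
    threshold = meter_uncertainty * (avg_in + avg_out)
    return abs(avg_in - avg_out) > threshold
```

A full RTTM also models line pack, pressure, and temperature transients, but the same principle holds: the alarm threshold can be no tighter than the metering uncertainty allows.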


2009 ◽  
Vol 66 (9) ◽  
pp. 1915-1918 ◽  
Author(s):  
Yuki Minegishi ◽  
Tatsuki Yoshinaga ◽  
Jun Aoyama ◽  
Katsumi Tsukamoto

Abstract Minegishi, Y., Yoshinaga, T., Aoyama, J., and Tsukamoto, K. 2009. Species identification of Anguilla japonica by real-time PCR based on a sequence detection system: a practical application to eggs and larvae. – ICES Journal of Marine Science, 66: 1915–1918. To develop a practical method for identifying Japanese eel Anguilla japonica eggs and larvae to species by a sequence detection system using a real-time polymerase chain reaction (PCR), we examined (i) the sensitivity of the system using samples at various developmental stages, and (ii) influences of intra- and interspecific DNA sequence variations in the PCR target region. PCR amplifications with extracted DNA solution at 7.0 ng µl⁻¹ or lower were efficient at distinguishing A. japonica from other anguillids. A single egg at the gastrula or later developmental stages could also be identified. Two sequence variations in the PCR target region were observed in 2 out of 35 A. japonica collected from three localities, and from four year classes at a single locality. These mutations, however, did not affect the result of species identification achieved by A. japonica-specific PCR primers and probe. The accuracy of this PCR-based method of species identification will help in field surveys of the species.
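Why the two observed point mutations did not break identification can be illustrated in silico: a species-specific probe still "binds" if the number of mismatches against the target region stays under a small tolerance. The sequences and the one-mismatch tolerance below are hypothetical, not the assay's actual primer or probe design.

```python
def mismatches(probe: str, site: str) -> int:
    """Count base mismatches between a probe and an equal-length site."""
    return sum(1 for a, b in zip(probe, site) if a != b)

def probe_binds(probe: str, target: str, max_mismatches: int = 1) -> bool:
    """Slide the probe along the target sequence; report binding if any
    site is within the assumed mismatch tolerance."""
    return any(
        mismatches(probe, target[i:i + len(probe)]) <= max_mismatches
        for i in range(len(target) - len(probe) + 1)
    )
```

Real probe binding depends on thermodynamics, not a simple mismatch count, but the sketch captures the logic: a single intraspecific substitution leaves the probe within tolerance, while other anguillid sequences fall well outside it.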


2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
Woochul Kang ◽  
Jaeyong Chung

With the ubiquitous deployment of sensors and network connectivity, the amount of real-time data handled by embedded systems is increasing rapidly, and many embedded systems require database capability for the systematic management of real-time data. In such systems, supporting the timeliness of tasks accessing databases is an important problem. However, recent multicore-based embedded architectures pose a significant challenge for such data-intensive real-time tasks, since the response time of data accesses can be significantly affected by potential intercore interference. In this paper, we propose a novel feedback control scheme that supports the timeliness of data-intensive tasks against unpredictable intercore interference. In particular, we use a multiple-input/multiple-output (MIMO) control method that exploits multiple control knobs, such as CPU frequency and Quality-of-Data (QoD), to handle highly unpredictable workloads in multicore systems. Experimental results using an actual implementation show that the proposed approach achieves target Quality-of-Service (QoS) goals, such as task timeliness and QoD, while consuming less energy than baseline approaches.
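The two-knob feedback idea can be sketched as a controller that, each sampling period, adjusts CPU frequency and QoD from the measured deadline miss ratio. The proportional gains and the two independent update laws here are illustrative simplifications; the paper uses a proper coupled MIMO controller.

```python
class MimoController:
    """Toy two-knob controller tracking a deadline miss ratio target."""

    def __init__(self, target_miss_ratio=0.05):
        self.target = target_miss_ratio
        self.freq = 0.5   # normalized CPU frequency, clamped to [0.2, 1.0]
        self.qod = 1.0    # Quality-of-Data, clamped to [0.5, 1.0]

    def update(self, measured_miss_ratio):
        error = measured_miss_ratio - self.target
        # Positive error (too many misses): raise frequency, shed QoD.
        # Negative error (slack): lower frequency to save energy, restore QoD.
        self.freq = min(1.0, max(0.2, self.freq + 0.8 * error))
        self.qod = min(1.0, max(0.5, self.qod - 0.4 * error))
        return self.freq, self.qod
```

Trading QoD first under interference is what lets the controller hold timeliness without always paying the energy cost of the highest frequency.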


2020 ◽  
Vol 26 (4) ◽  
pp. 496-507
Author(s):  
Kheir Daouadi ◽  
Rim Rebaï ◽  
Ikram Amous

Nowadays, bot detection on Twitter attracts the attention of several researchers around the world, and different bot detection approaches have been proposed as a result of these research efforts. Four main challenges in this context are the diversity of content types propagated through Twitter, the problems inherent in tweet text, the lack of sufficient labeled datasets, and the fact that current bot detection approaches do not detect bot activities accurately enough. We propose Twitterbot+, a bot detection system that leverages a minimal number of language-independent features extracted from a single tweet, with temporal enrichment of previously labeled datasets. We conducted experiments on three benchmark datasets with standard evaluation scenarios, and the achieved results demonstrate the efficiency of Twitterbot+ against the state-of-the-art, yielding promising accuracy (>95%). Our proposition is suitable for accurate, real-time use as an initial filtering step in Twitter data collection to improve the quality of research data.
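The single-tweet, language-independent idea can be sketched as a feature extractor that never inspects the words themselves, only structural counts and account metadata. The specific features and field names below are plausible placeholders, not the paper's published feature set.

```python
def tweet_features(tweet: dict) -> dict:
    """Language-independent features from one tweet object (hypothetical
    schema: 'text', nested 'user' metrics, 'account_age_days')."""
    text = tweet.get("text", "")
    user = tweet.get("user", {})
    followers = user.get("followers_count", 0)
    friends = user.get("friends_count", 0)
    return {
        # Structural properties of the text, independent of its language.
        "text_length": len(text),
        "digit_ratio": sum(ch.isdigit() for ch in text) / max(len(text), 1),
        "url_count": text.count("http"),
        # Account-level signals available alongside any single tweet.
        "followers_to_friends": followers / max(friends, 1),
        "account_age_days": tweet.get("account_age_days", 0),
    }
```

Because no tokenization or language model is involved, such a filter can score tweets at collection time, which is what makes it usable as the real-time pre-filtering step the abstract proposes.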


2009 ◽  
Vol 26 (3) ◽  
pp. 556-569 ◽  
Author(s):  
Ananda Pascual ◽  
Christine Boone ◽  
Gilles Larnicol ◽  
Pierre-Yves Le Traon

Abstract The timeliness of satellite altimeter measurements has a significant effect on their value for operational oceanography. In this paper, an Observing System Experiment (OSE) approach is used to assess the quality of real-time altimeter products, a key issue for robust monitoring and forecasting of the ocean state. In addition, the effect of two improved geophysical corrections and the number of missions that are combined in the altimeter products are also analyzed. The improved tidal and atmospheric corrections have a significant effect in coastal areas (0–100 km from the shore), and a comparison with tide gauge observations shows a slightly better agreement with the gridded delayed-time sea level anomalies (SLAs) with two altimeters [Jason-1 and European Remote Sensing Satellite-2 (ERS-2)/Envisat] using the new geophysical corrections (mean square differences in percent of tide gauge variance of 35.3%) than those with four missions [Jason-1, ERS/Envisat, Ocean Topography Experiment (TOPEX)/Poseidon interlaced, and Geosat Follow-On] but using the old corrections (36.7%). In the deep ocean, however, the correction improvements have little influence. The performance of fast delivery products versus delayed-time data is compared using independent in situ data (tide gauge and drifter data). It clearly highlights the degradation of real-time SLA maps versus the delayed-time SLA maps: four altimeters are needed in real time to achieve quality similar to two altimeters in delayed time (sea level error misfit around 36%, and zonal and meridional velocity estimation errors of 27% and 33%, respectively). This study proves that the continuous improvement of geophysical corrections is very important, and that it is essential to stay above a minimum threshold of four available altimetric missions to capture the main space and time oceanic scales in fast delivery products.
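The comparison metric quoted above (mean square difference in percent of tide gauge variance) can be written down directly. The series below are synthetic stand-ins, not the altimetry or tide gauge data.

```python
def msd_percent_of_variance(altimeter_sla, tide_gauge_sla):
    """100 * mean((alt - gauge)^2) / var(gauge): the agreement metric
    behind the 35.3% vs 36.7% figures in the abstract."""
    n = len(tide_gauge_sla)
    mean_tg = sum(tide_gauge_sla) / n
    var_tg = sum((x - mean_tg) ** 2 for x in tide_gauge_sla) / n
    msd = sum((a - g) ** 2
              for a, g in zip(altimeter_sla, tide_gauge_sla)) / n
    return 100.0 * msd / var_tg
```

Normalizing by the gauge variance is what makes the number comparable across sites with very different sea level variability: 0% means perfect agreement, 100% means the misfit is as large as the signal itself.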


Author(s):  
Manjunath Ramachandra ◽  
Vikas Jain

Present-day Internet traffic largely caters to multimedia, opening up new and previously unthinkable applications such as tele-surgery. The complexity of data transactions increases with the demand for in-time and real-time data transfers, stressing the limited resources of the network beyond their capabilities. This requires prioritization of data transfers, controlled dumping of data over the network, and so on. To make matters worse, data from different origins combine, imparting long-lasting detrimental features such as self-similarity and long-range dependence into the traffic. Multimedia data, fortunately, contains redundancies that may be removed through efficient compression techniques, and there exists a provision to control the compression or bitrate based on the availability of resources in the network. The traffic controller, or shaper, has to optimize the quality of the transferred multimedia data depending upon the state of the network. In this chapter, a novel traffic shaper is introduced that takes these adverse properties of the network into account and counteracts them.
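One concrete form of the shaping-plus-bitrate-control loop described above is a token bucket that smooths bursts, paired with a simple rule that lowers the media encoding bitrate when the bucket runs dry. The rates, bucket size, and halving rule are illustrative choices, not the chapter's specific design.

```python
class AdaptiveShaper:
    """Toy token-bucket shaper with bitrate back-off on congestion."""

    def __init__(self, rate_kbps=1000, bucket_kb=200):
        self.rate = rate_kbps        # token refill rate (kb per second)
        self.capacity = bucket_kb    # burst allowance
        self.tokens = bucket_kb
        self.bitrate = rate_kbps     # current media encoding bitrate

    def tick(self, seconds=1.0):
        """Refill tokens for one control interval."""
        self.tokens = min(self.capacity, self.tokens + self.rate * seconds)

    def send(self, frame_kb):
        """Send a frame if tokens allow; otherwise drop it and ask the
        encoder for a lower (here: halved) bitrate."""
        if frame_kb <= self.tokens:
            self.tokens -= frame_kb
            return True
        self.bitrate = max(32, self.bitrate // 2)
        return False
```

The back-off on drops is what ties the shaper to the compression control mentioned in the text: rather than letting bursty, long-range-dependent traffic overflow the network, the source itself is throttled to match the available capacity.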

