Monitoring and characterization of vibration and shock conditions aboard high-performance marine craft

Author(s):  
Manudul Pahansen de Alwis ◽  
Karl Garme

The stochastic environmental conditions, together with craft design and operational characteristics, make it difficult to predict the vibration environments aboard high-performance marine craft, particularly the risk of impact acceleration events and the shock component of the exposure, which are often associated with structural failure and human injury. The different timescales and magnitudes involved complicate the real-time analysis of vibration and shock conditions aboard these craft. The article introduces a new measure, the severity index, indicating the risk of severe impact acceleration, and proposes a method for real-time feedback on the severity of impact exposure together with the accumulated vibration exposure. The method analyzes the immediately preceding 60 s of vibration exposure history and computes the severity of the impact exposure for the present state based on the severity index. The severity index probes the character of the present acceleration stochastic process, that is, the risk of an upcoming heavy impact, and serves as an alert to the crew. The accumulated vibration exposure, important for mapping and logging crew exposure, is determined by the ISO 2631-1:1997 vibration dose value. The severity of the impact and accumulated vibration exposure is communicated to the crew every second as a color-coded indicator, green, yellow or red, representing low, medium and high, based on defined impact and dose limits. The severity index and feedback method are developed and validated on a data set of 27 three-hour simulations of a planing craft in irregular waves and verified for feasibility in real-world applications with full-scale acceleration data recorded aboard high-speed planing craft in operation.
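The accumulated-exposure part of the feedback can be sketched briefly. The ISO 2631-1 vibration dose value is the fourth root of the time integral of the fourth power of the frequency-weighted acceleration; the sketch below omits the frequency weighting, and the green/yellow/red thresholds and function names are placeholders, not values from the article.

```python
import numpy as np

def vibration_dose_value(accel, fs):
    """ISO 2631-1 vibration dose value: fourth root of the time
    integral of the fourth power of the (frequency-weighted)
    acceleration. Frequency weighting is omitted for brevity."""
    dt = 1.0 / fs
    return float(np.sum(np.asarray(accel, float) ** 4) * dt) ** 0.25

def dose_indicator(vdv, yellow=8.5, red=17.0):
    """Map an accumulated VDV (m/s^1.75) to the article's
    green/yellow/red feedback; the limits here are placeholders."""
    if vdv < yellow:
        return "green"
    if vdv < red:
        return "yellow"
    return "red"
```

In use, `vibration_dose_value` would be fed the most recent 60 s window of measured acceleration and the indicator updated once per second.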

The rise of social media platforms like Twitter, and their increasing adoption by people who want to stay connected, provides a large source of data for analysis of trends, events and even individual personalities. Such analysis also gives insight into a person's likes and inclinations in real time, independent of the data size. Several techniques have been developed to retrieve such data; however, the most efficient is clustering. This paper provides an overview of the algorithms behind various clustering methods and examines their efficiency in identifying trending information. The clustered data may be further classified by topic for real-time analysis of a large, dynamic data set. In this paper, a data classification is performed and analyzed for flaws, followed by another classification on the same data set.
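The clustering idea can be illustrated with a minimal sketch: tweets are turned into bag-of-words vectors and grouped by k-means. This is not the paper's pipeline; the vectorizer, the deterministic farthest-point seeding, and the toy tweets are all assumptions for illustration (TF-IDF features and k-means++ would be typical in practice).

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal k-means with deterministic farthest-point seeding,
    a stand-in for the clustering methods surveyed in the paper."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers, float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def bag_of_words(texts):
    """Tiny bag-of-words vectorizer (toy stand-in for TF-IDF)."""
    vocab = sorted({w for t in texts for w in t.lower().split()})
    return np.array([[t.lower().split().count(w) for w in vocab]
                     for t in texts], float)

# four made-up tweets about two topics
tweets = ["election vote poll", "vote poll result",
          "football match goal", "goal match score"]
labels = kmeans(bag_of_words(tweets), k=2)
```

With well-separated vocabularies the two topics fall into two clusters, which is the property the surveyed methods exploit for trend detection at scale.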


2016 ◽  
Vol 110 (3) ◽  
pp. 463a
Author(s):  
Fuyu Kobirumaki-Shimozawa ◽  
Kotaro Oyama ◽  
Togo Shimozawa ◽  
Takashi Ohki ◽  
Takako Terui ◽  
...  

2019 ◽  
Vol 2019 ◽  
pp. 1-15 ◽  
Author(s):  
Jie Hong ◽  
Tianrang Li ◽  
Zhichao Liang ◽  
Dayi Zhang ◽  
Yanhong Ma

Aeroengines pursue high performance, and reducing blade-casing clearance has become one of the main ways to improve turbomachinery efficiency; however, rub-impact faults occur more frequently as the clearance decreases. A high-speed rotor-support-casing test rig was set up, and mechanism tests of light and heavy rub-impact were carried out. A finite element model of the test rig was established, and the calculated results agreed well with the experimental results under both rub-impact conditions. Based on an actual blade-casing structure model, the effects of the major physical parameters, including imbalance and material characteristics, were investigated. During rub-impact, the highest stress occurs first at the blade tip and is then transmitted to the blade root. Deformation at the impacting blade tip develops more readily as yield strength decreases, and stress concentration at the blade tip becomes more pronounced with weaker stiffness. The agreement between the computational results and the experimental data indicates that the method can be used to estimate rub-impact characteristics and is effective in the design and analysis process.


2020 ◽  
Author(s):  
Davide Scafidi ◽  
Daniele Spallarossa ◽  
Matteo Picozzi ◽  
Dino Bindi

<p>Understanding the dynamics of faulting is a crucial target in earthquake source physics (Yoo et al., 2010). To study earthquake dynamics it is necessary to look at source complexity from different perspectives; in this regard, useful information is provided by the seismic moment (M0), a static measure of earthquake size, and the radiated seismic energy (ER), which is connected to the rupture kinematics and dynamics (e.g. Bormann & Di Giacomo 2011a). Studying the spatial and temporal evolution of scaling relations between the scaled energy (i.e., e = ER/M0) and the static measure of source dimension (M0) can provide valuable indications for understanding earthquake generation processes and can single out precursors of stress concentration, foreshocks and the nucleation of large earthquakes (Picozzi et al., 2019). In the last ten years, seismology has undergone tremendous development. Evolution in data telemetry opened the new research field of real-time seismology (Kanamori 2005), whose targets are the rapid determination of earthquake location and size, the timely implementation of emergency plans and, under favourable conditions, earthquake early warning. On the other hand, the availability of denser, high-quality seismic networks deployed near faults has made it possible to observe very large numbers of micro-to-small earthquakes, which is pushing the seismological community to look for novel big-data analysis strategies. Large earthquakes in Italy have the peculiar characteristic of being followed, within seconds to months, by large aftershocks of magnitude similar to that of the initial quake or even larger, demonstrating the complexity of the Apennines' fault system (Gentili and Giovanbattista, 2017). Picozzi et al. (2017) estimated the radiated seismic energy and seismic moment from P-wave signals for almost forty of the largest-magnitude earthquakes of the 2016-2017 Central Italy seismic sequence.
Focusing on S-wave signals recorded by local networks, Bindi et al. (2018) analysed more than 1400 earthquakes of the same region, in the magnitude range 2.5 ≤ Mw ≤ 6.5, that occurred from 2008 to 2017, and estimated both ER and M0, from which the energy magnitude (Me) and Mw were derived to investigate the impact of different magnitude scales on the aleatory variability associated with ground motion prediction equations. In this work, building on the first steps made in this direction by Picozzi et al. (2017) and Bindi et al. (2018), we derive a novel approach for the real-time, robust estimation of the seismic moment and radiated energy of small- to large-magnitude earthquakes recorded at local scales. In the first part of the work, we describe the procedure for extracting robust estimates of the peak displacement (PDS) and the cumulative squared velocity (IV2S) from the S-wave signals. Then, exploiting a calibration data set of about 6000 earthquakes for which well-constrained M0 and theoretical ER values were available, we describe the calibration of empirical attenuation models. The coefficients and parameters obtained by calibration were then used to determine ER and M0 for a testing data set.</p>
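The two S-wave features named above can be sketched from their definitions: peak displacement is the maximum of the integrated velocity trace, and the cumulative squared velocity is the time integral of the squared velocity. The sketch below is a bare-bones illustration; real records would need baseline correction, filtering and windowing, which the abstract does not detail and which are omitted here.

```python
import numpy as np

def s_wave_features(velocity, fs):
    """Peak displacement (PD) and cumulative squared velocity (IV2)
    from an S-wave velocity trace sampled at fs Hz. Baseline
    correction and band-pass filtering, needed on real records,
    are omitted for brevity."""
    v = np.asarray(velocity, float)
    dt = 1.0 / fs
    disp = np.cumsum(v) * dt          # crude time-domain integration
    pd_ = float(np.max(np.abs(disp)))
    iv2 = float(np.sum(v ** 2) * dt)  # integral of v^2 over the window
    return pd_, iv2
```

In a real-time setting these quantities would be updated as the S-wave window grows and then passed to the calibrated attenuation models to estimate M0 and ER.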


2020 ◽  
Author(s):  
Markus Wiedemann ◽  
Bernhard S.A. Schuberth ◽  
Lorenzo Colli ◽  
Hans-Peter Bunge ◽  
Dieter Kranzlmüller

<p>Precise knowledge of the forces acting at the base of tectonic plates is of fundamental importance, but to date models of mantle dynamics are often still qualitative in nature. One particular problem is that we cannot access the deep interior of our planet and can therefore not make direct in situ measurements of the relevant physical parameters. Fortunately, modern software and powerful high-performance computing infrastructures allow us to generate complex three-dimensional models of the time evolution of mantle flow through large-scale numerical simulations.</p><p>In this project, we aim to visualize the resulting convective patterns that occur thousands of kilometres below our feet and to make them "accessible" using high-end virtual reality techniques.</p><p>Models with several hundred million grid cells are nowadays possible using modern supercomputing facilities, such as those available at the Leibniz Supercomputing Centre. These models provide quantitative estimates of otherwise inaccessible parameters, such as buoyancy and temperature, as well as predictions of the associated gravity field and seismic wavefield that can be tested against Earth observations.</p><p>3-D visualizations of the computed physical parameters allow us to inspect the models as if one were actually travelling down into the Earth. In this way, convective processes occurring thousands of kilometres below our feet become virtually accessible by combining the simulations with high-end VR techniques.</p><p>The large data set used here poses severe challenges for real-time visualization, because it cannot fit into graphics memory while requiring rendering with strict deadlines. This necessitates balancing the amount of displayed data against the time needed to render it.</p><p>As a solution, we introduce a rendering framework and describe our workflow that allows us to visualize this geoscientific dataset. 
Our example exceeds 16 TByte in size, which is beyond the capabilities of most visualization tools. To display this dataset in real time, we reduce and declutter it through isosurfacing and mesh optimization techniques.</p><p>Our rendering framework relies on multithreading and data-decoupling mechanisms that allow us to upload data to graphics memory while maintaining high frame rates. The final visualization application can be executed in a CAVE installation as well as on head-mounted displays such as the HTC Vive or Oculus Rift. The latter devices will allow our example to be viewed on-site at the EGU conference.</p>
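The data-reduction idea behind isosurfacing can be illustrated with a toy sketch: of a full volume, only the voxels near the isovalue contribute to the surface, so the rest can be discarded before meshing. The radial field, grid size and band width below are made-up stand-ins, not the project's mantle data or pipeline.

```python
import numpy as np

def isoband_mask(volume, isovalue, band):
    """Keep only voxels whose value lies within `band` of the
    isovalue -- a crude stand-in for the isosurfacing step that
    shows how much of a volume can be discarded before meshing."""
    return np.abs(volume - isovalue) <= band

# toy radial field on a 64^3 grid (made-up stand-in for mantle data)
ax = np.linspace(-1.0, 1.0, 64)
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
field = np.sqrt(x**2 + y**2 + z**2)

mask = isoband_mask(field, isovalue=0.5, band=0.05)
kept_fraction = float(mask.mean())  # only a thin shell survives
```

Here well under a tenth of the voxels survive, which conveys why isosurfacing plus mesh optimization makes a multi-terabyte volume tractable for real-time rendering.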


2019 ◽  
Vol 16 (8) ◽  
pp. 3419-3427
Author(s):  
Shishir K. Shandilya ◽  
S. Sountharrajan ◽  
Smita Shandilya ◽  
E. Suganya

Big data technologies have become well accepted in recent years in biomedical and genome informatics. They are capable of processing gigantic, heterogeneous genome information with good precision and recall. With rapid advances in computation and storage technologies, the cost of acquiring and processing genomic data has decreased significantly. Upcoming sequencing platforms will produce vast amounts of data, which will imperatively require high-performance systems for on-demand analysis with time-bound efficiency. Recent bioinformatics tools are capable of utilizing the novel features of Hadoop in a flexible way. In particular, big data technologies such as MapReduce and Hive provide a high-speed computational environment for the analysis of petabyte-scale data sets, which has attracted bio-scientists to big data applications for automating entire genome analyses. The proposed framework is designed over MapReduce and Java on an extended Hadoop platform to achieve parallel big data analysis. It assists the bioinformatics community by providing a comprehensive solution for descriptive, comparative, exploratory, inferential, predictive and causal analysis of genome data. The proposed framework is user-friendly, fully customizable, scalable and fit for comprehensive real-time genome analysis from data acquisition through predictive sequence analysis.
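The MapReduce pattern the framework builds on can be shown in miniature. The sketch below counts k-mers in sequencing reads with explicit map, shuffle and reduce phases; it is a single-process illustration of the programming model, not the proposed Hadoop/Java framework, and the k-mer-counting task is an assumed example.

```python
from collections import defaultdict

def mapper(read, k=3):
    """Map phase: emit (k-mer, 1) for each k-length substring."""
    for i in range(len(read) - k + 1):
        yield read[i:i + k], 1

def shuffle(pairs):
    """Shuffle phase: group emitted values by key."""
    groups = defaultdict(list)
    for key, val in pairs:
        groups[key].append(val)
    return groups

def reducer(key, vals):
    """Reduce phase: sum the counts for one key."""
    return key, sum(vals)

def kmer_counts(reads, k=3):
    pairs = (p for read in reads for p in mapper(read, k))
    return dict(reducer(key, vals) for key, vals in shuffle(pairs).items())
```

On a cluster, Hadoop distributes the map and reduce phases across nodes and performs the shuffle over the network, which is what gives the framework its parallelism on petabyte-scale inputs.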


2013 ◽  
Vol 94 (6) ◽  
pp. 859-882 ◽  
Author(s):  
Robert Rogers ◽  
Sim Aberson ◽  
Altug Aksoy ◽  
Bachir Annane ◽  
Michael Black ◽  
...  

An update of the progress achieved as part of the NOAA Intensity Forecasting Experiment (IFEX) is provided. Included is a brief summary of the noteworthy aircraft missions flown in the years since 2005, the first year IFEX flights occurred, as well as a description of the research and development activities that directly address the three primary IFEX goals: 1) collect observations that span the tropical cyclone (TC) life cycle in a variety of environments for model initialization and evaluation; 2) develop and refine measurement strategies and technologies that provide improved real-time monitoring of TC intensity, structure, and environment; and 3) improve the understanding of physical processes important in intensity change for a TC at all stages of its life cycle. Such activities include the real-time analysis and transmission of Doppler radar measurements; numerical model and data assimilation advancements; characterization of tropical cyclone composite structure across multiple scales, from vortex scale to turbulence scale; improvements in statistical prediction of rapid intensification; and studies specifically targeting tropical cyclogenesis, extratropical transition, and the impact of environmental humidity on TC structure and evolution. While progress in TC intensity forecasting remains challenging, the activities described here provide some hope for improvement.


Author(s):  
Huckleberry Febbo ◽  
Paramsothy Jayakumar ◽  
Jeffrey L. Stein ◽  
Tulga Ersal

Safe trajectory planning for high-performance automated vehicles in an environment with both static and moving obstacles is a challenging problem. Part of the challenge is developing a formulation that can be solved in real-time while including the following set of specifications: minimum time-to-goal, a dynamic vehicle model, minimum control effort, both static and moving obstacle avoidance, simultaneous optimization of speed and steering, and a short execution horizon. This paper presents a nonlinear model predictive control-based trajectory planning formulation, tailored for a large, high-speed unmanned ground vehicle, that includes the above set of specifications. The ability to solve this formulation in real-time is evaluated using NLOptControl, an open-source, direct-collocation based, optimal control problem solver in conjunction with the KNITRO nonlinear programming problem solver. The formulation is tested with various sets of the specifications. A parametric study relating execution horizon and obstacle speed indicates that the moving obstacle avoidance specification is not needed for safety when the planner has a small execution horizon and the obstacles are moving slowly. However, a moving obstacle avoidance specification is needed when the obstacles are moving faster, and this specification improves the overall safety without, in most cases, increasing the solve-times. The results indicate that (i) safe trajectory planners for high-performance automated vehicles should include the entire set of specifications mentioned above, unless a static or low-speed environment permits a less comprehensive planner; and (ii) the resulting formulation can be solved in real-time.
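The flavour of such a formulation can be conveyed with a heavily simplified sketch: a 2-D point mass steered toward a goal past a static obstacle by minimizing a cost over a short horizon. This is not the paper's formulation (which uses a dynamic vehicle model, direct collocation via NLOptControl, and KNITRO); the point-mass model, weights and penalty form below are all assumptions, and a smooth penalty stands in for hard obstacle constraints.

```python
import numpy as np
from scipy.optimize import minimize

DT, N = 0.2, 20                        # step size and horizon length
GOAL = np.array([10.0, 0.0])           # made-up goal position
OBS, R_SAFE = np.array([5.0, 0.0]), 1.0  # made-up obstacle and radius

def rollout(u):
    """Integrate a 2-D point mass (toy stand-in for the paper's
    dynamic vehicle model) under piecewise-constant accelerations."""
    u = np.asarray(u).reshape(N, 2)
    x, v, traj = np.zeros(2), np.zeros(2), [np.zeros(2)]
    for k in range(N):
        v = v + u[k] * DT
        x = x + v * DT
        traj.append(x.copy())
    return np.array(traj)

def cost(u):
    traj = rollout(u)
    goal_term = 50.0 * np.sum((traj[-1] - GOAL) ** 2)   # reach the goal
    effort = 0.1 * np.sum(np.asarray(u) ** 2)           # control effort
    d2 = np.sum((traj - OBS) ** 2, axis=1)
    # smooth penalty, active only inside the safety radius
    avoid = 1e3 * np.sum(np.maximum(0.0, R_SAFE ** 2 - d2) ** 2)
    return goal_term + effort + avoid

u0 = np.zeros(N * 2)
res = minimize(cost, u0, method="L-BFGS-B", options={"maxiter": 200})
```

In a receding-horizon planner only the first few controls of `res.x` would be executed before re-solving; the paper's contribution is making a far richer version of this problem solvable within such real-time deadlines.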


2020 ◽  
Vol 17 (3) ◽  
pp. 172988142093271
Author(s):  
Xiali Li ◽  
Manjun Tian ◽  
Shihan Kong ◽  
Licheng Wu ◽  
Junzhi Yu

To tackle the water surface pollution problem, a vision-based water surface garbage capture robot has been developed in our lab. In this article, we present a modified you only look once v3-based garbage detection method, allowing real-time, high-precision object detection in dynamic aquatic environments. More specifically, to improve real-time detection performance, the detection scales of you only look once v3 are reduced from 3 to 2. Besides, to guarantee detection accuracy, the anchor boxes are reclustered on our training data set, replacing those original you only look once v3 prior anchor boxes that are not appropriate for our data. By virtue of the proposed detection method, the capture robot is capable of cleaning floating garbage in the field. Experimental results demonstrate that both the detection speed and the accuracy of the modified you only look once v3 are better than those of other object detection algorithms. The obtained results provide valuable insight into the autonomous, intelligent high-speed detection and grasping of dynamic objects in complex aquatic environments.
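Anchor reclustering of the kind mentioned above is commonly done with k-means under a 1 - IoU distance on box widths and heights, as in the YOLO papers. The sketch below assumes that approach; the deterministic initialization and the toy box dimensions are made up for illustration and are not the article's training data.

```python
import numpy as np

def iou_wh(boxes, anchors):
    """IoU between (w, h) pairs assumed to share a common centre,
    as in the anchor clustering of the YOLO papers."""
    inter = (np.minimum(boxes[:, None, 0], anchors[None, :, 0]) *
             np.minimum(boxes[:, None, 1], anchors[None, :, 1]))
    union = (boxes[:, 0] * boxes[:, 1])[:, None] + \
            (anchors[:, 0] * anchors[:, 1])[None, :] - inter
    return inter / union

def recluster_anchors(boxes, k, iters=100):
    """k-means with a 1 - IoU distance; the first k boxes seed the
    anchors for determinism (k-means++ would be typical in practice)."""
    anchors = boxes[:k].astype(float).copy()
    for _ in range(iters):
        labels = np.argmax(iou_wh(boxes, anchors), axis=1)  # min 1 - IoU
        for j in range(k):
            if np.any(labels == j):
                anchors[j] = boxes[labels == j].mean(axis=0)
    return anchors

# toy ground-truth box sizes (w, h) in pixels: two small, two large
boxes = np.array([[10, 10], [12, 11], [30, 32], [28, 30]], float)
anchors = recluster_anchors(boxes, k=2)
```

The resulting anchors track the size distribution of the training boxes, which is what makes reclustered priors fit a specialized data set better than the generic defaults.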

