Computerised grinding procedure for large scale crankshaft machining

2019 ◽  
Vol 252 ◽  
pp. 01002
Author(s):  
Zbigniew Siemiątkowski

The paper describes the sequence of machining operations that leads to the desired quality of the produced crankshaft, as well as the in-situ inspection, correction and compensation procedures performed and controlled by computer. Form deviation values after correction are compared with those obtained before. For crank pins, the form deviations, and hence the corrections, are much larger than for main journals. During measurement, the probe collects data from 3600 points per revolution, and an averaging procedure then reduces the data to 360 points. Several data-processing algorithms are available, so the operator may choose the most appropriate one. A substantial difference between the out-of-roundness values of the main journals and the crank pins was registered: before form compensation, the former were between 0.01 and 0.02 mm, while the latter were in the range 0.07-0.09 mm. The grinding program is parametric, i.e. at each stage of the process all values responsible for the tool movement undergo correction. Computer monitoring made it possible to achieve the demanded quality of the ground surface, as well as dimensions and form deviations within the tolerances set by the product specifications. The form compensation procedure reduced the peak-to-peak deviation from 30.37 μm to 8.14 μm.
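The 3600-to-360-point data reduction described above can be sketched as a simple block average. This is a minimal illustration under the assumption of plain per-degree bin averaging; the machine's actual averaging algorithm is not specified in the paper.

```python
import numpy as np

# Hypothetical sketch: the probe samples 3600 radius readings per revolution,
# and an averaging step condenses them to 360 points (one per degree) by
# taking the mean of each consecutive block of 10 samples.
def reduce_profile(raw_profile, factor=10):
    """Average consecutive blocks of `factor` samples into one point."""
    raw = np.asarray(raw_profile, dtype=float)
    if raw.size % factor:
        raise ValueError("sample count must be divisible by factor")
    return raw.reshape(-1, factor).mean(axis=1)

# Synthetic round profile: 25 mm nominal radius with slight ovality.
theta = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
raw = 25.0 + 0.005 * np.cos(2 * theta)
profile = reduce_profile(raw)
print(profile.size)  # 360
```

The peak-to-peak value of the reduced profile (here about 0.01 mm) is the kind of quantity the form compensation procedure would act on.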

2021 ◽  
Vol 13 (2) ◽  
pp. 320
Author(s):  
José P. Granadeiro ◽  
João Belo ◽  
Mohamed Henriques ◽  
João Catalão ◽  
Teresa Catry

Intertidal areas provide key ecosystem services but are declining worldwide. Digital elevation models (DEMs) are important tools to monitor the evolution of such areas. In this study, we aim at (i) estimating the intertidal topography based on an established pixel-wise algorithm, from Sentinel-2 MultiSpectral Instrument scenes, (ii) implementing a set of procedures to improve the quality of such estimation, and (iii) estimating the exposure period of the intertidal area of the Bijagós Archipelago, Guinea-Bissau. We first propose a four-parameter logistic regression to estimate intertidal topography. Afterwards, we develop a novel method to estimate tide-stage lags in the area covered by a Sentinel-2 scene to correct for geographical bias in topographic estimation resulting from differences in water height within each image. Our method searches for the minimum differences in height estimates obtained from rising and ebbing tides separately, enabling the estimation of cotidal lines. The estimated tide-stage differences closely matched those published by official authorities. We re-estimated pixel heights, from which we produced a model of intertidal exposure period, and obtained a high correlation between predicted and in-situ measured exposure periods. We highlight the importance of remote sensing to deliver large-scale intertidal DEMs and tide-stage data, with relevance for coastal safety, ecology and biodiversity conservation.
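A minimal sketch of the kind of four-parameter logistic fit mentioned above, under the assumption (hypothetical here) that each pixel's wet fraction across the image time series is modelled against tide height at acquisition time, so that the curve's inflection point approximates the pixel's elevation:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic4(x, bottom, top, x0, slope):
    """Four-parameter logistic: plateaus at `bottom` and `top`,
    inflection at `x0`, steepness `slope`."""
    return bottom + (top - bottom) / (1.0 + np.exp(-slope * (x - x0)))

# Synthetic pixel at ~1.2 m elevation: wet fraction rises with tide height.
rng = np.random.default_rng(1)
tide = rng.uniform(0.0, 3.0, 300)                  # tide heights (m)
wet = logistic4(tide, 0.0, 1.0, 1.2, 4.0) \
      + rng.normal(scale=0.05, size=tide.size)     # noisy observations
params, _ = curve_fit(logistic4, tide, wet, p0=[0.0, 1.0, 1.5, 2.0])
print(round(params[2], 1))  # inflection point, close to 1.2
```

The geographic tide-stage correction the authors develop would then adjust the tide height assigned to each pixel before such a fit.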


Author(s):  
Boris A. Zakharov ◽  
Zoltan Gal ◽  
Dyanne Cruickshank ◽  
Elena V. Boldyreva

The quality of structural models for 1,2,4,5-tetrabromobenzene (TBB), C6H2Br4, based on data collected from a single crystal in a diamond anvil cell at 0.4 GPa in situ using two different diffractometers belonging to different generations has been compared, together with the effects of applying different data-processing strategies.


Author(s):  
Beatrice Rammstedt ◽  
Clemens M. Lechner ◽  
Daniel Danner

Abstract. Researchers wishing to assess personality in research settings with severe time limitations typically use short-scale measures of the Big Five. Over the last decade, several such measures have been developed. To guide researchers in choosing the one best suited to their needs, we conducted the present study. Based on a large-scale sample representative of the adult population in Germany, we compared the psychometric properties of three short-scale versions assessing the Big Five: the 10-item BFI-10, the 15-item BFI-2-XS, and the 30-item BFI-2-S. To assess the psychometric quality of these measures, we investigated and compared the descriptive statistics and reliabilities of the scale scores as well as the patterns of factor loadings and the model fit of the instruments as indicators of their factorial validity. As the typical research settings in which these short measures are administered are heterogeneous population samples, we investigated to what degree the resulting Big Five estimates were comparable across major sociodemographic groups (age, gender, and educational strata). Finally, we compared the validity of the three measures for a set of external criteria. Results indicate that the latent Big Five domains can be assessed adequately with all three measures, which were found to have high psychometric quality, with coefficients of mostly comparable size.
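As a minimal, hypothetical illustration of one of the reliability coefficients compared above, the following sketch computes Cronbach's alpha on synthetic item responses; the study itself used real survey data and additional indicators such as factor loadings and model fit.

```python
import numpy as np

# Estimate scale-score reliability with Cronbach's alpha.
# Rows are respondents, columns are the items of one scale (e.g. the
# items of a single Big Five domain in a short-scale measure).
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                        # number of items
    item_vars = items.var(axis=0, ddof=1)     # per-item variances
    total_var = items.sum(axis=1).var(ddof=1) # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic three-item scale driven by one latent trait plus noise.
rng = np.random.default_rng(42)
trait = rng.normal(size=500)
items = np.column_stack(
    [trait + rng.normal(scale=0.8, size=500) for _ in range(3)])
print(round(cronbach_alpha(items), 2))  # roughly 0.8 for this noise level
```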


2015 ◽  
Vol 8 (12) ◽  
pp. 12559-12588 ◽  
Author(s):  
A. Kräuchi ◽  
R. Philipona ◽  
G. Romanens ◽  
D. F. Hurst ◽  
E. G. Hall ◽  
...  

Abstract. In situ upper-air measurements are often made with instruments attached to weather balloons launched at the surface and lifted into the stratosphere. Present-day balloon-borne sensors allow near-continuous measurements from the Earth's surface to about 35 km (3–5 hPa), where the balloons burst and their instrument payloads descend with parachutes. It has been demonstrated that ascending weather balloons can perturb the air measured by very sensitive humidity and temperature sensors trailing behind them, particularly in the upper troposphere and lower stratosphere (UTLS). The use of controlled balloon descent for such measurements has therefore been investigated and is described here. We distinguish between the one-balloon technique, which uses a simple automatic valve system to release helium from the balloon at a pre-set ambient pressure, and the double-balloon technique, which uses a carrier balloon to lift the payload and a parachute balloon to control the descent of the instruments after the carrier balloon is released at a pre-set altitude. The automatic valve technique has been used for several decades for water vapor soundings with frost point hygrometers, whereas the double-balloon technique has recently been re-established and deployed to measure radiation and temperature profiles through the atmosphere. Double-balloon soundings also strongly reduce pendulum motion of the payload, stabilizing radiation instruments during ascent. We present the flight characteristics of these two ballooning techniques and compare the quality of temperature and humidity measurements made during ascent and descent.


Electronics ◽  
2021 ◽  
Vol 10 (18) ◽  
pp. 2197
Author(s):  
Bruno Citoni ◽  
Shuja Ansari ◽  
Qammer Hussain Abbasi ◽  
Muhammad Ali Imran ◽  
Sajjad Hussain

The large-scale behaviour of LoRaWAN networks has been studied through mathematical analysis and discrete-time simulations to understand their limitations. However, the current literature is not always coherent in its assumptions and network setups. This paper proposes a comprehensive analysis of the known causes of packet loss in an uplink-only LoRaWAN network: duty cycle limitations, packet collision, insufficient coverage, and saturation of a receiver’s demodulation paths. Their impact on the overall Quality of Service (QoS) of a two-gateway network is also studied. The analysis is carried out with the discrete-event network simulator NS-3 and is set up to best fit the real behaviour of devices. This approach shows that increasing gateway density is effective only if the gateways are placed at a distance from one another. Moreover, the trade-off between different outage conditions due to the uneven distribution of spreading factors is not always beneficial, with diminishing returns as networks grow denser and wider. In particular, networks operating similarly to the one analysed in this paper should specifically avoid SF11 and SF12, which decrease the average overall PDR by about 7% per 10% increment in nodes across all configurations. The results of this work are intended to homogenise the behavioural assumptions and setups of future research investigating the capability of LoRaWAN networks, and to provide insight into the weight of each outage condition in a varying two-gateway network.


Nanomaterials ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 1179
Author(s):  
Raquel Montes ◽  
Gerard Sánchez ◽  
Jingjing Zhao ◽  
Cristina Palet ◽  
Mireia Baeza ◽  
...  

The incorporation of nanomaterials into (bio)sensors based on composite materials has led to important advances in the analytical chemistry field due to the extraordinary properties that these materials offer. Nanodiamonds (NDs) are a novel type of material that has attracted much attention, as they can be produced on a large scale by relatively inexpensive synthetic methodologies. Moreover, NDs can present other interesting features, such as fluorescence due to surface functionalization, and proven biocompatibility, which makes them suitable for biomedical applications. In addition, NDs can be decorated with metallic nanoparticles (NPs), such as silver or gold, in order to combine the features of both. Raw NDs have been used as sensor modifiers due to the electrocatalytic effect of the sp2 and oxygenated species present on their surface. The aim of this research work is to evaluate the applicability of NDs modified with silver (Ag@NDs) and gold (Au@NDs) nanoparticles for the development of a suitable (bio)sensing platform. A complete morphological and electrochemical characterization as a function of nanocomposite composition was performed in order to improve the electroanalytical properties of the developed (bio)sensors. In the present work, the optimal composition for Au@NDs in the nanocomposite matrix is 3.5% and that for Ag@NDs is 1%. Good results were obtained in the evaluation of the optimal composition towards hydrogen peroxide and glucose as model analytes using a (bio)sensor based on graphite-epoxy-Ag@NDs (17:82:1).


Irrigated agriculture is expected to play a major role in reaching the broader development objectives of achieving food security and improving the quality of life, while conserving the environment, in both developed and developing countries, especially as we face the prospect of global population growth from almost 6 billion today to at least 8 billion by 2025 [1]. In this context, the prospects of increasing the gross cultivated area, in both developed and developing countries, are limited by the dwindling number of economically attractive sites for new large-scale irrigation and drainage projects. Therefore, any increase in agricultural production will necessarily rely largely on a more accurate estimation of crop water requirements on the one hand, and on major improvements in the operation, management and performance of existing irrigation and drainage systems on the other.


2009 ◽  
Vol 19 (03) ◽  
pp. 399-418 ◽  
Author(s):  
JENS GUSTEDT ◽  
EMMANUEL JEANNOT ◽  
MARTIN QUINSON

The increasing complexity of available infrastructures with specific features (caches, hyperthreading, dual core, etc.) or with complex architectures (hierarchical, parallel, distributed, etc.) makes it extremely difficult to build analytical models that allow for satisfactory prediction. This raises the question of how to validate algorithms when a realistic analytical analysis is no longer possible. As in many other sciences, one answer is experimental validation. Nevertheless, experimentation in computer science is a difficult subject that today still opens more questions than it solves: What may an experiment validate? What is a "good experiment"? How does one build an experimental environment that allows for "good experiments"? In this paper we provide some hints on this subject and show how some tools can help in performing "good experiments", mainly in the context of parallel and distributed computing. More precisely, we focus on four main experimental methodologies, namely in-situ (real-scale) experiments (with an emphasis on PlanetLab and Grid'5000), emulation (with an emphasis on Wrekavoc), benchmarking, and simulation (with an emphasis on SimGRID and GridSim). We provide a comparison of these tools and methodologies from a quantitative as well as a qualitative point of view.


2019 ◽  
Vol 41 (1) ◽  
pp. 91-100 ◽  
Author(s):  
Amina Malek ◽  
Mohamed Kahoul ◽  
Hamza Bouguerra

Abstract. Drinking water is a possible source of human illness when it contains chemicals and microorganisms, especially from anthropogenic activities. The water supply from groundwater remains very important in Algeria. To assess the quality of groundwater in the region of Sedrata, analyses were carried out on 26 wells belonging to two neighbouring areas, one urban and the other rural. The study of physicochemical parameters focused on in-situ measurement of temperature, electrical conductivity, pH and turbidity. The following parameters were then analysed: hardness and the elements Ca2+, Mg2+, SO42−, PO43−, Cl−, NO2−, NO3−, NH4+, as well as the trace metals Fe2+, Mn2+ and Al3+. The samples taken for the bacteriological study were filtered and introduced into growth medium for the detection and enumeration of total germs, faecal coliforms, faecal streptococci and sulphite-reducing Clostridium. The results show that contamination of the studied waters is almost general. Among the most important values obtained, nitrates range from 4.8 to 76 mg∙dm−3 and counts of mesophilic germs vary from 1 to 1100 CFUs∙cm−3. Agricultural activity and livestock production on the one hand, and the use of fertilizers on the other, are the main sources of physicochemical and bacteriological pollution. Contaminated wells should be treated as soon as possible to limit contamination before it spreads into the deep aquifers. In the future, it will be necessary not only to assess the health risks related to the level of contamination of these waters, but also to treat them before supplying them to consumers.


Author(s):  
Yabin Ding ◽  
Zeyang Zhang ◽  
Xianping Liu ◽  
Jinsheng Fu ◽  
Tian Huang

The high demand for efficient large-scale machining operations, with concurrently decreasing operating times and costs, has led to an increasing use of mobile robotic systems. This paper introduces a mobile robotic system which consists of a hybrid robot named TriMule mounted on an automated guided vehicle, together with a fringe-projection-based measurement system. TriMule exhibits desirable performance in terms of rigidity, accuracy, work envelope and reconfigurability. It is therefore suitable to be built on an autonomous platform for multi-station manufacturing in situ. To increase the absolute accuracy of the mobile robotic system, the fringe-projection-based measurement system acquires a high-accuracy, high-density point cloud to measure the position and orientation of the robot and workpiece relative to each other. The system is suitable for large-scale in-situ manufacturing, for example drilling, riveting and high-speed milling.
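The relative pose measurement described above can be sketched, under assumptions, with the standard SVD-based Kabsch/Umeyama method: given matched 3-D reference points on the workpiece and their coordinates in the scanner frame, recover the rigid transform between the two frames. This is a generic technique, not necessarily the authors' algorithm.

```python
import numpy as np

def rigid_transform(src, dst):
    """Return rotation R and translation t with dst ≈ R @ src + t,
    estimated from matched point sets (rows are 3-D points)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Example: hypothetical workpiece reference points, rotated 30° about z
# and shifted, as if measured in the scanner frame.
ref = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
a = np.deg2rad(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0,          0,         1]])
meas = ref @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_transform(ref, meas)
print(np.allclose(R, R_true), np.allclose(t, [0.5, -0.2, 1.0]))  # True True
```

In practice the measurement system would supply many such correspondences per station, and the recovered transform would feed the robot's pose correction.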

