An Integrated Approach between Computing and Mathematical Modelling for Cattle Welfare in Grazing Systems

2021 ◽  
Vol 22 (4) ◽  
pp. 629-643
Author(s):  
R. M. O. Santos ◽  
E. F. Saraiva ◽  
R. R. Santos

In recent years, agricultural systems based on Crop-Livestock-Forestry integration have emerged as a potential solution due to their capacity to maximize land use and to reduce the effects of high temperatures on the animals. Within these systems, there is interest in technological solutions capable of monitoring the animals in real time. From this monitoring, one of the main interests is to know whether an animal is in the sun or in the shade of a tree, using environmental measurements. However, since the weather may be cloudy, real-time monitoring also needs to identify this case; that is, it must differentiate the shade of a tree from cloudy weather. The interest in this kind of monitoring stems from the fact that an animal remaining a long time in the shade of a tree provides a substantial indication that it may be under thermal stress. This information can be used in decision-making aimed at reducing the impact of thermal stress and, consequently, at improving animal welfare and reducing financial losses. As a solution for identifying whether an animal is in the sun or in the shade of a tree, or whether the weather is cloudy, we developed an electronic device that captures values of environmental variables and, integrated with a mathematical model, predicts the shade state (sun, shade or cloudy) in which the animal can be found. We illustrate the performance of the proposed solution on a real data set.
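
The abstract does not specify the form of the mathematical model; the following is a minimal sketch of the prediction step, assuming the device reports readings such as luminosity and air temperature and that a multinomial classifier has been trained on labelled observations (all feature names and values are hypothetical):

```python
# Hypothetical prediction step: classify the shade state (sun / shade / cloudy)
# from environmental readings; the actual model and variables in the paper may differ.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic training data: [luminosity (lux), air temperature (deg C)] per state
X_train = np.vstack([
    rng.normal([80000, 34], [10000, 2.0], size=(50, 2)),  # sun
    rng.normal([15000, 29], [5000, 2.0], size=(50, 2)),   # shade of a tree
    rng.normal([30000, 27], [8000, 2.0], size=(50, 2)),   # cloudy
])
y_train = np.repeat(["sun", "shade", "cloudy"], 50)

clf = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)

# Real-time prediction for a new reading captured by the device
reading = np.array([[17000.0, 28.5]])
print(clf.predict(reading)[0])  # expected to be "shade" for this synthetic setup
```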

Author(s):  
Manudul Pahansen de Alwis ◽  
Karl Garme

The stochastic environmental conditions, together with craft design and operational characteristics, make it difficult to predict the vibration environments aboard high-performance marine craft, particularly the risk of impact acceleration events and the shock component of the exposure, which are often associated with structural failure and human injuries. The different timescales and magnitudes involved complicate the real-time analysis of vibration and shock conditions aboard these craft. The article introduces a new measure, the severity index, indicating the risk of severe impact acceleration, and proposes a method for real-time feedback on the severity of impact exposure together with accumulated vibration exposure. The method analyzes the immediately preceding 60 s of vibration exposure history and computes the severity of impact exposure for the present state based on the severity index. The severity index probes the character of the present acceleration stochastic process, that is, the risk of an upcoming heavy impact, and serves as an alert to the crew. The accumulated vibration exposure, important for mapping and logging the crew exposure, is determined by the ISO 2631:1997 vibration dose value. The severity due to the impact and accumulated vibration exposure is communicated to the crew every second as a color-coded indicator: green, yellow and red, representing low, medium and high, based on defined impact and dose limits. The severity index and feedback method are developed and validated using a data set of 27 three-hour simulations of a planing craft in irregular waves and verified for feasibility in real-world applications using full-scale acceleration data recorded aboard high-speed planing craft in operation.
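
A sketch of the accumulated-exposure part of the feedback described above: the ISO 2631-1 vibration dose value over the immediate exposure history, mapped to the green/yellow/red indicator. The dose limits below are illustrative, and the severity index itself, the paper's new measure, is not reproduced:

```python
# ISO 2631-1 vibration dose value (VDV) over a 60 s window of frequency-weighted
# acceleration, plus a colour code from assumed dose limits.
import numpy as np

def vibration_dose_value(a_w, fs):
    """VDV = (integral of a_w(t)^4 dt)^(1/4); a_w in m/s^2, fs in Hz."""
    return (np.sum(a_w ** 4) / fs) ** 0.25   # rectangular-rule integration

def colour_code(vdv, yellow_limit=8.5, red_limit=17.0):
    """Map a VDV (m/s^1.75) to the green/yellow/red indicator; limits are illustrative."""
    if vdv < yellow_limit:
        return "green"
    if vdv < red_limit:
        return "yellow"
    return "red"

fs = 500.0                               # sampling frequency, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)       # the immediate 60 s of exposure history
a_w = 2.0 * np.sin(2 * np.pi * 2.0 * t)  # synthetic frequency-weighted acceleration, m/s^2
vdv = vibration_dose_value(a_w, fs)
print(f"VDV = {vdv:.2f} m/s^1.75 -> {colour_code(vdv)}")
```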


2017 ◽  
Vol 9 (2) ◽  
pp. 169-186 ◽  
Author(s):  
Liang Zhao ◽  
Tsvi Vinig

Purpose: In the existing literature on crowdfunding project performance, previous studies have given little attention to the impact of investors' hedonic and utilitarian value on project results. In a crowdfunding setting, utilitarian value is hard to satisfy because of information asymmetry and adverse selection problems. Therefore, projects with more hedonic value can be more attractive to potential investors. A lucky draw is one way to increase consumer hedonic value, and it can influence investors' behavior as a result. The authors hypothesize that projects with a hedonic treatment (lucky draw) have a higher probability of winning their campaign than others. The paper aims to discuss these issues.

Design/methodology/approach: A unique self-extracted two-year real data set from a Chinese crowdfunding platform is used as the analysis sample. The authors first employ propensity score matching to control for the endogeneity of hedonic treatment adoption (lucky draw). They then run OLS and probit regressions to test the hypotheses.

Findings: The analysis suggests a significant positive relationship not only between project lottery adoption and project results but also between project lottery adoption and project popularity.

Originality/value: The results suggest that an often ignored factor, hedonic treatment (lucky draw), can play an important role in crowdfunding project performance.
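
A minimal sketch of the two-step design described above: a logit propensity score for lucky-draw adoption, nearest-neighbour matching on the score, and an OLS outcome regression on the matched sample. The variable names (goal, duration, lucky_draw, amount_raised) and the data are hypothetical placeholders:

```python
# Sketch of the design: (1) logit propensity score for lucky-draw adoption,
# (2) nearest-neighbour matching on the score, (3) OLS outcome regression on
# the matched sample. All variables and data here are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "goal": rng.lognormal(9, 1, n),        # funding goal (hypothetical covariate)
    "duration": rng.integers(15, 60, n),   # campaign length in days (hypothetical)
})
df["lucky_draw"] = (rng.random(n) < 0.4).astype(int)          # hedonic treatment
df["amount_raised"] = df["goal"] * (0.5 + 0.3 * df["lucky_draw"]
                                    + rng.normal(0, 0.2, n))  # outcome

# 1) Propensity score: probability of adopting the lucky draw given covariates
X = sm.add_constant(df[["goal", "duration"]])
ps = pd.Series(np.asarray(sm.Logit(df["lucky_draw"], X).fit(disp=0).predict(X)),
               index=df.index)

# 2) Match each treated project to the control project with the closest score
treated = df.index[df["lucky_draw"] == 1]
control = df.index[df["lucky_draw"] == 0]
matches = [control[np.argmin(np.abs(ps.loc[control].values - ps.loc[t]))] for t in treated]
matched = df.loc[list(treated) + matches]

# 3) Outcome regression on the matched sample
X_out = sm.add_constant(matched[["lucky_draw", "goal", "duration"]])
print(sm.OLS(matched["amount_raised"], X_out).fit().params)
```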


2020 ◽  
Author(s):  
Davide Scafidi ◽  
Daniele Spallarossa ◽  
Matteo Picozzi ◽  
Dino Bindi

Understanding the dynamics of faulting is a crucial target in earthquake source physics (Yoo et al., 2010). To study earthquake dynamics it is indeed necessary to look at source complexity from different perspectives; in this regard, useful information is provided by the seismic moment (M0), which is a static measure of the earthquake size, and the radiated seismic energy (ER), which is connected to the rupture kinematics and dynamics (e.g. Bormann & Di Giacomo, 2011a). Studying the spatial and temporal evolution of scaling relations between the scaled energy (i.e., e = ER/M0) and the static measure of source dimension (M0) can provide valuable indications for understanding earthquake generation processes, singling out precursors of stress concentration, foreshocks and the nucleation of large earthquakes (Picozzi et al., 2019). In the last ten years, seismology has undergone a remarkable development. Evolution in data telemetry opened the new research field of real-time seismology (Kanamori, 2005), whose targets are the rapid determination of earthquake location and size, the timely implementation of emergency plans and, under favourable conditions, earthquake early warning. On the other hand, the availability of denser, high-quality seismic networks deployed near faults has made it possible to observe very large numbers of micro-to-small earthquakes, which is pushing the seismological community to look for novel big-data analysis strategies. Large earthquakes in Italy have the peculiar characteristic of being followed, within seconds to months, by large aftershocks of magnitude similar to the initial quake or even larger, demonstrating the complexity of the Apennines' fault system (Gentili and Giovanbattista, 2017). Picozzi et al. (2017) estimated the radiated seismic energy and seismic moment from P-wave signals for almost forty earthquakes with the largest magnitudes of the 2016-2017 Central Italy seismic sequence. Focusing on S-wave signals recorded by local networks, Bindi et al. (2018) analysed more than 1400 earthquakes in the magnitude range 2.5 ≤ Mw ≤ 6.5 that occurred in the same region from 2008 to 2017 and estimated both ER and M0, from which the energy magnitude (Me) and Mw were derived in order to investigate the impact of different magnitude scales on the aleatory variability associated with ground motion prediction equations. In this work, building on the first steps made in this direction by Picozzi et al. (2017) and Bindi et al. (2018), we derive a novel approach for the real-time, robust estimation of the seismic moment and radiated energy of small- to large-magnitude earthquakes recorded at local scales. In the first part of the work, we describe the procedure for extracting from the S-wave signals robust estimates of the peak displacement (PDS) and the cumulative squared velocity (IV2S). Then, exploiting a calibration data set of about 6000 earthquakes for which well-constrained M0 and theoretical ER values were available, we describe the calibration of empirical attenuation models. The coefficients and parameters obtained by calibration were then used to determine ER and M0 for a testing data set.
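
A minimal sketch of the two S-wave waveform features named above, the peak displacement (PDS) and the cumulative squared velocity (IV2S), computed here from a synthetic velocity record. The window length, sampling rate, and the placeholder ER and M0 values are assumptions, and the calibrated attenuation models of the paper are not reproduced:

```python
# Peak displacement (PDS) and cumulative squared velocity (IV2S) from an
# S-wave ground-velocity window; all numbers below are synthetic placeholders.
import numpy as np

fs = 100.0                                    # sampling rate (Hz, assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)            # 10 s S-wave window (assumed)
v = 1e-4 * np.exp(-0.3 * t) * np.sin(2 * np.pi * 5 * t)   # synthetic velocity (m/s)

# Peak displacement: integrate velocity to displacement, take the absolute maximum
d = np.cumsum(v) / fs
PDS = np.max(np.abs(d))

# Cumulative squared velocity over the same window
IV2S = np.sum(v ** 2) / fs

print(f"PDS  = {PDS:.3e} m")
print(f"IV2S = {IV2S:.3e} m^2/s")

# Scaled energy e = ER / M0, once ER and M0 have been derived from the
# calibrated attenuation relations (values below are placeholders)
ER, M0 = 1.0e10, 1.0e15        # J, N*m (placeholders)
print(f"scaled energy e = {ER / M0:.1e}")
```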


2021 ◽  
Vol 15 (8) ◽  
pp. e0009711
Author(s):  
Shuaibu Ahijo Abdullahi ◽  
Abdulrazaq Garba Habib ◽  
Nafiu Hussaini

A mathematical model is designed to assess the impact of some interventional strategies for curtailing the burden of snakebite envenoming in a community. The model is fitted to a real data set. Numerical simulations show that public health awareness of snakebite preventive measures among susceptible individuals could reduce the number of envenomings and prevent deaths and disabilities in the population. The simulations further reveal that if at least fifty percent of snakebite envenoming patients receive early treatment with antivenom, a substantial number of deaths will be averted. Furthermore, it is shown using optimal control that combining public health awareness and antivenom treatment averts the highest number of snakebite-induced deaths and disability-adjusted life years in the study area. To choose the best strategy amid limited resources in the study area, a cost-effectiveness analysis in terms of the incremental cost-effectiveness ratio is performed. It is established that combining public health awareness for susceptible individuals with antivenom treatment for victims of snakebite envenoming is the most cost-effective strategy. A sum of approximately US$72,548 is needed to avert the 117 deaths or 2,739 disability-adjusted life years recorded within 21 months in the study area. Thus, the combination of these two control strategies is recommended.
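
For orientation, the reported figures imply the following cost per outcome averted. This is a back-of-envelope reading of the abstract's numbers rather than the full incremental cost-effectiveness calculation, which compares each strategy against the next-best alternative:

```python
# Cost per death and per DALY averted for the combined strategy over the
# 21-month horizon, using only the figures reported in the abstract.
total_cost = 72_548          # US$
deaths_averted = 117
dalys_averted = 2_739

print(f"cost per death averted: US${total_cost / deaths_averted:,.0f}")   # ~US$620
print(f"cost per DALY averted:  US${total_cost / dalys_averted:,.2f}")    # ~US$26.49
```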


2018 ◽  
Vol 59 (4) ◽  
pp. 716-748
Author(s):  
Paul Seaborn ◽  
Tricia D. Olsen ◽  
Jason Howell

Corporate environmental performance has become a key focus of business leaders, policy makers, and scholars alike. Today, scholarship on environmental practice increasingly highlights how various aspects of corporate governance can influence environmental performance. However, the prior literature is inconclusive as to whether ownership by insiders (officers and directors) will have positive or negative environmental effects and whether insider voting control or equity control is more salient to environmental outcomes. This article leverages a unique empirical data set of dual-class firms, where insiders have voting rights disproportionate to their equity rights, to shed light on this question. We find that, on average, dual-class firms underperform their single-class peers on environmental measures and that the discrepancy comes from dual-class firms where insiders have more voting control, relative to their equity stake. While small increases in voting control are associated with improved environmental performance, too much (relative to insiders’ equity stake) worsens firms’ environmental performance. Insider equity control alone has no impact on environmental outcomes. Our findings have important implications for agency theory and environmental scholarship by identifying contingencies on the impact of voting and equity-based incentives. This research casts doubt on the idea that providing insiders with significant voting control will aid environmental performance.
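
An illustrative specification of the inverted-U relationship described above: the environmental score regressed on insider voting control and its square (the full study includes additional controls and firm data). Variable names and data here are hypothetical:

```python
# Quadratic specification: positive linear term and negative squared term
# produce the inverted-U pattern described in the abstract (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
voting = rng.uniform(0, 1, n)                                    # insider voting control share
env = 50 + 20 * voting - 25 * voting**2 + rng.normal(0, 5, n)    # synthetic inverted-U outcome
df = pd.DataFrame({"env_score": env, "voting_control": voting})

fit = smf.ols("env_score ~ voting_control + I(voting_control**2)", data=df).fit()
print(fit.params)   # expect a positive linear and a negative squared coefficient
```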


2014 ◽  
Vol 998-999 ◽  
pp. 873-877
Author(s):  
Zhen Bo Wang ◽  
Bao Zhi Qiu

To reduce the impact of irrelevant attributes on clustering results and increase the importance of relevant attributes, this paper proposes a fuzzy C-means clustering algorithm based on the coefficient of variation (CV-FCM). In the algorithm, the coefficient of variation is used to weight the attributes, assigning a different weight to each attribute in the data set, with the magnitude of the weight expressing the importance of that attribute to the clusters. In addition, because the fuzzy C-means algorithm is sensitive to the initial cluster center values, a method for selecting the initial cluster centers based on maximum distance is introduced on the basis of the coefficient-of-variation weights. Experiments on real data sets show that the algorithm selects cluster centers effectively, with clustering results superior to those of general fuzzy C-means clustering algorithms.
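
A minimal sketch of the two ingredients described above, under assumptions about the exact formulas used in CV-FCM: attribute weights from the coefficient of variation (standard deviation over mean, normalized to sum to one) and a maximum-distance selection of initial cluster centers. These weights would then enter the weighted distance inside the fuzzy C-means iterations, which are not reproduced here:

```python
# Coefficient-of-variation attribute weighting and max-distance initial centre
# selection; formulas are assumed, not taken verbatim from the paper.
import numpy as np

def cv_weights(X):
    """Weight each attribute by its coefficient of variation (assumed normalization)."""
    cv = np.std(X, axis=0) / (np.abs(np.mean(X, axis=0)) + 1e-12)
    return cv / cv.sum()

def max_distance_init(X, n_clusters, w):
    """Start from the point farthest from the weighted mean, then repeatedly add
    the point farthest (in weighted distance) from its nearest chosen centre."""
    d0 = np.sqrt(((X - X.mean(axis=0)) ** 2 * w).sum(axis=1))
    centres = [X[np.argmax(d0)]]
    while len(centres) < n_clusters:
        d = np.min([np.sqrt(((X - c) ** 2 * w).sum(axis=1)) for c in centres], axis=0)
        centres.append(X[np.argmax(d)])
    return np.array(centres)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(5, 1, (50, 3))])  # two synthetic clusters
w = cv_weights(X)
print("attribute weights:", np.round(w, 3))
print("initial centres:\n", max_distance_init(X, 2, w))
```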


2018 ◽  
Vol 11 (5) ◽  
pp. 2669-2681 ◽  
Author(s):  
P. Morten Hundt ◽  
Michael Müller ◽  
Markus Mangold ◽  
Béla Tuzson ◽  
Philipp Scheidegger ◽  
...  

Abstract. Detailed knowledge of the urban NO2 concentration field is a key element for obtaining accurate pollution maps and individual exposure estimates. These are required for improving the understanding of the impact of ambient NO2 on human health and for related air quality measures. However, city-scale NO2 concentration maps with high spatio-temporal resolution are still lacking, mainly due to the difficulty of accurately measuring NO2 at the required sub-ppb precision. We contribute to closing this gap through the development of a compact instrument based on mid-infrared laser absorption spectroscopy. Leveraging recent advances in infrared laser and detection technology and a novel circular absorption cell, we demonstrate the feasibility and robustness of this technique for demanding mobile applications. A fully autonomous quantum cascade laser absorption spectrometer (QCLAS) has been successfully deployed on a tram, performing long-term and real-time concentration measurements of NO2 in the city of Zurich (Switzerland). For ambient NO2 concentrations, the instrument demonstrated a precision of 0.23 ppb at 1 s time resolution and of 0.03 ppb after 200 s averaging. While the combined uncertainty estimated for the retrieved spectroscopic values was less than 5 %, laboratory intercomparison measurements with standard CLD instruments revealed a systematic NO2 wall loss of about 10 % within the laser spectrometer. For the field campaign, the QCLAS was referenced to a CLD using urban ambient air, despite the potential cross-sensitivity of the CLD to other nitrogen-containing compounds. This approach, however, allowed a direct comparison and continuous validation of the spectroscopic data against measurements at regulatory air quality monitoring (AQM) stations along the tram line. The analysis of the recorded high-resolution time series allowed us to gain more detailed insights into the spatio-temporal concentration distribution of NO2 in an urban environment. Furthermore, our results demonstrate that reliable city-scale concentration maps require a larger data set and better spatial coverage, e.g., by deploying more mobile and stationary instruments, to address the two main shortcomings of the current approach: (i) limited residence time close to sources with large short-term NO2 variations, and (ii) insufficient representativeness of the tram tracks for the complex urban environment.


2018 ◽  
Vol 2018 ◽  
pp. 1-12
Author(s):  
Li Fang ◽  
Xiwu Zhan ◽  
Christopher R. Hain ◽  
Jicheng Liu

Green vegetation fraction (GVF) is one of the input parameters of the Noah land surface model (LSM), the land component of a number of operational numerical weather prediction (NWP) models at the National Centers for Environmental Prediction (NCEP) of NOAA. The Noah LSM in current NCEP operational NWP models has been using static multiyear averages of monthly GVF derived from satellite observations of the NOAA Advanced Very High Resolution Radiometer (AVHRR) normalized difference vegetation index. The multiyear averages of GVF are evidently not representative of the actual conditions of the land surface vegetation cover. This study used a near-real-time (NRT) GVF data set generated from the 8-day composite of the leaf area index product from the Moderate Resolution Imaging Spectroradiometer (MODIS) to assess the impact of NRT GVF on off-line Noah LSM simulations and an NWP forecast model. Simulations of the off-line Noah LSM in the Land Information System (LIS) and weather forecasts of the NASA-Unified Weather Research and Forecasting model (NUWRF) were obtained using either the static multiyear average AVHRR GVF data set or the NRT MODIS GVF, while the meteorological forcing data and other settings were kept the same. The off-line simulations and WRF forecasts were then compared against in situ measurements or reanalysis products to assess the impact of using NRT GVF. Improvements in both soil moisture (SM) simulations and NUWRF forecasts of 2-meter air temperature, humidity, and precipitation were observed when using the NRT GVF data products. The RMSE in SM estimates from the off-line Noah model is reduced by around 1.0% (1.41%) during the green-up phase and by 1.48% (2.24%) over the senescence phase for the surface (root zone) SM simulations. Around 82.3% of the validation sites (out of 1178) showed a positive impact on the coupled WRF model with the insertion of NRT GVF.
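
Two commonly used conversions to green vegetation fraction, shown only to illustrate the kind of derivation involved; the exact algorithms behind the AVHRR NDVI climatology and the MODIS LAI-based NRT product may differ, and the endpoint and extinction values below are assumptions:

```python
# Illustrative GVF conversions: (1) NDVI scaling between bare-soil and
# dense-vegetation endpoints, (2) a Beer's-law style conversion from LAI.
import numpy as np

def gvf_from_ndvi(ndvi, ndvi_min=0.04, ndvi_max=0.52):
    """GVF = (NDVI - NDVI_min) / (NDVI_max - NDVI_min), clipped to [0, 1]; endpoints assumed."""
    return np.clip((ndvi - ndvi_min) / (ndvi_max - ndvi_min), 0.0, 1.0)

def gvf_from_lai(lai, k=0.5):
    """GVF = 1 - exp(-k * LAI); k is an assumed light-extinction coefficient."""
    return 1.0 - np.exp(-k * np.asarray(lai))

print(gvf_from_ndvi(np.array([0.1, 0.3, 0.6])))
print(gvf_from_lai([0.5, 2.0, 5.0]))
```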


2018 ◽  
Author(s):  
Michal Kačmařík ◽  
Jan Douša ◽  
Florian Zus ◽  
Pavel Václavovic ◽  
Kyriakos Balidakis ◽  
...  

Abstract. An analysis of the impact of processing settings on estimated tropospheric gradients is presented. The study is based on the benchmark data set collected within the COST GNSS4SWEC action, with observations from 430 GNSS reference stations in central Europe for May and June 2013. Tropospheric gradients were estimated in eight different variants of GNSS data processing using Precise Point Positioning with the G-Nut/Tefnut software. The impacts of the gradient mapping function, elevation cut-off angle, GNSS constellation and real-time versus post-processing mode were assessed by comparing the variants with each other and by evaluating them with respect to tropospheric gradients derived from two numerical weather prediction models. Generally, all the solutions in post-processing mode provided robust tropospheric gradient estimates with a clear relation to real weather conditions. The quality of tropospheric gradient estimates in real-time mode depends mainly on the actual quality of the real-time orbits and clocks. The best results were achieved using a 3° elevation cut-off angle and a combined GPS + GLONASS constellation. Systematic effects of up to 0.3 mm were observed in estimated tropospheric gradients when using different gradient mapping functions; these effects depend on the applied elevation-dependent observation weighting. While latitudinal tilting of the troposphere causes a systematic difference in the north gradient component on a global scale, large local wet gradients pointing in the direction of increased humidity cause systematic differences in both gradient components depending on the gradient direction.
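
For context, a sketch of how horizontal gradients typically enter the slant delay model, using a Chen and Herring style gradient mapping function m(e) = 1 / (sin(e)*tan(e) + C). The constant C and the gradient values below are illustrative; the paper compares several gradient mapping functions:

```python
# Azimuth-dependent gradient contribution to the slant delay with a
# Chen & Herring style mapping function; C and the gradient values are illustrative.
import numpy as np

def gradient_mapping(elev_deg, C=0.0032):
    e = np.radians(elev_deg)
    return 1.0 / (np.sin(e) * np.tan(e) + C)

def gradient_delay(elev_deg, azim_deg, G_N, G_E, C=0.0032):
    """Contribution of north/east gradients G_N, G_E (same units as the result)."""
    a = np.radians(azim_deg)
    return gradient_mapping(elev_deg, C) * (G_N * np.cos(a) + G_E * np.sin(a))

# Example: a 1 mm north gradient maps to roughly 92 mm of slant delay
# when observed at 5 degrees elevation toward the north.
print(f"{gradient_delay(5.0, 0.0, G_N=1.0, G_E=0.0):.1f} mm")
```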


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Ivana Vukicevic Bisevac ◽  
Natasa Vidic ◽  
Katarina Vukadinovic

This study focused on vital resources at port container terminals, namely quay cranes and dockworkers. We studied the impact of incorporating the dockworker assignment problem (DWAP) into the quay crane assignment problem (QCAP). The aim of this study was to formulate and solve an integrated model for the QCAP and DWAP, with the objective of minimizing the total cost of dockworkers by optimizing the workers' assignment, such that the ships' costs due to time spent in the port are not increased. We proposed an integrated solution approach to the studied problem. The proposed model was validated on an adequate number of instances based on real data. The obtained solutions were compared with those obtained by the traditional sequential approach. For all solved instances, the proposed integrated approach resulted in a reduction in the total cost of dockworkers. The major contribution of this study is that these two problems have been modeled together for the first time. The obtained results show significant savings in the overall costs.
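
A highly simplified sketch of the kind of integrated assignment model described above, written with PuLP: dockworkers are assigned to quay cranes at minimum labour cost while each crane's required staffing is covered. The full model also links crane assignments to vessel handling times, and all data below are illustrative:

```python
# Toy dockworker-to-crane assignment as a binary programme (PuLP + CBC solver).
import pulp

workers = ["w1", "w2", "w3", "w4"]
cranes = ["qc1", "qc2"]
cost = {("w1", "qc1"): 10, ("w1", "qc2"): 12,
        ("w2", "qc1"): 11, ("w2", "qc2"): 9,
        ("w3", "qc1"): 13, ("w3", "qc2"): 10,
        ("w4", "qc1"): 9,  ("w4", "qc2"): 14}
required = {"qc1": 2, "qc2": 2}   # workers needed per crane (illustrative)

prob = pulp.LpProblem("dockworker_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (workers, cranes), cat="Binary")

# Objective: total labour cost of the assignment
prob += pulp.lpSum(cost[w, c] * x[w][c] for w in workers for c in cranes)
for w in workers:                                   # each worker serves at most one crane
    prob += pulp.lpSum(x[w][c] for c in cranes) <= 1
for c in cranes:                                    # each crane gets its required crew
    prob += pulp.lpSum(x[w][c] for w in workers) == required[c]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], pulp.value(prob.objective))
```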

