Exploring sensor data for agricultural statistics: The fruit is not hanging as low as we thought

2021 ◽  
pp. 1-14
Author(s):  
Ger Snijkers ◽  
Tim Punt ◽  
Sofie De Broe ◽  
José Gómez Pérez

New business processes are increasingly data driven as sensors have become ubiquitous. Sensor data could be a valuable new data source for official statistics. To test this presumption, Statistics Netherlands conducted a small-scale use case in the area of agricultural statistics in collaboration with an innovative farmer. A selection of his sensor data was explored for overlap with current data demands in surveys. The aim of the study was to obtain insights into the available agricultural data, their structure and quality, and to explore new methods of data collection for agricultural statistics. The conclusion is that these data are valuable for replacing or pre-filling (parts of) certain agricultural surveys. However, many more challenges surfaced than expected, as the title of this paper suggests. These challenges are discussed in this paper.

Author(s):  
Nawfal El Moukhi ◽  
Ikram El Azami ◽  
Abdelaaziz Mouloudi ◽  
Abdelali Elmounadi

The data warehouse design is currently recognized as the most important and complicated phase in any decision support system implementation project. Its complexity is primarily due to the proliferation of data source types and the lack of a standardized, well-structured design method, hence the increasing interest from researchers who have tried to develop new methods for automating and standardizing this critical stage of the project. In this paper, the authors review the set of existing methods that follow the data-driven paradigm, and they propose a new data-driven method called X-ETL. This method aims to automate the data warehouse design by generating star models from relational data. It is mainly based on a set of rules derived from the related works, the Model-Driven Architecture (MDA) and the XML language.
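A minimal sketch of the rule-based idea described in this abstract. The specific rule and threshold below are illustrative assumptions, not the authors' actual X-ETL rule set: tables that reference several others via foreign keys are treated as fact-table candidates, and the tables they reference become dimensions.

```python
# Illustrative rule-based star-model derivation from relational metadata.
# Assumption (not the published X-ETL rules): a table with >= fact_threshold
# foreign keys is a fact candidate; its referenced tables are dimensions.

def derive_star_models(foreign_keys, fact_threshold=2):
    """foreign_keys: mapping of table name -> set of tables it references."""
    models = {}
    for table, refs in foreign_keys.items():
        if len(refs) >= fact_threshold:   # rule: many FKs -> fact table
            models[table] = sorted(refs)  # referenced tables -> dimensions
    return models

schema = {
    "sales": {"customer", "product", "date"},
    "customer": set(),
    "product": {"category"},
}
print(derive_star_models(schema))  # → {'sales': ['customer', 'date', 'product']}
```

In a real data-driven pipeline this rule would operate on catalog metadata extracted from the source database, and the resulting star models would be serialized (here, per the abstract, via XML) for the ETL generation step.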


2020 ◽  
Vol 114 (2) ◽  
pp. 1501-1517
Author(s):  
Ana Koren ◽  
Marko Jurčević ◽  
Ramjee Prasad
Keyword(s):  
Data Use ◽  


Earth ◽  
2021 ◽  
Vol 2 (4) ◽  
pp. 1006-1037
Author(s):  
Diana Contreras ◽  
Sean Wilkinson ◽  
Philip James

Earthquakes are one of the most catastrophic natural phenomena. After an earthquake, earthquake reconnaissance enables effective recovery by collecting data on building damage and other impacts. This paper aims to identify state-of-the-art data sources for building damage assessment and provide guidance for more efficient data collection. We have reviewed 39 articles that indicate the sources used by different authors to collect data related to damage and post-disaster recovery progress after earthquakes between 2014 and 2021. The current data collection methods have been grouped into seven categories: fieldwork or ground surveys, omnidirectional imagery (OD), terrestrial laser scanning (TLS), remote sensing (RS), crowdsourcing platforms, social media (SM) and closed-circuit television videos (CCTV). The selection of a particular data source or collection technique for earthquake reconnaissance includes different criteria depending on what questions are to be answered by these data. We conclude that modern reconnaissance missions cannot rely on a single data source. Different data sources should complement each other, validate collected data or systematically quantify the damage. The recent increase in the number of crowdsourcing and SM platforms used to source earthquake reconnaissance data demonstrates that this is likely to become an increasingly important data source.


2005 ◽  
Vol 7 ◽  
pp. 1-7
Author(s):  
Kai Sørensen

The Review of Survey activities presents a selection of 18 papers reflecting the wide spectrum of activities of the Geological Survey of Denmark and Greenland, from the microbial to the plate tectonic level.
Activities in Denmark: The Survey's activities in Denmark are documented by 11 papers. The main themes are petroleum- and groundwater-related topics and Quaternary geology, but neotectonics of the Baltic Shield and new methods in provenance studies of sandstones are also touched upon.
Activities in Greenland: The Survey's activities in Greenland and the North Atlantic are covered by six articles focusing on climate research, the mineral potential of the Precambrian basement terranes in West Greenland, and the possibility of exploiting dimension stones.
Other countries: During 2004, the Survey carried out work in more than 20 countries outside Denmark, Greenland and the Faroe Islands. In this report a project on developing small-scale mining in Mongolia and Kyrgyzstan is described.


2020 ◽  
Vol 23 (11) ◽  
pp. 1269-1290
Author(s):  
A.A. Turgaeva

Subject. This article analyzes the business processes in an insurance company by examining how they operate and identifying key areas of activity. Objectives. The article aims to describe certain business processes in insurance, highlighting participants, lines of activity, and the sequence of procedures. It analyzes the Settlement of Losses business process, one of the most significant business processes in an insurance company. Methods. For the study, I used the methods of induction and deduction, analogy, and the systems approach. Results. Based on the analysis and description of business processes in the insurance company and the identification of key elements and steps in terms of the effectiveness of decisions, the article identifies the Entry and Exit checkpoints, activity directions, and resources of the Settlement of Losses process. Conclusions. Applying the categories that split business processes makes it possible to develop step-by-step regulation for all processes and acceptable control procedures for different operations. The checkpoints presented at different steps of the business process will help identify weaknesses and eliminate them through re-checks at those points.


2007 ◽  
Vol 158 (8) ◽  
pp. 235-242 ◽  
Author(s):  
Hans Rudolf Heinimann

The term «precision forestry» was first introduced and discussed at a conference in 2001. The aims of this paper are to explore the scientific roots of the precision concept, define «precision forestry», and sketch the challenges that the implementation of this new concept may present to practitioners, educators, and researchers. The term «precision» does not mean accuracy on a small scale, but instead refers to the concurrent coordination and control of processes at spatial scales between 1 m and 100 km. Precision strives for an automatic control of processes. Precision land use differs from precision engineering by the requirements of gathering, storing and managing spatio-temporal variability of site and vegetation parameters. Practitioners will be facing the challenge of designing holistic, standardized business processes that are valid for whole networks of firms, and that follow available standards (e.g., SCOR, WoodX). There is a need to educate and train forestry professionals in the areas of business process re-engineering, computer supported management of business transactions, methods of remote sensing, sensor technology and control theory. Researchers will face the challenge of integrating plant physiology, soil physics and production sciences and solving the supply chain coordination problem (SCCP).


2020 ◽  
Vol 22 (4) ◽  
pp. 27-36
Author(s):  
OLGA A. TOLPEGINA ◽  
EKATERINA I. RUDENKO
The article proposes a methodology for assessing the innovative activity of a company, one of the value areas of state corporations: «Innovation, innovative development, the ability to upgrade». To evaluate effectiveness, the global goal was decomposed into individual specific tasks according to designated functional subsystems and assessment objects (blocks). Together these give a generalized description of technological and technical innovations, their development and use; the implementation of the latest digital information technologies; the results of intellectual research; the development of new business processes, management methods and organizational forms in business practice; and the company's capacity for sustainable renewal, improvement and innovative growth. The scoring methodology uses criterion bands of efficiency ranging from ambitious to low, with significance weights assigned by expert judgment, and includes six to fifteen traditional and composite author's indicators in each assessment block; the number of indicators reflects the complexity of the subject of the study and the process being described. The methodology is universal in nature: it can be applied to large corporations as well as to small companies with a reduced set of indicators, and it can be used in determining ratings.
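The scoring scheme this abstract describes can be sketched numerically. Everything concrete below is a hypothetical illustration: the block names, indicator scores, expert weights and band thresholds are assumptions, not values from the article; only the structure (indicator scores per block, expert weights, criterion bands from low to ambitious efficiency) follows the text.

```python
# Illustrative weighted block scoring with criterion bands.
# Block names, weights, scores and band thresholds are hypothetical.

def innovation_score(blocks, weights):
    """blocks: block name -> list of indicator scores; weights sum to 1."""
    return sum(weights[name] * (sum(scores) / len(scores))
               for name, scores in blocks.items())

def efficiency_band(score, bands=((8, "ambitious"), (5, "medium"), (0, "low"))):
    """Map a score onto criterion bands, highest threshold first."""
    for threshold, label in bands:
        if score >= threshold:
            return label

blocks = {"technology": [7, 8, 6], "digital": [5, 6], "renewal": [9, 7]}
weights = {"technology": 0.5, "digital": 0.3, "renewal": 0.2}
score = innovation_score(blocks, weights)
print(round(score, 2), efficiency_band(score))
```

Reducing the indicator set for a small company, as the abstract allows, only changes the score lists per block; the aggregation and banding stay the same.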


Author(s):  
Laure Fournier ◽  
Lena Costaridou ◽  
Luc Bidaut ◽  
Nicolas Michoux ◽  
Frederic E. Lecouvet ◽  
...  

Existing quantitative imaging biomarkers (QIBs) are associated with known biological tissue characteristics and follow a well-understood path of technical, biological and clinical validation before incorporation into clinical trials. In radiomics, novel data-driven processes extract numerous visually imperceptible statistical features from the imaging data with no a priori assumptions on their correlation with biological processes. The selection of relevant features (the radiomic signature) and incorporation into clinical trials therefore require additional considerations to ensure meaningful imaging endpoints. Also, the number of radiomic features tested means that power calculations would result in sample sizes impossible to achieve within clinical trials. This article examines how the process of standardising and validating data-driven imaging biomarkers differs from those based on biological associations. Radiomic signatures are best developed initially on datasets that represent diversity of acquisition protocols as well as diversity of disease and of normal findings, rather than within clinical trials with standardised and optimised protocols, as this would risk the selected radiomic features being linked to the imaging process rather than the pathology. Normalisation through discretisation and feature harmonisation are essential pre-processing steps. Biological correlation may be performed after the technical and clinical validity of a radiomic signature is established, but is not mandatory. Feature selection may be part of discovery within a radiomics-specific trial or represent exploratory endpoints within an established trial; a previously validated radiomic signature may even be used as a primary/secondary endpoint, particularly if associations are demonstrated with specific biological processes and pathways being targeted within clinical trials.
Key Points
• Data-driven processes like radiomics risk false discoveries due to the high dimensionality of the dataset compared to the sample size, making adequate diversity of the data, cross-validation and external validation essential to mitigate the risks of spurious associations and overfitting.
• Use of radiomic signatures within clinical trials requires multistep standardisation of image acquisition, image analysis and data mining processes.
• Biological correlation may be established after clinical validation but is not mandatory.
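One of the pre-processing steps this abstract names, normalisation through discretisation, is commonly done in radiomics by binning grey-level intensities. The sketch below is an assumption-laden illustration, not the authors' pipeline: it uses fixed-bin-width discretisation over a plain list of values, whereas real pipelines operate on image arrays and typically follow IBSI-style definitions.

```python
# Illustrative fixed-bin-width intensity discretisation, a common radiomics
# pre-processing step. The bin width and the pure-Python list form are
# assumptions for this sketch, not the article's actual method.
import math

def discretise(intensities, bin_width=25.0):
    """Map grey-level intensities to 1-based integer bin indices,
    anchored at the minimum observed intensity."""
    lo = min(intensities)
    return [int(math.floor((v - lo) / bin_width)) + 1 for v in intensities]

voxels = [12.0, 30.0, 55.0, 99.0, 100.0]
print(discretise(voxels))  # → [1, 1, 2, 4, 4]
```

Discretising onto a common grey-level scale is one way to make texture features comparable across the diverse acquisition protocols the article recommends for signature development; feature harmonisation methods then address residual scanner effects.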

