Input data needed for a risk model for the entry, establishment and spread of a pathogen (Phomopsis vaccinii) of blueberries and cranberries in the EU

2018 ◽  
Vol 172 (2) ◽  
pp. 126-147 ◽  
Author(s):  
A.H.C. van Bruggen ◽  
J.S. West ◽  
W. van der Werf ◽  
R.P.J. Potting ◽  
C. Gardi ◽  
...  
2020 ◽  
Vol 9 (2) ◽  
pp. 121 ◽  
Author(s):  
Kavisha Kumar ◽  
Hugo Ledoux ◽  
Richard Schmidt ◽  
Theo Verheij ◽  
Jantien Stoter

This paper presents our implementation of a harmonized data model for noise simulations in the European Union (EU). Different EU member states (MS) use different noise assessment methods for estimating noise at local, regional, and national scales. These methods, along with the input data extracted from national registers and databases as well as other open and/or commercially available data, differ in several respects, making it difficult to obtain comparable results across the EU. To address this issue, the European Commission’s (EC) Joint Research Centre (JRC) developed a common framework for noise assessment methods (CNOSSOS-EU). However, apart from the software implementations of CNOSSOS, very little has been done on practical guidelines specifying the required input data, metadata, and schema design for testing real-world situations with CNOSSOS. We describe our approach to modeling input and output data for noise simulations and also generate a real-world dataset of an area in the Netherlands, based on our data model, for simulating urban noise using CNOSSOS.
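The kind of input objects such a harmonized model must cover (roads, buildings, receivers) can be sketched as a minimal schema. This is an illustrative Python sketch only — it is not the authors' data model or the official CNOSSOS-EU schema; all class and field names are assumptions, and the 4 m receiver height merely reflects the usual assessment-height default.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical, minimal sketch of the input objects a CNOSSOS-style
# road-noise simulation needs; names are illustrative, not an official schema.

@dataclass
class RoadSegment:
    geometry: List[Tuple[float, float]]   # 2D polyline (x, y) in metres
    vehicles_per_hour: float              # light-vehicle traffic flow
    speed_kmh: float                      # representative vehicle speed

@dataclass
class Building:
    footprint: List[Tuple[float, float]]  # closed polygon (x, y) in metres
    height_m: float                       # used for screening/reflection

@dataclass
class Receiver:
    x: float
    y: float
    z: float = 4.0                        # common default assessment height

@dataclass
class NoiseStudyArea:
    roads: List[RoadSegment] = field(default_factory=list)
    buildings: List[Building] = field(default_factory=list)
    receivers: List[Receiver] = field(default_factory=list)
```

Keeping every input in one typed container like this is one way to make datasets from different member states comparable before they are fed to the simulation.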


Author(s):  
Gunnar Weigold ◽  
Colin Argent ◽  
John Healy ◽  
Ian Diggory

ROSEN, together with MACAW Engineering Ltd., has developed a Risk Assessment Tool that can be applied to both piggable and un-piggable pipelines. The Risk Model is structured to answer three basic questions relating to pipeline integrity:
• What threats are active on the pipeline?
• Will the active threats result in a leak or a rupture?
• What is the company liability (cost) in the event of a failure?
The risk assessment criteria on which the model is based are taken from codes and technical papers that have become accepted as industry norms. The Risk Model itself is semi-quantitative and is based on input data that operators should have for all pipelines. The results of the risk assessment provide an objective identification of active threats to pipeline integrity and a first-level benchmarking of the operator's procedures against industry best practice. The paper presents this fast and robust Risk Assessment Approach and illustrates its application with examples in which it was used to identify and prioritize active threat mechanisms, so that maintenance expenditure can be optimized for effective preservation of pipeline integrity.
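The three questions above suggest the general shape of a semi-quantitative scoring scheme: rate each threat's likelihood on an index scale, pair it with a consequence cost, and rank. The sketch below is a generic illustration under invented scales and threat names; it is not the ROSEN/MACAW model.

```python
# Illustrative semi-quantitative risk scoring in the spirit of the three
# questions above. The 1-5 likelihood scale and the threat names are
# invented for this sketch, not taken from the actual Risk Model.

def risk_score(likelihood: int, consequence_cost: float) -> float:
    """Score one threat: likelihood index (1..5) times liability cost."""
    if not 1 <= likelihood <= 5:
        raise ValueError("likelihood index must be in 1..5")
    return likelihood * consequence_cost

def prioritise(assessments: dict) -> list:
    """Rank threats so maintenance spend targets the highest risk first.

    `assessments` maps a threat name to (likelihood, consequence_cost).
    """
    return sorted(assessments,
                  key=lambda t: risk_score(*assessments[t]),
                  reverse=True)
```

For example, `prioritise({"external_corrosion": (5, 100.0), "ground_movement": (1, 100.0)})` puts the high-likelihood threat first.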


2014 ◽  
Vol 35 (2) ◽  
pp. 233-248 ◽  
Author(s):  
Anna Skorek-Osikowska ◽  
Łukasz Bartela ◽  
Janusz Kotowicz

Abstract The paper presents the basic input data and modelling results of an IGCC system with and without a membrane CO2 capture installation. The models were built using commercial software (Aspen and GateCycle) and with the use of the authors' own computational codes. The main parameters of the systems were calculated, such as gross and net power, the auxiliary power of individual installations, and efficiencies. The models were then used for economic and ecological analysis of the systems, applying the Break Even Point method. The calculations took into account the EU emissions trading scheme. A sensitivity analysis of the influence of selected quantities on the break-even price of electricity was performed.
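A Break Even Point analysis of this kind reduces, in its simplest textbook form, to finding the electricity price at which annual revenue covers annualised cost, including the cost of emission allowances. The sketch below uses generic cost categories and a standard capital-recovery-factor annualisation; it is an assumption-laden illustration, not the authors' actual cost model.

```python
def capital_recovery_factor(rate: float, years: int) -> float:
    """Standard annuity factor that spreads capital expenditure over
    `years` at discount rate `rate`: r(1+r)^n / ((1+r)^n - 1)."""
    f = (1.0 + rate) ** years
    return rate * f / (f - 1.0)

def break_even_price(capex: float, crf: float, om_cost: float,
                     fuel_cost: float, co2_tonnes: float,
                     co2_allowance_price: float, net_mwh: float) -> float:
    """Electricity price (per MWh) at which annual revenue equals annual
    cost. The allowance term reflects an emissions-trading scheme where
    each tonne of CO2 emitted carries an allowance price."""
    annual_cost = (capex * crf + om_cost + fuel_cost
                   + co2_tonnes * co2_allowance_price)
    return annual_cost / net_mwh
```

A sensitivity analysis like the one in the paper then amounts to sweeping one input (e.g. the allowance price) while holding the others fixed and recording how the break-even price moves.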


2021 ◽  
Vol 21 (4) ◽  
pp. 193-207
Author(s):  
Zbigniew Binderman ◽  
Bolesław Borkowski ◽  
Wiesław Szczesny ◽  
Rafał Zbyrowski

The problem of building a stable synthetic index for ordering objects described by multiple partial indexes has been, and remains, the subject of the authors' research. We aimed to construct a measure whose results would not depend on the preparation of the input data: the method of normalization of the variables, the choice of the distance (similarity) measure, or the selection of features. Our experience is consistent with the results of the outstanding American statistician L. Breiman: a single classifier (synthetic measure) may be far from optimal, while a combination of many yields a classifier that is close to optimal and stable. Unfortunately, when "weak" classifiers are used, the combination may result in an even worse classifier. In this work, we present the consequences of the choice of synthetic indicator for ordering objects described by many features, using the practical example of farms in the EU countries covered by the FADN survey.
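The ensemble idea credited to Breiman above can be illustrated with a toy aggregation: instead of trusting one synthetic measure, order objects by their mean rank across several measures, so no single normalization or distance choice dominates. This is a generic sketch of rank aggregation, not the authors' construction.

```python
from collections import defaultdict

def combined_ranking(rankings):
    """Aggregate several rankings into one stable ordering.

    `rankings` is a list of rankings, each a list of object labels from
    best to worst (e.g. one ranking per partial synthetic measure).
    Objects are re-ordered by their mean rank position, the simplest
    ensemble-style combination.
    """
    positions = defaultdict(list)
    for ranking in rankings:
        for i, obj in enumerate(ranking):
            positions[obj].append(i)
    return sorted(positions, key=lambda o: sum(positions[o]) / len(positions[o]))
```

With three partial measures ranking objects `["a","b","c"]`, `["a","c","b"]`, and `["b","a","c"]`, the mean-rank combination puts `a` first even though no single ranking is decisive about second place.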


Author(s):  
R.A. Ploc ◽  
G.H. Keech

An unambiguous analysis of transmission electron diffraction effects requires two samplings of the reciprocal lattice (RL). However, extracting definitive information from the patterns is difficult, even for a general orthorhombic case. The usual procedure has been to deduce the approximate variables controlling the formation of the patterns from qualitative observations. Our present purpose is to illustrate two applications of a computer programme written for the analysis of transmission selected area diffraction (SAD) patterns: the study of RL spot shapes and of epitaxy. When a specimen contains fine structure, the RL spots become complex shapes with extensions in one or more directions. If the number and directions of these extensions can be estimated from an SAD pattern, the exact spot shape can be determined by a series of refinements of the computer input data.
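The "series of refinements of the computer input data" can be sketched, very generically, as a loop that perturbs each input parameter and keeps the changes that reduce the mismatch between the simulated and observed patterns. The diffraction calculation itself is stood in for by a caller-supplied function; nothing here reproduces the authors' programme.

```python
def refine(simulate, observed, params, step=0.1, iters=50):
    """Generic successive-refinement loop in the spirit described above.

    `simulate(params)` stands in for the diffraction calculation and must
    return values comparable with `observed`; each parameter is nudged by
    `step` in both directions and a change is kept only if it lowers the
    sum-of-squares mismatch. The step is halved whenever a full pass
    brings no improvement.
    """
    def mismatch(p):
        return sum((s - o) ** 2 for s, o in zip(simulate(p), observed))

    best = mismatch(params)
    for _ in range(iters):
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                m = mismatch(trial)
                if m < best:
                    params, best, improved = trial, m, True
        if not improved:
            step /= 2.0
    return params
```

In the spot-shape application described above, `params` would encode the estimated number and directions of the RL spot extensions, refined until the simulated pattern matches the observed SAD pattern.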


2007 ◽  
Vol 6 (1) ◽  
pp. 46-46
Author(s):  
L. Frankenstein ◽  
L. Ingle ◽  
A. Remppis ◽  
D. Schellberg ◽  
C. Sigg ◽  
...  

2013 ◽  
Author(s):  
Rinus van Schendelen