A comparison of tetrad and triangle test: case study on sweetener products using consumer panels

Author(s):  
D R Adawiyah ◽  
L Guntari ◽  
V S Smaratika ◽  
Lince
2021 ◽  
Vol 13 (12) ◽  
pp. 6879
Author(s):  
Hassan P. Ebrahimi ◽  
R. Sandra Schillo ◽  
Kelly Bronson

This study provides a model that supports systematic stakeholder inclusion in agricultural technology. Building on the Responsible Research and Innovation (RRI) literature and attempting to add precision to the conversation around inclusion in technology design and governance, this study develops a framework for determining which stakeholder groups to engage in RRI processes. We developed the model using a specific industry case study: identifying the relevant stakeholders in the Canadian digital agriculture ecosystem. The study uses literature and news article analysis to map stakeholders in the Canadian digital agricultural sector as a test case for the model. The study proposes a systematic framework which categorises stakeholders into individual, industrial, and societal groups with both direct engagement and supportive roles in digital agriculture. These groups are then plotted against three levels of impact or power in the agri-food system: micro, meso and macro.


2016 ◽  
Vol 25 (02) ◽  
pp. 1650027 ◽  
Author(s):  
Giovanni Amelino-Camelia ◽  
Giulia Gubitosi ◽  
Giovanni Palmisano

Several arguments suggest that the Planck scale could be the characteristic scale of curvature of momentum space. As in other recent studies, we assume that the metric of momentum space determines the condition of on-shellness, while the momentum-space affine connection governs the form of the law of composition of momenta. We show that the possible choices of laws of composition of momenta are more numerous than the possible choices of affine connection on a momentum space. This motivates us to propose a new prescription for associating an affine connection to momentum composition, which we compare to the one most used in the recent literature. We find that the two prescriptions lead to the same picture of the so-called κ-momentum space, with de Sitter (dS) metric and κ-Poincaré connection. We then show that in the case of “proper dS momentum space”, with the dS metric and its Levi–Civita connection, the two prescriptions are inequivalent. Our novel prescription leads to a picture of proper dS momentum space which is DSR-relativistic and is characterized by a commutative law of composition of momenta, a possibility for which no explicit curved-momentum-space picture had previously been found. This momentum space can serve as a laboratory for the exploration of the properties of DSR-relativistic theories which are not connected to group-manifold momentum spaces and Hopf algebras, and is a natural test case for the study of momentum spaces with commutative, and yet deformed, laws of composition of momenta.
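To make the notion of a deformed, noncommutative composition law concrete, here is a minimal sketch of the standard κ-Poincaré composition in the bicrossproduct basis (a known textbook form, not the paper's new prescription); the deformation scale `ELL` is an illustrative value, not a physical one.

```python
import math

# Deformation scale ell ~ 1/kappa; an artificially large value so the
# deformation is visible in the printed numbers.
ELL = 0.1

def compose(p, q):
    """kappa-Poincare composition (p ⊕ q) in 1+1 dimensions:
    energies add linearly, while the spatial momentum of q is
    rescaled by exp(-ell * p0)."""
    p0, p1 = p
    q0, q1 = q
    return (p0 + q0, p1 + math.exp(-ELL * p0) * q1)

p, q = (1.0, 0.5), (2.0, -0.3)
print(compose(p, q))
print(compose(q, p))  # differs from compose(p, q): the law is noncommutative
```

The commutative-yet-deformed law found in the paper for proper dS momentum space is precisely what this familiar example lacks: here swapping the arguments changes the result.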


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Ali M. Alakeel

Program assertions have been recognized as a supporting tool during software development, testing, and maintenance. Therefore, software developers place assertions within their code in positions that are considered to be error prone or that have the potential to lead to a software crash or failure. Like any other software, programs with assertions must be maintained. Depending on the type of modification applied to the program, assertions might also have to undergo some modifications. New assertions may also be introduced in the new version of the program, while some assertions can be kept the same. This paper presents a novel approach, based on fuzzy logic, for test case prioritization during regression testing of programs that contain assertions. The main objective of this approach is to prioritize the test cases according to their estimated potential for violating a given program assertion. To develop the proposed approach, we utilize fuzzy logic techniques to estimate the effectiveness of a given test case in violating an assertion, based on the history of the test cases in previous testing operations. We have conducted a case study in which the proposed approach is applied to various programs, and the results are promising compared to untreated and randomly ordered test cases.
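A minimal sketch of the general idea, with the membership function, thresholds, and test data all invented for illustration (the paper's actual fuzzy model is not reproduced here): rank test cases by a fuzzy estimate of their assertion-violation potential derived from violation history.

```python
def membership_high(rate):
    """Triangular fuzzy membership in 'high violation potential',
    rising linearly from 0 at rate=0.3 to 1 at rate=0.8
    (breakpoints are illustrative assumptions)."""
    if rate <= 0.3:
        return 0.0
    if rate >= 0.8:
        return 1.0
    return (rate - 0.3) / 0.5

def prioritize(history):
    """history: {test_name: (violations, runs)} -> test names ordered
    by decreasing fuzzy 'high potential' score."""
    scores = {t: membership_high(v / r) for t, (v, r) in history.items()}
    return sorted(scores, key=scores.get, reverse=True)

history = {"t1": (1, 10), "t2": (7, 10), "t3": (4, 10)}
print(prioritize(history))  # ['t2', 't3', 't1']
```

Tests with a richer violation history float to the front of the regression suite, so a violating run is likely to be found earlier.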


Author(s):  
Juan C. Burguillo-Rial ◽  
Manuel J. Fernández-Iglesias ◽  
Francisco J. González-Castaño ◽  
Martín Llamas-Nistal

2015 ◽  
Vol 12 (12) ◽  
pp. 13217-13256 ◽  
Author(s):  
G. Formetta ◽  
G. Capparelli ◽  
P. Versace

Abstract. Rainfall-induced shallow landslides cause loss of life and significant damage to private and public property, transportation systems, etc. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Two main approaches are usually used to accomplish this task: statistical and physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify, and compare different models and different model performance indicators, in order to identify and select the models whose behavior is most reliable for a given case study. The procedure was implemented in a package of models for landslide susceptibility analysis and integrated into the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing model results and measured data pixel by pixel. Moreover, the integration of the package in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimization of the distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for our test case.
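The D2PC index mentioned above has a simple geometric meaning: the Euclidean distance in the ROC plane from a model's (FPR, TPR) point to the perfect-classification corner (0, 1). A short sketch, with invented confusion counts (not the study's data):

```python
import math

def d2pc(tp, fn, fp, tn):
    """Distance to perfect classification in the ROC plane:
    sqrt((1 - TPR)^2 + FPR^2); 0 means a perfect classifier."""
    tpr = tp / (tp + fn)   # true positive rate (sensitivity)
    fpr = fp / (fp + tn)   # false positive rate
    return math.sqrt((1.0 - tpr) ** 2 + fpr ** 2)

# Pixel-by-pixel confusion counts from a hypothetical susceptibility map:
print(d2pc(tp=80, fn=20, fp=10, tn=90))
```

Minimizing D2PC during calibration therefore pushes the model toward high sensitivity and low false-alarm rate simultaneously.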


2017 ◽  
Vol 5 (2) ◽  
Author(s):  
Nurul Priyantari ◽  
Supriyadi ◽
Devi Putri Sulistiani ◽  
Winda Aprita Mayasari

2D geoelectrical resistivity measurements and a direct shear test were conducted to determine soil type and soil strength at the Istana Tidar Regency housing settlement, Jember. Resistivity was measured along two lines located at 08.10’102”–08.10’108” S, 113.43’404”–113.43’408” E (line 1) and 08.10’102”–08.10’108” S, 113.43’410”–113.43’414” E (line 2). Soil specimens were taken at three points: two on line 1 and one on line 2. Based on the results of the 2D geoelectrical resistivity measurements and the direct shear test, this location is dominated by clay, silt, and sandy silt, which are classified as cohesive soils. Soil of this type is strong enough to support light building construction of one or two floors.
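Direct shear results are conventionally interpreted through the Mohr–Coulomb criterion, τ = c + σ·tan(φ). A small sketch; the cohesion and friction angle below are illustrative placeholders, not the values measured in this study:

```python
import math

def shear_strength(sigma_kpa, cohesion_kpa, phi_deg):
    """Mohr-Coulomb shear strength (kPa) at normal stress sigma_kpa,
    given cohesion c and internal friction angle phi."""
    return cohesion_kpa + sigma_kpa * math.tan(math.radians(phi_deg))

# Hypothetical cohesive-soil parameters:
print(shear_strength(sigma_kpa=50.0, cohesion_kpa=15.0, phi_deg=25.0))
```

For cohesive soils such as clays and silts, the cohesion term dominates at the shallow depths relevant to light one- or two-floor construction.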


2021 ◽  
Author(s):  
Olivier Coutant ◽  
Ludovic Moreau ◽  
Pierre Boué ◽  
Eric Larose ◽  
Arnaud Cimolino

Accurate monitoring of floating ice thickness is an important safety issue for northern countries, where lakes, fjords, and coasts are covered with ice in winter and used by people to travel. In Finland, for example, 15-20 fatal accidents occur every year due to ice-related drowning. We have explored the potential of fiber optics to measure the propagation of seismic waves guided in the ice layer, in order to infer its thickness via inversion of the dispersion curves. An optical fiber was deployed on a frozen lake at Lacs Roberts (2400 m) above Grenoble, and we measured with a DAS (distributed acoustic sensing) system the signal generated by active sources (hammer) and by ambient noise. We demonstrate that we can retrieve the ice thickness. This monitoring method could be of interest since deploying a fiber on ice is quite simple (e.g. using a drone) compared to other ice thickness estimation techniques such as seismic surveys or manual drilling.


1998 ◽  
Vol 12 (3) ◽  
pp. 283-302 ◽  
Author(s):  
James Allen Fill

The elementary problem of exhaustively sampling a finite population without replacement is used as a nonreversible test case for comparing two recently proposed MCMC algorithms for perfect sampling, one based on backward coupling and the other on strong stationary duality. The backward coupling algorithm runs faster in this case, but the duality-based algorithm remains unbiased under user impatience (it is interruptible). An interesting by-product of the analysis is a new and simple stochastic interpretation of a mixing-time result for the move-to-front rule.
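The backward coupling referred to here is coupling from the past (CFTP). A minimal sketch of the mechanism on a toy three-state reflecting random walk (not the sampling-without-replacement chain analyzed in the paper): fix the randomness, extend it further into the past until all start states coalesce, and the coalesced state is an exact stationary sample.

```python
import random

def update(x, u):
    """Deterministic update driven by a shared uniform u:
    reflecting random walk on {0, 1, 2}."""
    if u < 0.5:
        return max(x - 1, 0)
    return min(x + 1, 2)

def cftp(seed=0):
    rng = random.Random(seed)
    us = []   # randomness for times -t..-1; reused, never redrawn
    t = 1
    while True:
        # Extend further into the past: new draws are PREPENDED so
        # the already-drawn later steps are reused unchanged.
        us = [rng.random() for _ in range(t - len(us))] + us
        states = {0, 1, 2}
        for u in us:
            states = {update(x, u) for x in states}
        if len(states) == 1:      # all start states have coalesced
            return states.pop()   # exact sample from stationarity
        t *= 2

print(cftp())
```

Reusing (rather than redrawing) the randomness when looking further back is the step that makes the output exact rather than approximate; aborting a slow run, however, biases the sample, which is exactly the "user impatience" issue the duality-based algorithm avoids.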


Author(s):  
Saeema Ahmed ◽  
Sanghee Kim ◽  
Ken M. Wallace

This paper describes a methodology for developing ontologies for engineering design. The methodology combines a number of methods from social science and computer science, together with taxonomies developed in the field of engineering design. A case study is used throughout the paper, focusing upon the use of an ontology for searching, indexing and retrieving engineering knowledge. An ontology for indexing design knowledge can assist users in formulating their queries when searching for engineering design knowledge. The root concepts of the ontology were elicited from engineering designers during an empirical research study. These formed individual taxonomies within the ontology and were validated by indexing a set of ninety-two documents. Relationships between concepts are extracted as the ontology is populated with instances. The identified root concepts were found to be complete and sufficient for the purpose of indexing. A thesaurus and an automatic classification are being developed as a result of this evaluation. The methodology employed during the test case is presented in this paper. There are six separate stages, which are presented together with the research methods employed for each stage and the evaluation of each stage. The main contribution of this research is the development of a methodology to allow researchers and industry to create ontologies for their particular purpose and to develop a thesaurus for the terms within the ontology. The methodology is based upon empirical research and hence focuses upon understanding a user’s domain models as opposed to extracting an ontology from documentation.
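A minimal sketch of what indexing documents against root concepts can look like; the concept names and keywords below are invented placeholders, not the root concepts elicited in the study:

```python
# Hypothetical taxonomy: root concept -> keywords subsumed by it.
TAXONOMY = {
    "process": {"design", "testing", "review"},
    "product": {"gearbox", "bearing", "housing"},
}

def index_document(text):
    """Return the sorted root concepts whose keywords appear in the
    document text (naive bag-of-words match)."""
    words = set(text.lower().split())
    return sorted(c for c, kws in TAXONOMY.items() if words & kws)

print(index_document("Design review of the gearbox housing"))
# ['process', 'product']
```

Once documents carry such concept labels, a searcher can narrow a query by root concept first and keywords second, which is the query-formulation support the abstract describes.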

