Resolving Interoperability Issues of Precision and Array with Null Value of Web Services Using WSIG-JADE Framework

2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Jaspreet Chawla ◽  
Anil Kr Ahlawat ◽  
Jyoti Gautam

Web services and agent technology play a significant role in resolving platform-interoperability issues. The Web Services Interoperability Organization (WS-I) provided guidelines for removing interoperability issues through its Basic Profile 1.1/1.2. However, issues still arise when transferring precision values and arrays containing null values between platforms such as Java and .NET. In the precision issue, Java supports data precision up to the 6th digit after the decimal point and .NET up to the 5th; beyond these limits, the number is rounded off. In the array-with-null-value issue, Java treats null as a value, whereas .NET treats null as an empty string. To remove these issues, we use the WSIG-JADE framework, which helps build and demonstrate a multiagent system that performs the mapping and conversions between agents and web services. It limits the number of digits to the 5th place after the decimal point, thereby making precision consistent across data sets, and it treats null as an empty string so that string lengths remain the same on both platforms, which helps keep the count of data elements correct.
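The two normalizations the abstract describes can be sketched in a few lines. This is an illustrative sketch only; the function names and the 5-digit cutoff are assumptions drawn from the abstract, not the WSIG-JADE API:

```python
# Illustrative sketch (not the WSIG-JADE API): normalize values exchanged
# between platforms so both sides agree on precision and null handling.

def normalize_precision(value, places=5):
    """Round to 5 decimal places, the common limit assumed for both platforms."""
    return round(value, places)

def normalize_nulls(array):
    """Map None (a Java-style null) to "" so that string lengths and element
    counts match on a platform that treats null as an empty string."""
    return ["" if item is None else item for item in array]

print(normalize_precision(3.1415926))     # -> 3.14159
print(normalize_nulls(["a", None, "b"]))  # -> ['a', '', 'b']
```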

1984 ◽  
Vol 7 (1) ◽  
pp. 129-150
Author(s):  
Joachim Biskup

We study operations on generalized database relations which may contain maybe tuples and two types of null values. The existential null value has the meaning “value at present unknown,” whereas the universal null value has the meaning “value arbitrary.” For extending a usual relational operation to generalized relations we develop three requirements: adequacy, restrictedness, and feasibility. As demonstrated for the natural join as an example, we can essentially meet these requirements, although we are faced with a minor tradeoff between restrictedness and feasibility.
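The idea of a join producing maybe tuples can be illustrated with a toy matching rule. This is a simplified illustration (an existential null matches any value and yields a "maybe" result), not Biskup's full construction:

```python
# Toy illustration: joining tuples that may contain an existential null
# ("value at present unknown"). Not Biskup's full construction.
UNK = object()  # existential null marker

def maybe_join(t1, t2, attr):
    """Join two dict-tuples on `attr`. Returns ("sure", merged) on a
    definite match, ("maybe", merged) when a null makes the match merely
    possible, and None on a definite mismatch."""
    v1, v2 = t1[attr], t2[attr]
    if v1 is UNK or v2 is UNK:
        merged = {**t1, **t2, attr: v2 if v1 is UNK else v1}
        return ("maybe", merged)
    if v1 == v2:
        return ("sure", {**t1, **t2})
    return None

print(maybe_join({"A": 1, "B": 2}, {"A": 1, "C": 3}, "A"))    # sure tuple
print(maybe_join({"A": UNK, "B": 2}, {"A": 1, "C": 3}, "A"))  # maybe tuple
print(maybe_join({"A": 1}, {"A": 2}, "A"))                    # None
```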


Author(s):  
Dr. Manish L Jivtode

Web services are applications that allow communication between devices over the internet, independently of the underlying technology. The devices are built using standardized eXtensible Markup Language (XML) for information exchange. A client or user invokes a web service by sending an XML message and then receives an XML response message. There are a number of communication protocols for web services that use the XML format, such as Web Services Flow Language (WSFL) and Blocks Extensible Exchange Protocol (BEEP). Simple Object Access Protocol (SOAP) and Representational State Transfer (REST) are widely used options for accessing web services. The two are not directly comparable: SOAP is a communications protocol, while REST is a set of architectural principles for data transmission. In this paper, data sizes of 1 KB, 2 KB, 4 KB, 8 KB and 16 KB were tested for audio and video, and results were obtained for CRUD methods. The encryption and decryption timings, in milliseconds/seconds, were recorded by programming the extensibility points of a WCF REST web service in the Azure cloud.
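The kind of timing measurement described can be sketched locally. The XOR "cipher" below is only a stand-in for the real encryption hook wired into the WCF extensibility points, and the payload sizes follow the abstract:

```python
# Sketch: time an encrypt/decrypt round trip over payloads of increasing
# size. The XOR "cipher" is a stand-in for the real encryption logic.
import time

def xor_cipher(data: bytes, key: int = 0x5A) -> bytes:
    return bytes(b ^ key for b in data)

for size_kb in (1, 2, 4, 8, 16):
    payload = b"x" * (size_kb * 1024)
    start = time.perf_counter()
    decrypted = xor_cipher(xor_cipher(payload))  # encrypt, then decrypt
    elapsed_ms = (time.perf_counter() - start) * 1000
    assert decrypted == payload
    print(f"{size_kb} KB: {elapsed_ms:.3f} ms")
```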


2018 ◽  
Vol 1 (2) ◽  
pp. 270-280 ◽  
Author(s):  
John K. Kruschke

This article explains a decision rule that uses Bayesian posterior distributions as the basis for accepting or rejecting null values of parameters. This decision rule focuses on the range of plausible values indicated by the highest density interval of the posterior distribution and the relation between this range and a region of practical equivalence (ROPE) around the null value. The article also discusses considerations for setting the limits of a ROPE and emphasizes that analogous considerations apply to setting the decision thresholds for p values and Bayes factors.
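The decision rule can be sketched directly from posterior samples: accept the null value if the 95% highest density interval (HDI) lies entirely inside the ROPE, reject it if the HDI lies entirely outside, and withhold a decision otherwise. The ROPE limits below are illustrative placeholders:

```python
# HDI + ROPE decision rule, sketched for a posterior given as samples.

def hdi(samples, mass=0.95):
    """Narrowest interval containing `mass` of the samples."""
    s = sorted(samples)
    n = len(s)
    k = max(1, int(mass * n))
    widths = [(s[i + k - 1] - s[i], i) for i in range(n - k + 1)]
    _, i = min(widths)
    return s[i], s[i + k - 1]

def rope_decision(samples, rope=(-0.1, 0.1)):
    lo, hi = hdi(samples)
    if rope[0] <= lo and hi <= rope[1]:
        return "accept null"
    if hi < rope[0] or lo > rope[1]:
        return "reject null"
    return "undecided"
```

The third outcome matters: when the HDI overlaps the ROPE boundary, the data are simply not precise enough to decide either way.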


2021 ◽  
Author(s):  
Jan Michalek ◽  
Kuvvet Atakan ◽  
Christian Rønnevik ◽  
Helga Indrøy ◽  
Lars Ottemøller ◽  
...  

<p>The European Plate Observing System (EPOS) is a European project building a pan-European infrastructure for accessing solid Earth science data, now governed by EPOS ERIC (European Research Infrastructure Consortium). The EPOS-Norway project (EPOS-N; RCN-Infrastructure Programme - Project no. 245763) is a Norwegian project funded by the Research Council of Norway. The aim of the Norwegian EPOS e‑infrastructure is to integrate data from the seismological and geodetic networks, as well as data from the geological and geophysical data repositories. Among the six EPOS-N project partners, four institutions provide data: the University of Bergen (UIB), the Norwegian Mapping Authority (NMA), the Geological Survey of Norway (NGU) and NORSAR.</p><p>In this contribution, we present the EPOS-Norway Portal as an online, open-access, interactive tool allowing visual analysis of multidimensional data. It supports maps and 2D plots with linked visualizations. Currently, access is provided to more than 300 datasets (18 web services, 288 map layers and 14 static datasets) from four subdomains of Earth science in Norway, and new datasets are planned for integration in the future. The EPOS-N Portal can access remote datasets via web services such as FDSNWS for seismological data and OGC services (e.g. WMS) for geological and geophysical data. Standalone datasets are available through preloaded data files. Users can also add another WMS server or upload their own dataset for visualization and comparison with other datasets. The portal provides a unique way (the first of its kind in Norway) to explore various geoscientific datasets in one common interface. One of its key aspects is the quick, simultaneous visual inspection of data from various disciplines and the testing of scientific or geohazard-related hypotheses. One such example is the spatio-temporal correlation of earthquakes (1980 to the present) with existing critical infrastructure (e.g. pipelines), geological structures, submarine landslides or unstable slopes.</p><p>The EPOS-N Portal is implemented by adapting Enlighten-web, a server-client program developed by NORCE. Enlighten-web facilitates interactive visual analysis of large multidimensional data sets and supports interactive mapping of millions of points. The Enlighten-web client runs inside a web browser. An important element of the Enlighten-web functionality is brushing and linking, which is useful for exploring complex data sets to discover correlations and interesting properties hidden in the data. The views are linked to each other, so that highlighting a subset in one view automatically leads to the corresponding subsets being highlighted in all other linked views.</p>
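Adding a WMS layer, as the portal allows, boils down to issuing a standard OGC GetMap request. The sketch below builds such a request URL; the server URL and layer name are hypothetical, not EPOS-N endpoints:

```python
# Sketch: construct an OGC WMS 1.3.0 GetMap request URL, as a client like
# the EPOS-N Portal might. Endpoint and layer name are hypothetical.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(800, 600)):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint covering mainland Norway:
print(wms_getmap_url("https://example.org/wms", "bedrock_geology",
                     (57.0, 4.0, 72.0, 32.0)))
```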


2009 ◽  
pp. 897-918
Author(s):  
Peter Bertok ◽  
Xinjian Xu

In a rapidly changing world, continuous adoption of new practices is crucial for survival; organizations embracing the latest technologies have a competitive edge. Smart organizations readily take on board new organizational forms and practices, in particular those that offer agility and responsiveness. The Internet and the World Wide Web offer a new way of collaborating via Web services, but the heterogeneity of service components makes cooperation difficult. This chapter describes a new approach to combining Web services by employing a layered structure, in which the composition of a value-added service can be built from individual components, and each service component can have semantically equivalent but syntactically different alternatives.
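The layered idea of interchangeable components can be illustrated with a minimal sketch: a composite service picks, per step, any one of several syntactically different but semantically equivalent components. The component names and registry below are illustrative assumptions, not the chapter's framework:

```python
# Minimal sketch of component alternatives in a service composition.
# Two providers expose the same semantics behind different interfaces.

def fahrenheit_v1(c):          # one provider's interface
    return c * 9 / 5 + 32

def fahrenheit_v2(celsius):    # an equivalent alternative
    return 32 + celsius * 1.8

ALTERNATIVES = {"to_fahrenheit": [fahrenheit_v1, fahrenheit_v2]}

def invoke(step, *args):
    """Try each registered alternative until one succeeds."""
    for component in ALTERNATIVES[step]:
        try:
            return component(*args)
        except Exception:
            continue
    raise RuntimeError(f"no working component for {step}")

print(invoke("to_fahrenheit", 100))  # -> 212.0
```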


Author(s):  
Douglas Schenck ◽  
Peter Wilson

Expressions are combinations of operators and operands which are evaluated to produce a value of a specific type. Infix operators require two operands with an operator written between them. A prefix operator requires one operand with an operator written before it. (The expression syntax starts on page 208.) Evaluation proceeds from left to right, governed by the precedence of the operators. The lowest numbered precedence as shown in Table 14.1 is evaluated first. Operators in the same row have the same precedence. Expressions enclosed by parentheses are evaluated before being treated as a single operand. An operand between two operators of different precedence is bound to the operator with the higher one; e.g., −10*20 means (−10)*20. An operand between two operators of the same precedence is bound to the one on the left; e.g., 10/20 * 30 means (10/20) * 30.

Exercise 14.1 Work out the intermediate steps for this expression: … −2/(4+4)*5+6 …

When a null value is encountered in an expression where a non-null is expected, evaluation is short circuited and a null answer is produced. Otherwise, all expressions are fully evaluated even when the outcome is known after partial evaluation.

Exercise 14.2 Can you think of an expression that does not require complete evaluation to get the correct answer?

The operands of an operator must be compatible with the operator and with each other. Operands can be compatible without having identical types; they are compatible when any of these conditions is satisfied:
• The types are the same.
• One type is a subtype of the other (e.g., one is a number and the other is an integer).
• Both types are strings.
• Both types are binaries.
• Both types are arrays which have compatible base types and identical bounds.
• Both types are bags which have compatible base types.
• Both types are lists which have compatible base types.
• Both types are sets which have compatible base types.

Operations are organized by the kind of result they produce, namely: numeric, boolean or logical, string or binary, or aggregate.
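The null short-circuit rule and the left-to-right evaluation of Exercise 14.1 can be illustrated with a small sketch, with Python standing in for EXPRESS and `None` playing the role of the null (indeterminate) value:

```python
# Sketch of the rule that a null operand short-circuits evaluation to null.
# None stands in for the EXPRESS indeterminate (?) value.

def add(a, b):
    if a is None or b is None:   # non-null expected: short-circuit to null
        return None
    return a + b

def divide(a, b):
    if a is None or b is None:
        return None
    return a / b

# -2/(4+4)*5+6 from Exercise 14.1, step by step:
step1 = add(4, 4)           # (4+4)  -> 8
step2 = divide(-2, step1)   # -2/8   -> -0.25
step3 = step2 * 5           # *5     -> -1.25
step4 = add(step3, 6)       # +6     -> 4.75
print(step4)                # -> 4.75
print(add(None, 6))         # -> None (null short-circuits)
```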


Hereditas ◽  
2019 ◽  
Vol 156 (1) ◽  
Author(s):  
T. H. Noel Ellis ◽  
Julie M. I. Hofer ◽  
Martin T. Swain ◽  
Peter J. van Dijk

Abstract A controversy arose over Mendel’s pea crossing experiments after the statistician R.A. Fisher proposed how these may have been performed and criticised Mendel’s interpretation of his data. Here we re-examine Mendel’s experiments and investigate Fisher’s statistical criticisms of bias. We describe pea varieties available in Mendel’s time and show that these could readily provide all the material Mendel needed for his experiments; the characters he chose to follow were clearly described in catalogues at the time. The combination of character states available in these varieties, together with Eichling’s report of crosses Mendel performed, suggest that two of his F3 progeny test experiments may have involved the same F2 population, and therefore that these data should not be treated as independent variables in statistical analysis of Mendel’s data. A comprehensive re-examination of Mendel’s segregation ratios does not support previous suggestions that they differ remarkably from expectation. The χ2 values for his segregation ratios sum to a value close to the expectation and there is no deficiency of extreme segregation ratios. Overall the χ values for Mendel’s segregation ratios deviate slightly from the standard normal distribution; this is probably because of the variance associated with phenotypic rather than genotypic ratios and because Mendel excluded some data sets with small numbers of progeny, where he noted the ratios “deviate not insignificantly” from expectation.
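The kind of check involved is easy to make concrete. As an illustration, the χ² statistic for a single 3:1 segregation (using the widely quoted counts from Mendel's 1866 paper for round vs. wrinkled seeds, not figures from this article) can be computed directly:

```python
# Chi-squared test of a single 3:1 segregation ratio, using the widely
# quoted round vs. wrinkled seed counts from Mendel's 1866 paper.
observed = [5474, 1850]                 # round, wrinkled
total = sum(observed)
expected = [total * 3 / 4, total * 1 / 4]

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(f"chi-squared = {chi2:.3f}")      # small value, 1 degree of freedom
```

A very small χ² indicates agreement with the 3:1 expectation; Fisher's criticism concerned whether such values were, in aggregate, suspiciously small.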


2009 ◽  
Vol 72 (2) ◽  
pp. 260-266 ◽  
Author(s):  
JOHN R. RUBY ◽  
STEVEN C. INGHAM

Previous work using a large data set (no. 1, n = 5,355) of carcass sponge samples from three large-volume beef abattoirs highlighted the potential use of binary (present or absent) Enterobacteriaceae results for predicting the absence of Salmonella on carcasses. Specifically, the absence of Enterobacteriaceae was associated with the absence of Salmonella. We tested the accuracy of this predictive approach by using another large data set (no. 2, n = 2,163 carcasses sampled before or after interventions) from the same three data set no. 1 abattoirs over a later 7-month period. Similarly, the predictive approach was tested on smaller subsets from data set no. 2 (n = 1,087, and n = 405) and on a much smaller data set (no. 3, n = 100 postintervention carcasses) collected at a small-volume abattoir over 4 months. Of Enterobacteriaceae-negative data set no. 2 carcasses, >98% were Salmonella negative. Similarly accurate predictions were obtained in the two data subsets obtained from data set no. 2 and in data set no. 3. Of final postintervention carcass samples in data set nos. 2 and 3, 9 and 70%, respectively, were Enterobacteriaceae positive; mean Enterobacteriaceae values for the two data sets were −0.375, and 0.169 log CFU/100 cm2 (detection limit = −0.204, and Enterobacteriaceae negative assigned a value of −0.505 log CFU/100 cm2). Salmonella contamination rates for final postintervention beef carcasses in data set nos. 2 and 3 were 1.1 and 7.0%, respectively. Binary Enterobacteriaceae results may be useful in evaluating beef abattoir hygiene and intervention treatment efficacy.
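The predictive accuracy reported (">98% of Enterobacteriaceae-negative carcasses were Salmonella negative") is a negative predictive value, which is straightforward to compute from binary counts. The counts below are made up for illustration, not taken from the paper's data sets:

```python
# Negative predictive value of the binary Enterobacteriaceae screen:
# among Enterobacteriaceae-negative carcasses, the fraction that were
# also Salmonella-negative. Counts below are hypothetical.

def negative_predictive_value(eb_neg_salmonella_neg, eb_neg_total):
    return eb_neg_salmonella_neg / eb_neg_total

npv = negative_predictive_value(1950, 1975)   # hypothetical counts
print(f"NPV = {npv:.1%}")
```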


2005 ◽  
Vol 44 (02) ◽  
pp. 233-238 ◽  
Author(s):  
M. C. Barba ◽  
E. Blasi ◽  
M. Cafaro ◽  
S. Fiore ◽  
M. Mirto ◽  
...  

Summary Background: In health applications, and elsewhere, 3D data sets are increasingly accessed through the Internet. To reduce the transfer time while maintaining an unaltered 3D model, adequate compression and decompression techniques are needed. Recently, Grid technologies have been integrated with Web Services technologies to provide a framework for interoperable application-to-application interaction. Objectives: The paper describes an implementation of the Edgebreaker compression technique exploiting web services technology and presents a novel approach for using such services in a Grid portal. The Grid portal, developed at the CACT/ISUFI of the University of Lecce, allows the processing and delivery of biomedical images (CT – computerized tomography – and MRI – magnetic resonance images) in a distributed environment, using the power and security of computational Grids. Methods: The Edgebreaker Compression Web Service has been deployed on a Grid portal and allows compressing and decompressing 3D data sets using the Globus Toolkit GSI (Globus Security Infrastructure) protocol. Moreover, the classical algorithm has been modified by extending the compression to files containing more than one object. Results and Conclusions: An implementation of the Edgebreaker compression technique and related experimental results are presented. A novel approach for using the compression web service in a Grid portal, allowing the storage and preprocessing of huge 3D data sets and the subsequent efficient transmission of results for remote visualization, is also described.


2013 ◽  
Vol 5 (4) ◽  
Author(s):  
K. Azizian ◽  
P. Cardou

This paper presents a method for the dimensional synthesis of fully constrained spatial cable-driven parallel mechanisms (CDPMs), namely, the problem of finding a geometry whose wrench-closure workspace (WCW) contains a prescribed workspace. The proposed method is an extension to spatial CDPMs of a synthesis method previously published by the authors for planar CDPMs. The WCW of a CDPM is the set of poses for which any wrench can be produced at the end-effector by non-negative cable tensions. A sufficient condition is introduced in order to verify whether a given six-dimensional box, i.e., a box covering point-positions and orientations, lies fully inside the WCW of a given spatial CDPM. Then, a nonlinear program is formulated whose optima represent CDPMs that can reach any point in a set of boxes prescribed by the designer. The objective value of this nonlinear program indicates how well the WCW of the resulting CDPM covers the prescribed boxes, a null value indicating that none of the WCW is covered and a value greater than or equal to one indicating that the full prescribed workspace is covered.
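The wrench-closure condition has a simple low-dimensional analogue that conveys the idea. For a planar point-mass suspended by cables, any force is achievable with non-negative tensions exactly when the cable directions positively span the plane, i.e., when the largest angular gap between consecutive directions is less than π. This sketch is only that planar analogue, not the paper's six-dimensional box test:

```python
# Planar analogue of wrench closure: cable directions positively span the
# plane iff the largest angular gap between consecutive directions is < pi.
import math

def positively_spans_plane(directions):
    """directions: list of (x, y) cable-direction vectors at a pose.
    True iff any planar force is achievable with non-negative tensions."""
    angles = sorted(math.atan2(y, x) for x, y in directions)
    gaps = [b - a for a, b in zip(angles, angles[1:])]
    gaps.append(angles[0] + 2 * math.pi - angles[-1])  # wrap-around gap
    return max(gaps) < math.pi

# Three cables 120 degrees apart positively span the plane:
print(positively_spans_plane([(1, 0), (-0.5, 0.866), (-0.5, -0.866)]))  # True
# Two cables cannot (tensions only pull, never push):
print(positively_spans_plane([(1, 0), (0, 1)]))  # False
```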

