The Fundamental of TCP Techniques

2021 ◽  
Author(s):  
Pritee Nivrutti Hulule

Test case prioritization (TCP) strategies schedule test cases to reduce the cost of regression testing and to improve a specific objective function. Test cases are ordered so that the most important ones, under given criteria, are executed earlier in the regression testing process. Many strategies in the literature target different testing objectives and thereby reduce cost. In practice, however, testers tend to fall back on a few well-known prioritization strategies, mainly because there are no guidelines for selecting among TCP strategies. This part of the study therefore introduces a novel approach to TCP strategy selection based on fuzzy concepts, to support the effective choice of a prioritization strategy. The work extends existing selection schemes for test case prioritization.
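As an illustration of the kind of strategy such a selection scheme chooses among (not the approach proposed in this study), a minimal Python sketch of greedy "additional coverage" prioritization follows; the test names and coverage sets are assumptions made for the example.

```python
# Illustrative sketch only: greedy "additional coverage" prioritization,
# one classic TCP strategy a selection scheme might choose among.
# Test names and coverage sets are assumed for the example.

def prioritize_by_additional_coverage(coverage):
    """Order test cases so each next test covers the most not-yet-covered items."""
    remaining = dict(coverage)            # test id -> set of covered statements
    covered, order = set(), []
    while remaining:
        # Pick the test adding the most new coverage; ties broken by name.
        best = max(remaining, key=lambda t: (len(remaining[t] - covered), t))
        order.append(best)
        covered |= remaining.pop(best)
    return order

if __name__ == "__main__":
    coverage = {                          # hypothetical coverage data
        "t1": {1, 2, 3},
        "t2": {3, 4},
        "t3": {4, 5, 6, 7},
        "t4": {1, 7},
    }
    # t3 runs first because it adds the most new coverage.
    print(prioritize_by_additional_coverage(coverage))
```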

Sensors ◽  
2021 ◽  
Vol 21 (23) ◽  
pp. 8010
Author(s):  
Ismail Butun ◽  
Yusuf Tuncel ◽  
Kasim Oztoprak

This paper investigates and proposes a solution for the Protocol Independent Switch Architecture (PISA) to process application-layer data, enabling the inspection of application content. PISA is a novel approach in networking in which the switch does not run embedded binary code but rather interpreted code written in a domain-specific language. The main motivation behind this approach is that telecommunication operators do not want to be locked into a vendor for any type of networking equipment, and want to develop their own networking code in a hardware environment that is not governed by a single equipment manufacturer. This approach also eases the modeling of equipment in a simulation environment, since all components of a hardware switch run the same compatible code in a software-modeled switch. The novel techniques in this paper exploit the main functions of a programmable switch and combine them with a streaming data processor to achieve the effect desired by a telecommunication operator: lowering costs and governing the network in a comprehensive manner. The results indicate that the proposed solution using PISA switches enables application visibility with outstanding performance. This ability helps operators close a fundamental gap between flexibility and scalability by making the best use of limited compute resources for application identification and response. The experimental study indicates that, without any optimization, the proposed solution increases the performance of application identification systems by a factor of 5.5 to 47.0. This study suggests that DPI, NGFW (Next-Generation Firewall), and similar application-layer systems, which have quite high costs per unit of traffic volume and could not scale to the Tbps level, can be combined with PISA to overcome these cost and scalability issues.
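The paper's data-plane logic is written in a switch DSL and is not reproduced here; as a rough functional illustration of the application-identification step that such systems accelerate, the following Python sketch classifies a flow from its first payload bytes. The signature rules and labels are assumptions for the example, standing in for the match-action lookups a PISA pipeline would perform.

```python
# Rough functional illustration only (not the paper's P4/data-plane code):
# classify a flow's application from the first payload bytes using simple
# byte signatures, the kind of lookup a programmable pipeline can implement
# as match-action tables. Signatures and labels below are assumptions.

SIGNATURES = [
    (b"GET ", "http"),
    (b"POST ", "http"),
    (b"\x16\x03", "tls"),          # TLS handshake record header
    (b"SSH-", "ssh"),
]

def identify_application(payload: bytes) -> str:
    """Return an application label for the first payload bytes of a flow."""
    for prefix, label in SIGNATURES:
        if payload.startswith(prefix):
            return label
    return "unknown"

if __name__ == "__main__":
    print(identify_application(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n"))  # http
    print(identify_application(b"\x16\x03\x01\x00\xc4\x01\x00\x00"))                   # tls
```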


1994 ◽  
Vol 49 (9-10) ◽  
pp. 553-560 ◽  
Author(s):  
Wolf-Rainer Abraham

Abstract Biotransformation of isopinocampheol with 100 bacterial and fungal strains yielded 1-, 2-, 4-, 5-, 7-, 8- and 9-hydroxy-isopinocampheol besides three rearranged monoterpenes, one of them bearing the novel isocarane skeleton. A pronounced enantioselectivity between (+)- and (-)-isopinocampheol was observed. The phylogenetic position of the individual strains was reflected in their ability to form the products from (+)-isopinocampheol. The formation of 1,3-dihydroxypinane is a domain of bacteria, while 3,5- or 3,7-dihydroxypinane was mainly formed by fungi, especially those of the phylum Zygomycotina. The activity of Basidiomycotina towards oxidation of isopinocampheol was rather low. Such information can be used for a more effective selection of strains for screening.


Author(s):  
Gül Gökay Emel ◽  
Gülcan Petriçli

In the late 1980s, the proportion of outsourced materials in the cost of high-tech products was around 80%. In this respect, with increasing globalization and ever-expanding supply chains, interdependencies between organizations have increased and the selection of suppliers has become more important than ever. This exploratory research study intends to develop a novel approach for a specific type of supplier selection problem, namely supplier pre-evaluation. A two-stage, multi-layer feed-forward neural network (NN) algorithm for pattern recognition was used to pre-evaluate suppliers under strategy-based organizational and technical criteria. Data for training, validating and testing the network were collected from a global Tier-1 manufacturing company in the automotive industry. The results show that the proposed approach is able to classify candidate suppliers into three separate groups: risky, potential, or preferred. With this classification, it becomes feasible to eliminate risky suppliers before doing business with them.
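The abstract does not fully specify the network architecture or criteria, so the following is only a minimal sketch of the three-class pre-evaluation idea using scikit-learn's MLPClassifier; the features, dataset, and layer sizes are assumptions.

```python
# Minimal sketch only: a feed-forward NN that classifies suppliers into
# "risky", "potential" or "preferred", in the spirit of the approach above.
# Features, dataset sizes, layer sizes and labels are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical strategy-based organizational and technical criteria,
# e.g. quality score, delivery performance, financial stability, ...
X = rng.random((300, 6))
y = rng.integers(0, 3, size=300)        # 0 = risky, 1 = potential, 2 = preferred

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

labels = {0: "risky", 1: "potential", 2: "preferred"}
candidate = rng.random((1, 6))          # a new candidate supplier's criteria
print(labels[int(clf.predict(candidate)[0])])
```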


2019 ◽  
Vol 22 (64) ◽  
pp. 47-62
Author(s):  
Mariela Morveli Espinoza ◽  
Juan Carlos Nieves ◽  
Ayslan Possebom ◽  
Cesar Augusto Tacla

By considering rational agents, we focus on the problem of selecting goals out of a set of incompatible ones. We consider three forms of incompatibility introduced by Castelfranchi and Paglieri, namely the terminal, the instrumental (or resource-based), and the superfluity. We represent the agent's plans by means of structured arguments whose premises are pervaded with uncertainty. We measure the strength of these arguments in order to determine the set of compatible goals. We propose two novel ways of calculating the strength of these arguments, depending on the kind of incompatibility that exists between them. The first is the logical strength value, denoted by a three-dimensional vector calculated from a probabilistic interval associated with each argument. The vector represents the precision of the interval, its location, and the combination of precision and location. This type of representation and treatment of the strength of a structured argument has not previously been defined in the state of the art. The second way of calculating the strength of an argument is based on the cost of the plans (in terms of the necessary resources) and the preference of the goals associated with the plans. Considering our novel approach for measuring the strength of structured arguments, we propose a semantics for the selection of plans and goals that is based on Dung's abstract argumentation theory. Finally, we make a theoretical evaluation of our proposal.
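The exact formulas are not given in the abstract; the sketch below shows one plausible reading of the three-dimensional logical strength value, where precision measures how narrow the probabilistic interval is, location measures where it sits, and the third component combines the two. All three definitions are assumptions made for illustration, not the paper's.

```python
# Hedged illustration only: one plausible way to turn a probabilistic
# interval [low, up] attached to an argument into a three-dimensional
# strength vector (precision, location, combination). The exact
# definitions used in the paper may differ; these are assumptions.

def strength_vector(low: float, up: float) -> tuple[float, float, float]:
    assert 0.0 <= low <= up <= 1.0
    precision = 1.0 - (up - low)        # narrower interval -> higher precision
    location = (low + up) / 2.0         # where the interval sits in [0, 1]
    combined = precision * location     # simple combination of both aspects
    return (precision, location, combined)

if __name__ == "__main__":
    # A tight, high interval beats a wide, low one on every component.
    print(strength_vector(0.7, 0.8))    # approx. (0.9, 0.75, 0.675)
    print(strength_vector(0.1, 0.6))    # (0.5, 0.35, 0.175)
```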


2015 ◽  
Vol 33 (1) ◽  
pp. 243-250
Author(s):  
Katarzyna Pietrucha-Urbanik

Abstract The aim of this paper is to present a novel approach to risk assessment combined with failure and consequence analysis, based on two parameters defined by fuzzy functions: the repair time of a water pipe and the cost of its repair, which together allow particular risk levels to be determined. The presented methodology can be used to describe the functioning of a public water supply in terms of its renewal.
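As a hedged illustration of how two fuzzy parameters can be combined into risk levels (the paper's actual membership functions and rule base are not given in the abstract), a short Python sketch with assumed breakpoints and rules follows.

```python
# Illustrative sketch only (not the paper's membership functions or rule
# base): fuzzify repair time and repair cost with triangular memberships
# and combine them with simple max-min rules into a risk level.
# All breakpoints, units and rules below are assumptions.

def tri(x, a, b, c):
    """Triangular membership function with peak at b over support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def assess_risk(repair_time_h, repair_cost):
    time_long = tri(repair_time_h, 4, 12, 24)        # hours
    time_short = tri(repair_time_h, 0, 2, 8)
    cost_high = tri(repair_cost, 2000, 8000, 15000)  # assumed currency units
    cost_low = tri(repair_cost, 0, 1000, 4000)

    # Rules: long repairs AND high costs -> high risk; short AND cheap -> low.
    high = min(time_long, cost_high)
    low = min(time_short, cost_low)
    medium = max(0.0, 1.0 - high - low)
    return max((high, "high"), (medium, "medium"), (low, "low"))

if __name__ == "__main__":
    print(assess_risk(repair_time_h=10, repair_cost=9000))  # leans towards "high"
```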


2010 ◽  
Vol 439-440 ◽  
pp. 183-188 ◽  
Author(s):  
Wei Fang ◽  
Zhi Ming Cui

The main problems in Web page classification are the lack of labeled data and the cost of labeling unlabeled data. In this paper we discuss the application of the semi-supervised machine learning method co-training to the classification of Deep Web query interfaces in order to boost the performance of a classifier. Bayes and Maximum Entropy algorithms are combined to incrementally incorporate unlabeled data alongside labeled data in the training process. Our experimental results show that the novel approach has promising performance.
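As a generic sketch of the co-training idea described above (not the paper's implementation), the following Python example lets a Naive Bayes classifier and a logistic regression model (a maximum-entropy classifier) label unlabeled examples for each other from two feature views; the synthetic data, view split, number of rounds, and confidence cut-off are all assumptions.

```python
# Sketch of a generic co-training loop, not the paper's system: two
# classifiers trained on two feature views repeatedly label the unlabeled
# examples they are most confident about and add them to the shared
# labeled set. Data, views, rounds and batch size are assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic two-view data: each view is informative about the binary label.
n = 400
y_all = rng.integers(0, 2, size=n)
view1 = rng.normal(y_all[:, None], 1.0, size=(n, 5))
view2 = rng.normal(y_all[:, None], 1.0, size=(n, 5))

labeled = np.arange(20)                       # small labeled seed set
unlabeled = set(range(20, n))
y_known = dict(zip(labeled, y_all[labeled]))

clf1, clf2 = GaussianNB(), LogisticRegression(max_iter=1000)

for _ in range(10):                           # co-training rounds
    idx = np.array(sorted(y_known))
    y_lab = np.array([y_known[i] for i in idx])
    clf1.fit(view1[idx], y_lab)
    clf2.fit(view2[idx], y_lab)
    if not unlabeled:
        break
    pool = np.array(sorted(unlabeled))
    # Each classifier labels the examples it is most confident about.
    for clf, view in ((clf1, view1), (clf2, view2)):
        proba = clf.predict_proba(view[pool]).max(axis=1)
        for i in pool[np.argsort(-proba)[:5]]:
            y_known.setdefault(int(i), int(clf.predict(view[[i]])[0]))
            unlabeled.discard(int(i))

acc = np.mean([y_known[i] == y_all[i] for i in y_known])
print(f"labels assigned: {len(y_known)}, agreement with truth: {acc:.2f}")
```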


Author(s):  
JOÃO W. CANGUSSU ◽  
KENDRA COOPER ◽  
W. ERIC WONG

Component-based software development techniques are being adopted to rapidly deploy complex, high-quality systems. One aspect of this approach is the selection of components that realize the specified requirements. In addition to the functional requirements, the selection must take into account non-functional requirements such as performance, reliability, and usability. Hence, data that characterize the non-functional behavior of the components are needed; a test set is needed to collect these data for each component under consideration. This set may be large, which results in a considerable increase in the cost of the development process. Here, a process is proposed to considerably reduce the number of test cases used in the performance evaluation of components. The process is based on sequential curve fittings from an incremental number of test cases until a minimal pre-specified residual error is achieved. The incremental selection of test cases is done in two different ways: randomly and adaptively. The accuracy and performance of the proposed approach depend on the desired residual error: the smaller the residual error, the higher the accuracy. Performance behaves in the opposite way: the smaller the error, the larger the number of test cases needed. The results from experiments with image compression components clearly indicate that a reduction in the number of test cases can be achieved while maintaining reasonable accuracy when using the proposed approach.
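Only the random variant of the incremental selection is sketched below, and the response model, curve family, and residual threshold are assumptions; the intent is just to illustrate the stop-when-residual-is-small idea, not the authors' exact procedure.

```python
# Sketch of the general idea (not the authors' exact procedure): add test
# cases one at a time, refit a curve to the measured performance data, and
# stop once the fit's residual error falls below a pre-specified threshold.
# The true response, the curve family (degree-2 polynomial) and the
# threshold are assumptions made for the example.
import numpy as np

rng = np.random.default_rng(0)

def measure_performance(x):
    """Stand-in for executing a test case, e.g. compression time vs. image size."""
    return 0.5 * x**2 + 2.0 * x + rng.normal(0.0, 0.5)

def incremental_fit(candidate_inputs, max_residual=0.6, min_points=4):
    order = rng.permutation(candidate_inputs)          # random incremental selection
    xs, ys, coeffs = [], [], None
    for x in order:
        xs.append(x)
        ys.append(measure_performance(x))
        if len(xs) < min_points:
            continue
        coeffs = np.polyfit(xs, ys, deg=2)             # fit the data gathered so far
        rmse = np.sqrt(np.mean((np.polyval(coeffs, xs) - ys) ** 2))
        if rmse <= max_residual:                       # good enough: stop adding tests
            break
    return coeffs, len(xs)

coeffs, used = incremental_fit(np.linspace(1, 10, 50))
print(f"used {used} of 50 candidate test cases; fitted coefficients: {coeffs}")
```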


Author(s):  
Antonio de Falco ◽  
Zoltan Dezso ◽  
Francesco Ceccarelli ◽  
Luigi Cerulo ◽  
Angelo Ciaramella ◽  
...  

Abstract Motivation The cost of drug development has dramatically increased in the last decades, with the number of new drugs approved per billion US dollars spent on R&D halving every nine years or less. The selection and prioritization of targets is one of the most influential decisions in drug discovery. Here we present a Gaussian Process model for the prioritization of drug targets, cast as a problem of learning with only positive and unlabeled examples. Results Since the absence of negative samples does not allow standard methods for automatic selection of hyperparameters, we propose a novel approach for selecting the kernel hyperparameters in One-Class Gaussian Processes. We compare our method with state-of-the-art approaches on benchmark datasets and then show its application to druggability prediction of oncology drugs. Our score reaches an AUC of 0.90 on a set of clinical trial targets starting from a small training set of 102 validated oncology targets. Our score recovers the majority of known drug targets and can be used to identify a novel set of proteins as drug target candidates. Availability Source code implemented in Python is freely available for download at https://github.com/AntonioDeFalco/Adaptive-OCGP. Supplementary information Supplementary data are available at Bioinformatics online.
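The released implementation is at the repository linked above; as a rough standalone sketch of the core One-Class Gaussian Process scoring idea, the example below uses the GP posterior mean over positive-only training points as a membership score with a fixed RBF kernel. The length scale and noise level are assumed values, whereas choosing such hyperparameters without negatives is exactly what the paper's method addresses.

```python
# Rough sketch of One-Class Gaussian Process scoring (not the code from the
# linked repository): train on positive examples only (targets = 1) and use
# the GP posterior mean at a query point as its membership score. The kernel
# length scale and noise level are fixed assumptions here.
import numpy as np

def rbf(A, B, length_scale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def ocgp_scores(X_pos, X_query, length_scale=1.0, noise=1e-3):
    K = rbf(X_pos, X_pos, length_scale) + noise * np.eye(len(X_pos))
    k_star = rbf(X_pos, X_query, length_scale)
    alpha = np.linalg.solve(K, np.ones(len(X_pos)))   # targets are all 1
    return k_star.T @ alpha                           # posterior mean at queries

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_pos = rng.normal(0.0, 1.0, size=(50, 4))        # known (positive) targets
    near = rng.normal(0.0, 1.0, size=(3, 4))          # queries near the positives
    far = rng.normal(6.0, 1.0, size=(3, 4))           # queries far away
    print(ocgp_scores(X_pos, near))                   # noticeably higher scores
    print(ocgp_scores(X_pos, far))                    # scores near 0
```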


Author(s):  
Bin Zhang ◽  
Tong Wang ◽  
Chuan-gang Gu ◽  
Zheng-yuan Dai

In large eddy simulation (LES), the filtering grid scale (FGS) of the LES equations is generally calculated from the local mesh size. Proper LES meshing is therefore decisive for obtaining better results at a more economical cost. An effort was made to provide a practical approach to LES meshing based on turbulence theory and CFD methods. An expression for the proper filtering grid scale (PFGS) was proposed on the basis of the −5/3 law of the inertial sub-range. A new parameter named the grid ratio coefficient was put forward for the mesh adjustment, so that a proper LES mesh could be built directly by adjusting the RANS mesh. Two test cases, backward-facing step flow and turbulent channel flow, were used to verify the approach. Three kinds of mesh were employed: a coarse mesh for RANS (RCM), a mesh adjusted for LES with the novel approach (NAM), and a fine mesh for LES (LFM). The grid count of NAM was evidently lower than that of LFM, and the NAM results were in good agreement with DNS and experimental data. The NAM results were also very close to those of LFM. These conclusions provide positive evidence for the application of the approach.
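The paper's PFGS expression is not reproduced in the abstract, so the sketch below substitutes a common textbook rule of thumb (a fraction of the integral length scale k^1.5/ε estimated from a precursor RANS solution) and defines the grid ratio coefficient, as an assumption, as the ratio of the current RANS cell size to that target LES cell size; it illustrates the adjustment workflow only, not the paper's formula.

```python
# Heavily hedged sketch: a rule-of-thumb target LES cell size placed inside
# the inertial sub-range (a fraction of the integral length scale from RANS
# k and epsilon), and a "grid ratio coefficient" defined here, as an
# assumption, as RANS cell size divided by that target. Cells with a ratio
# above 1 would be refined when adjusting the RANS mesh for LES.
import numpy as np

def target_les_cell_size(k, eps, fraction=1.0 / 6.0):
    """Rule-of-thumb target filter/grid scale from RANS k and epsilon."""
    integral_length = k**1.5 / eps
    return fraction * integral_length

def grid_ratio_coefficient(rans_cell_size, k, eps):
    """> 1 means the RANS cell is too coarse for LES and should be refined."""
    return rans_cell_size / target_les_cell_size(k, eps)

if __name__ == "__main__":
    # Hypothetical per-cell RANS data: cell size [m], k [m^2/s^2], eps [m^2/s^3]
    cell = np.array([0.02, 0.01, 0.005])
    k = np.array([0.5, 0.3, 0.1])
    eps = np.array([20.0, 15.0, 10.0])
    print(grid_ratio_coefficient(cell, k, eps))   # refine cells where ratio > 1
```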


2013 ◽  
Vol 483 ◽  
pp. 647-651
Author(s):  
Guang Wei Chen

A mechanical lifting garage is designed to park a variety of vehicles automatically and store them efficiently. Because the traditional design needs one drive for each plate, it wastes energy and increases cost. This paper introduces the course of development of the lifting garage, its current status, and the features of various types of three-dimensional garages. It then takes a two-floor lifting garage with five parking spaces as an example, describing the working principle, the mechanical analysis, the main connection points of the framework, and the driveline design. The design innovatively uses "one drive for multiple plates", a rational arrangement that achieves both the lifting and translational (pan) motions of the garage. In addition, to ensure the safety of vehicles and personnel, the design also incorporates gantry fall protection and other safety devices.

