CoVeriTest with Adaptive Time Scheduling (Competition Contribution)

Author(s):  
Marie-Christine Jakobs ◽  
Cedric Richter

Abstract: CoVeriTest, which is integrated in the analysis framework CPAchecker, adopts verification technology for test-case generation. It encodes individual test goals as reachability queries, which are then processed by verifiers. To increase its effectiveness on a broad class of testing tasks, CoVeriTest leverages the strengths of two different analyses: an explicit value analysis and predicate abstraction. As in Test-Comp '20, the two analyses are interleaved and the time duration of an interleaving segment is calculated dynamically. However, the calculation of the time duration focuses on the predicted future performance rather than the past performance, thus rewarding analyses that are likely to cover open test goals.
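To illustrate the scheduling idea, the sketch below splits a segment's time budget between the two analyses in proportion to a prediction of how many open test goals each is still likely to cover. All names and the prediction rule are illustrative assumptions, not CoVeriTest's actual implementation:

    # Hypothetical sketch of adaptive time scheduling between two analyses.
    from dataclasses import dataclass

    @dataclass
    class Analysis:
        name: str
        goals_covered_last_segment: int   # recent progress, used as a proxy
        open_goals_reachable: int         # open goals this analysis might still cover

        def predicted_score(self) -> float:
            # Illustrative predictor: weight the remaining reachable goals by
            # recent progress, so an analysis likely to cover open goals
            # receives a larger share of the next segment.
            return (1 + self.goals_covered_last_segment) * self.open_goals_reachable

    def allocate_segment(analyses, segment_budget_s: float) -> dict:
        # Split the segment budget proportionally to the predicted scores.
        scores = {a.name: a.predicted_score() for a in analyses}
        total = sum(scores.values()) or 1.0
        return {n: segment_budget_s * s / total for n, s in scores.items()}

    value = Analysis("value-analysis", goals_covered_last_segment=8, open_goals_reachable=40)
    pred = Analysis("predicate-abstraction", goals_covered_last_segment=2, open_goals_reachable=10)
    print(allocate_segment([value, pred], segment_budget_s=100.0))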

2013 ◽  
Vol 217 (3) ◽  
pp. S144
Author(s):  
Meera Gupta ◽  
Malorie Snider ◽  
Alexander Wood ◽  
Peter L. Abt ◽  
Matthew H. Levine

Water ◽  
2018 ◽  
Vol 10 (11) ◽  
pp. 1519 ◽  
Author(s):  
Paul Muñoz ◽  
Johanna Orellana-Alvear ◽  
Patrick Willems ◽  
Rolando Célleri

Flash-flood forecasting has emerged worldwide as a research priority due to the catastrophic socio-economic impacts this hazard might cause and the expected increase of its frequency in the future. In mountain catchments, precipitation-runoff forecasts are limited by the intrinsic complexity of the processes involved, particularly the high rainfall variability. While process-based models are hard to implement under these conditions, the random forest algorithm is a promising alternative due to its simplicity, robustness and capacity to deal with complex data structures. Here, a step-wise methodology is proposed to derive parsimonious models accounting for both the hydrological functioning of the catchment (e.g., input data, representation of antecedent moisture conditions) and random forest procedures (e.g., sensitivity analyses, dimension reduction, optimal input composition). The methodology was applied to develop short-term prediction models with lead times of varying duration (4, 8, 12, 18 and 24 h) for a catchment representative of the Ecuadorian Andes. Results show that the derived parsimonious models reach validation efficiencies (Nash-Sutcliffe coefficient) from 0.761 (4 h) to 0.384 (24 h) for optimal inputs composed only of features accounting for 80% of the model's outcome variance. An extreme-value analysis demonstrated that including precipitation information improves the prediction of extreme peak flows compared with purely autoregressive models.
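As an illustration of the dimension-reduction step, the following sketch trains a random forest on synthetic lagged features and keeps only the features whose cumulative importance reaches the 80% threshold mentioned above. The data, lag count, and forest settings are placeholders, not the study's configuration:

    # Sketch: feature reduction by cumulative random-forest importance.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n, lags = 500, 12
    X = rng.random((n, lags))                    # e.g., lagged rainfall/discharge features
    y = X[:, :3].sum(axis=1) + 0.1 * rng.standard_normal(n)  # synthetic runoff target

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # Rank features by importance and keep those covering 80% of the total.
    order = np.argsort(rf.feature_importances_)[::-1]
    cum = np.cumsum(rf.feature_importances_[order])
    keep = order[: int(np.searchsorted(cum, 0.80)) + 1]
    print("selected feature indices:", keep)

    # Refit a parsimonious model on the reduced input composition.
    rf_parsimonious = RandomForestRegressor(n_estimators=200, random_state=0)
    rf_parsimonious.fit(X[:, keep], y)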


1994 ◽  
Vol 79 (1) ◽  
pp. 24-26 ◽  
Author(s):  
Edward R. Hirt ◽  
Kenneth R. Ryalls

We suggest that the positivity bias found by Wann and Dolan in highly allegiant sports fans might be better explained by considering the mediating role of self-esteem in both the prediction of a team's future performance and the recall of its past performance.


2017 ◽  
Vol 52 (6) ◽  
pp. 2755-2777 ◽  
Author(s):  
Howard Jones ◽  
Jose Vicente Martinez

Using survey data, we analyze institutional investors’ expectations about the future performance of fund managers and the impact of those expectations on asset allocation decisions. We find that institutional investors allocate funds mainly on the basis of fund managers’ past performance and of investment consultants’ recommendations, but not because they extrapolate their expectations from these. This suggests that institutional investors base their investment decisions on the most defensible variables at their disposal and supports the existence of agency considerations in their decision making.


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Eszter Fehér ◽  
Gábor Domokos ◽  
Bernd Krauskopf

We are concerned with the evolution of planar, star-like curves and associated shapes under a broad class of curvature-driven geometric flows, which we refer to as the Andrews-Bloore flow. This family of flows has two parameters that control one constant and one curvature-dependent component for the velocity in the direction of the normal to the curve. The Andrews-Bloore flow includes as special cases the well-known Eikonal, curve-shortening and affine shortening flows, and for positive parameter values its evolution shrinks the area enclosed by the curve to zero in finite time. A question of key interest has been how various shape descriptors of the evolving shape behave as this limit is approached. Star-like curves (which include convex curves) can be represented by a periodic scalar polar distance function $r(\varphi)$ measured from a reference point, which may or may not be fixed. An important question is how the numbers and the trajectories of critical points of the distance function $r(\varphi)$ and of the curvature $\kappa(\varphi)$ (characterized by $dr/d\varphi = 0$ and $d\kappa/d\varphi = 0$, respectively) evolve under the Andrews-Bloore flows for different choices of the parameters.

We present a numerical method that is specifically designed to meet the challenge of computing accurate trajectories of the critical points of an evolving curve up to the vicinity of a limiting shape. Each curve is represented by a piecewise polynomial periodic radial distance function, as determined by a chosen mesh; different types of meshes and mesh adaptation can be chosen to ensure a good balance between accuracy and computational cost. As we demonstrate with test-case examples and two longer case studies, our method allows one to perform numerical investigations into subtle questions of planar curve evolution. More specifically, in the spirit of experimental mathematics, we provide illustrations of some known results, numerical evidence for two stated conjectures, as well as new insights and observations regarding the limits of shapes and their critical points.
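As a minimal numerical illustration of the underlying task, the sketch below samples a periodic radial function r(phi) on a uniform grid and locates its critical points (dr/dphi = 0) as sign changes of a finite-difference derivative. The test curve is an assumption for demonstration; the paper's method instead uses piecewise polynomial representations with adaptive meshes:

    # Sketch: locate critical points of a periodic radial distance function.
    import numpy as np

    m = 2000
    phi = np.linspace(0.0, 2.0 * np.pi, m, endpoint=False)
    r = 1.0 + 0.3 * np.cos(3 * phi)          # example star-like curve with three "petals"

    # Periodic central difference for dr/dphi.
    dphi = 2.0 * np.pi / m
    dr = (np.roll(r, -1) - np.roll(r, 1)) / (2.0 * dphi)

    # Critical points: grid intervals where the derivative changes sign.
    sign_change = np.where(np.sign(dr) != np.sign(np.roll(dr, -1)))[0]
    print("critical angles of r(phi):", phi[sign_change])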


2021 ◽  
Author(s):  
Torben Eggers ◽  
Jens Friedrichs ◽  
Jan Goessling ◽  
Joerg R. Seume ◽  
Nunzio Natale ◽  
...  

Abstract: In the CA3ViAR (Composite fan Aerodynamic, Aeroelastic, and Aeroacoustic Validation Rig) project, a composite low-transonic fan is designed and tested. The aim is a scaled ultra-high bypass ratio (UHBR) fan with state-of-the-art aerodynamic performance and composite rotor blades, which exhibits aeroelastic phenomena, e.g., forced response due to inlet distortions and flutter, at certain operating points within the wind tunnel. In this paper, the aerodynamic and aeroelastic design process, starting from the overall performance specifications and leading to a three-dimensional numerical model, is described. Targets for the eigenfrequency and the twist-to-plunge ratio are specified such that flutter occurs at the desired operating conditions with a sufficient margin with respect to the working line. Different materials and layups of the composite blade are analyzed to reach the structural target. The fan should serve as an open test case to advance future research on aerodynamic, aeroelastic, and aeroacoustic performance over a wide range of operating conditions. A preliminary fan stage design is presented in this paper.


2020 ◽  
Vol 3 (2) ◽  
pp. 64
Author(s):  
Fahmi Ahmad Fauzi ◽  
Gunawan Eka Putra ◽  
Supriyanto Supriyanto ◽  
Nurul Asia Saputra ◽  
Teti Desyani

Testing is the process of executing a program with the aim of finding errors. A test case is considered good if it has a high probability of revealing an as-yet-undiscovered error, and a test is considered successful if it finds an error that had not been found before. This research applies the Black Box Testing method. Black Box Testing comprises several techniques, including Boundary Value Analysis, Comparison Testing, Sample Testing, Robustness Testing, and Equivalence Partitioning. This study uses Equivalence Partitioning, which divides the inputs of the case under test into equivalence classes so that errors can still be found while reducing the number of test cases that must be created. The tests were performed on the functions of a Parking Management Application designed to facilitate parking management. The test results indicate that the application meets its requirements.
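As an illustration of Equivalence Partitioning, the sketch below assumes a hypothetical parking-fee function (the tested Parking Management Application is not public) and tests one representative input per equivalence class instead of every possible input:

    # Sketch: equivalence partitioning for a hypothetical parking-fee rule.
    import unittest

    def parking_fee(hours: int) -> int:
        """Hypothetical rule: flat 5 for the first hour, 2 per extra hour."""
        if not isinstance(hours, int) or hours <= 0 or hours > 24:
            raise ValueError("hours must be an integer in 1..24")
        return 5 + 2 * (hours - 1)

    class TestParkingFeePartitions(unittest.TestCase):
        def test_valid_partition(self):          # class: 1 <= hours <= 24
            self.assertEqual(parking_fee(3), 9)  # one representative is enough

        def test_invalid_low_partition(self):    # class: hours <= 0
            with self.assertRaises(ValueError):
                parking_fee(0)

        def test_invalid_high_partition(self):   # class: hours > 24
            with self.assertRaises(ValueError):
                parking_fee(25)

    if __name__ == "__main__":
        unittest.main()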


2013 ◽  
Vol 1 (1) ◽  
pp. 171-181
Author(s):  
Ragupathi M ◽  
Arthi B

Technical analysts do not attempt to measure a security's intrinsic value; instead, they look at stock charts for patterns and indicators that will determine a stock's future performance. The use of past performance should come as no surprise: people using fundamental analysis have always looked at the past performance of companies, comparing fiscal data from previous quarters and years to determine future growth. The difference lies in the technical analyst's belief that securities move according to very predictable trends and patterns. These trends continue until something happens to change them, and until this change occurs, price levels are predictable. There are many instances of investors successfully trading a security using only their knowledge of the security's chart, without even understanding what the company does. However, although technical analysis is a powerful tool, most agree it is much more effective when used in combination with fundamental analysis. Technical analysis reveals the peaks, bottoms, trends, patterns and other factors affecting a stock's price movement, and buy/sell decisions are then made based on those factors. Here, two tools, the Relative Strength Index (RSI) and the Exponential Moving Average (EMA), are used to predict the index price movement of the S&P CNX Nifty.
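For reference, the sketch below computes both indicators from their standard textbook definitions, an EMA with smoothing factor 2/(n+1) and Wilder's 14-period RSI; the study's exact parameter choices for the S&P CNX Nifty are not stated, so these values are assumptions:

    # Sketch: standard EMA and RSI definitions on a toy price series.
    import pandas as pd

    def ema(prices: pd.Series, n: int = 20) -> pd.Series:
        # Exponential moving average with smoothing alpha = 2/(n+1).
        return prices.ewm(span=n, adjust=False).mean()

    def rsi(prices: pd.Series, n: int = 14) -> pd.Series:
        # Wilder's RSI: 100 - 100 / (1 + average gain / average loss).
        delta = prices.diff()
        gain = delta.clip(lower=0).ewm(alpha=1 / n, adjust=False).mean()
        loss = (-delta.clip(upper=0)).ewm(alpha=1 / n, adjust=False).mean()
        return 100 - 100 / (1 + gain / loss)

    closes = pd.Series([100, 102, 101, 105, 107, 106, 108, 110, 109, 111,
                        113, 112, 115, 114, 116, 118])
    print(ema(closes, 5).tail(3))
    print(rsi(closes, 14).tail(3))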


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Shelby John Solomon ◽  
John Harrison Batchelor

Purpose: This study aims to address the efficacy debate by exploring how prior team-level performance affects future performance, that is, whether the boost of efficacy associated with success leads to overconfidence that harms performance or to motivation that enhances it.

Design/methodology/approach: This study used a quantitative approach to test competing hypotheses derived from both social cognitive theory and control theory. Specifically, it made use of archival National Football League data containing 5,120 longitudinal team-level observations, analyzed with multi-level modeling of how prior team-level performance affected future performance episodes.

Findings: The findings suggest that prior success leads to overconfidence, which ultimately harms future team performance; they therefore support control theory over social cognitive theory. However, the detrimental effects of overconfidence could be offset by monitoring and work breaks.

Research limitations/implications: Owing to the nature of the archival data source, it was not possible to measure efficacy directly; efficacy is inferred from past performance outcomes.

Practical implications: Managers and team leaders should pay careful attention to their team after successful performances. Specifically, team leaders may want to monitor their members or give them a break after successful performance episodes to avoid the negative effects of overconfidence.

Originality/value: This paper provides a direct test of the efficacy debate at the team level.
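As a hedged illustration of the multi-level setup, the sketch below fits a mixed-effects model with a team-level random intercept and prior performance as a predictor on synthetic data; the variable names and the data-generating assumptions are placeholders for the study's NFL panel:

    # Sketch: multi-level model of repeated performances nested within teams.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    teams = np.repeat(np.arange(32), 40)          # 32 teams, 40 observations each
    team_effect = rng.normal(0, 1, 32)[teams]     # team-level random intercepts
    prior_perf = rng.normal(0, 1, teams.size)
    # Overconfidence story assumed here: prior success lowers future performance.
    future_perf = team_effect - 0.3 * prior_perf + rng.normal(0, 1, teams.size)

    df = pd.DataFrame({"team": teams, "prior": prior_perf, "future": future_perf})
    model = smf.mixedlm("future ~ prior", df, groups=df["team"]).fit()
    print(model.summary())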

