deterministic dynamic
Recently Published Documents


TOTAL DOCUMENTS: 126 (FIVE YEARS: 7)

H-INDEX: 16 (FIVE YEARS: 0)

2022 ◽  
Vol 6 (POPL) ◽  
pp. 1-29
Author(s):  
Yuanbo Li ◽  
Kris Satya ◽  
Qirun Zhang

Dyck-reachability is a fundamental formulation for program analysis, widely used to capture properly-matched-parenthesis program properties such as function calls/returns and field writes/reads. Bidirected Dyck-reachability is a relaxation of Dyck-reachability on bidirected graphs, where each edge u →(_i v labeled by an open parenthesis "(_i" is accompanied by an inverse edge v →)_i u labeled by the corresponding close parenthesis ")_i", and vice versa. In practice, many client analyses such as alias analysis adopt the bidirected Dyck-reachability formulation. Bidirected Dyck-reachability admits an optimal reachability algorithm: given a graph with n nodes and m edges, it computes all-pairs reachability information in O(m) time. This paper focuses on the dynamic version of bidirected Dyck-reachability. In particular, we consider the problem of maintaining all-pairs Dyck-reachability information in bidirected graphs under a sequence of edge insertions and deletions. Dynamic bidirected Dyck-reachability can formulate many program analysis problems in the presence of code changes. Unfortunately, solving dynamic graph reachability problems is challenging: even for maintaining transitive closure, the fastest deterministic dynamic algorithm requires O(n²) update time to achieve O(1) query time, and all-pairs Dyck-reachability is a generalization of transitive closure. Despite extensive research on incremental computation, there has been no algorithmic development on dynamic graph algorithms for program analysis with worst-case guarantees. Our work fills this gap and proposes the first dynamic algorithm for Dyck-reachability on bidirected graphs. Our dynamic algorithm handles each graph update (i.e., edge insertion or deletion) in O(n·α(n)) time and supports any all-pairs reachability query in O(1) time, where α(n) is the inverse Ackermann function.
We have implemented and evaluated our dynamic algorithm on an alias analysis and a context-sensitive data-dependence analysis for Java. We compare our dynamic algorithm against a straightforward approach based on the O(m)-time optimal bidirected Dyck-reachability algorithm and against a recent incremental Datalog solver. Experimental results show that our algorithm achieves orders-of-magnitude speedups over both approaches.
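The core merging idea behind bidirected Dyck-reachability can be sketched with a union-find structure: whenever two edges carrying the same parenthesis label leave the same node, their endpoints collapse into one reachability class, and merges can cascade. The Python sketch below is a minimal static illustration of that idea (the edge format and label encoding are our own choices, not the paper's API); the paper's dynamic algorithm additionally maintains such a structure under edge insertions and deletions.

```python
class DSU:
    """Union-find with path halving."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

def bidirected_dyck_classes(n, open_edges):
    """open_edges: tuples (u, label, v) meaning u -(label-> v; the inverse
    close-parenthesis edge v -)label-> u is implicit (bidirectedness).
    Returns a DSU whose classes are the bidirected Dyck-reachability classes."""
    dsu = DSU(n)
    out = [dict() for _ in range(n)]   # per root: directed label -> one target
    pending = []                       # node pairs discovered to be equivalent
    for u, lab, v in open_edges:
        # each bidirected edge yields one open and one close directed edge
        for src, key, dst in ((u, ('open', lab), v), (v, ('close', lab), u)):
            r = dsu.find(src)
            if key in out[r]:
                pending.append((out[r][key], dst))  # same node, same label
            else:
                out[r][key] = dst
    while pending:
        a, b = pending.pop()
        ra, rb = dsu.find(a), dsu.find(b)
        if ra == rb:
            continue
        if len(out[ra]) > len(out[rb]):  # merge smaller label map into larger
            ra, rb = rb, ra
        dsu.parent[ra] = rb
        for key, tgt in out[ra].items():
            if key in out[rb]:
                pending.append((tgt, out[rb][key]))  # cascading collision
            else:
                out[rb][key] = tgt
        out[ra].clear()
    return dsu
```

Two nodes are then reachable exactly when `find` returns the same root, giving the O(1) query the abstract describes.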



SoftwareX ◽  
2021 ◽  
Vol 14 ◽  
pp. 100690
Author(s):  
Federico Miretti ◽  
Daniela Misul ◽  
Ezio Spessa


2021 ◽  
Author(s):  
Quan Dau ◽  
David Dorchies ◽  
Jean-Claude Bader

<p>Effective optimisation methods have emerged over the last few decades to deal with the management of multiple reservoirs serving multiple, and often conflicting, objectives. Despite the abundant literature on the subject, the practical use of these techniques in the field remains very limited, because they are perceived as “black boxes” whose behaviour is difficult for users and decision-makers to understand (Pianosi et al. 2020).</p><p>Optimisation using one or more aggregated objectives can create stakeholder reluctance when stakeholders do not recognize their values and objectives in the optimisation formulation, while also raising ethical concerns related to the inclusion of undesirable and/or hidden trade-offs. In contrast, an approach considering many non-aggregated objectives has the potential to bring out alternative courses of action that better reflect the diverging perspectives of stakeholders and align better with ethical concerns (Kasprzyk et al. 2016).</p><p>To deal with this problem, we here follow Wierzbicki's (1979) "reference objective" concept, considering each single objective as a utopia point optimised separately by deterministic dynamic programming. The optimisation, taking into account given hydroclimatic conditions and a chosen set of constraints, provides yearly probabilistic upper or lower rule curves reflecting the risk of failing to achieve each objective in the future (Bader 1992). To use these data, we have developed a graphical user interface based on an R Shiny application showing the probability of future failure of each objective depending on the calendar day and the current or forecast storage state of each reservoir.</p><p>This framework is applied to the Seine catchment area in Paris, France, which includes a system of 4 large reservoirs protecting against floods and water shortages, for multiple flow thresholds and multiple locations downstream of the reservoirs. Historical datasets as well as climate change projections are used to take into account the non-stationary nature of hydroclimatic conditions. Among other applications, this example shows the utility of such a tool for justifying stakeholders' decisions to discard minor objectives when they undermine the chances of success of major objectives in critical situations.</p><p>References</p><p>----------</p><p>Bader, J.-C., 1992. Consignes de gestion du barrage à vocation multiple de Manantali: détermination des cotes limites à respecter dans la retenue [Multiple use management of Manantali Dam: determination of limiting storage levels]. Hydrologie Continentale 7, 3–12.</p><p>Kasprzyk, J.R., Reed, P.M., Hadka, D.M., 2016. Battling Arrow’s Paradox to Discover Robust Water Management Alternatives. Journal of Water Resources Planning and Management 142, 04015053. https://doi.org/10.1061/(ASCE)WR.1943-5452.0000572</p><p>Pianosi, F., Dobson, B., Wagener, T., 2020. Use of Reservoir Operation Optimization Methods in Practice: Insights from a Survey of Water Resource Managers. Journal of Water Resources Planning and Management 146, 02520005. https://doi.org/10.1061/(ASCE)WR.1943-5452.0001301</p><p>Wierzbicki, A.P., 1979. The Use of Reference Objectives in Multiobjective Optimization - Theoretical Implications and Practical Experience (No. WP-79-66). International Institute for Applied Systems Analysis, Laxenburg, Austria.</p>
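Deterministic dynamic programming for a reservoir typically runs backward in time, with discretized storage as the state and release as the decision. The toy single-reservoir sketch below shows the shape of that recursion; all numbers and the quadratic shortfall penalty are illustrative assumptions, not values from the Seine system, and the abstract's method additionally derives probabilistic rule curves from many such optimisations.

```python
def reservoir_dp(inflows, capacity, target_release, n_levels=5):
    """Backward deterministic DP for a toy single-reservoir problem.
    State: discretized storage level; decision: release volume per step.
    Stage cost: squared shortfall below the downstream flow target."""
    levels = [capacity * i / (n_levels - 1) for i in range(n_levels)]
    T = len(inflows)
    INF = float("inf")
    value = [0.0] * n_levels          # terminal cost-to-go is zero
    policy = []                       # policy[t][state] = best release
    for t in reversed(range(T)):
        new_value = [INF] * n_levels
        step_policy = [0.0] * n_levels
        for si, s in enumerate(levels):
            for sj, s_next in enumerate(levels):
                # water balance: release closes the gap between storages
                release = s + inflows[t] - s_next
                if release < 0:
                    continue          # cannot release negative water
                cost = max(0.0, target_release - release) ** 2 + value[sj]
                if cost < new_value[si]:
                    new_value[si] = cost
                    step_policy[si] = release
        value = new_value
        policy.append(step_policy)
    policy.reverse()
    return value, policy
```

The resulting `value` table is the cost-to-go from each storage level, which is the kind of state-and-date information the abstract's rule curves summarize as a risk of failure.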



2021 ◽  
Vol 9 (2) ◽  
pp. 192
Author(s):  
Candela Maceiras ◽  
José M. Pérez-Canosa ◽  
Diego Vergara ◽  
José A. Orosa

The present paper presents an original study of more than 163 ship accidents in Spain, showing which of the commonly employed variables are related to each type of vessel accident, motivated by the lack of information for this region. To this end, research was carried out based on the reports of the Spanish Commission for Investigation of Maritime Accidents and Incidents (CIAIM). Detailed combinatory ANOVA analyses and Bayesian network results showed good agreement with studies of other regions, but with some particularities for each type of accident analyzed. In particular, ship length emerged as the most relevant variable for differentiating types of accidents, while both the year of build and whether the ship met the minimum required crew proved to be excellent variables for modelling ship accidents. Nevertheless, particularities of the Spanish Search and Rescue (SAR) region emerged when identifying accidents: although variables such as visibility and sea conditions have been used in previous studies as predictors of accident occurrence, they were the worst variables for defining accidents in this region. Finally, different models relating the variables were obtained, forming the basis of a deterministic dynamic analysis. Furthermore, some indications for improving the accuracy of this line of work were obtained: revision of the CIAIM accident scales, identification of redundant variables, and the need for agreement when defining the classification limits of each identification variable.





Author(s):  
Felix Mora-Camino ◽  
Elena Capitanul Conea ◽  
Fabio Krykhtine ◽  
Walid Moudani ◽  
Carlos Alberto Nunes Cosenza

This chapter considers the use of fuzzy dual numbers to model and solve, through a dynamic programming process, mathematical programming problems where uncertainty is present in the parameters of the objective function or of the associated constraints. It is only assumed that the values of the uncertain parameters remain within known real intervals and can be modelled with fuzzy dual numbers. The interest of adopting the fuzzy dual formalism to implement the sequential decision-making process of dynamic programming is discussed and compared with early fuzzy dynamic programming. Here, the comparison between two alternatives considers not only the cumulative performance but also the cumulative risk associated with the previous steps in the dynamic process, preserving the traceability of the solution under construction, as is effectively the case with the classical deterministic dynamic programming process. The proposed approach is illustrated on a long-term airport investment planning problem.
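A fuzzy dual number can be pictured as a crisp value plus a symmetric uncertainty term that rides along with it. The minimal sketch below shows how cumulative performance and cumulative risk can both travel through a sequential decision process; the representation and the value-then-risk comparison rule are our simplifying assumptions for illustration, not the chapter's exact definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FuzzyDual:
    """value + risk·ε : 'value' is the crisp cost, 'risk' an uncertainty radius."""
    value: float
    risk: float

    def __add__(self, other):
        # both the cost and the uncertainty accumulate along a decision path
        return FuzzyDual(self.value + other.value, self.risk + other.risk)

    def better_than(self, other):
        """Prefer lower crisp cost; break ties by lower accumulated risk."""
        if self.value != other.value:
            return self.value < other.value
        return self.risk < other.risk

def best_path_cost(stage_costs):
    """Tiny sequential-decision sketch: at each stage pick the best of the
    offered fuzzy dual costs, accumulating both components so the risk of
    earlier steps stays traceable in the final answer."""
    total = FuzzyDual(0.0, 0.0)
    for options in stage_costs:
        choice = options[0]
        for cand in options[1:]:
            if cand.better_than(choice):
                choice = cand
        total = total + choice
    return total
```

Because the risk component is never collapsed into the value, the final `FuzzyDual` records both the cumulative performance and the cumulative risk, which is the traceability property the chapter emphasizes.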



2021 ◽  
pp. 243-257
Author(s):  
Sascha Müller ◽  
Adeline Jordon ◽  
Andreas Gerndt ◽  
Thomas Noll




Gut ◽  
2020 ◽  
pp. gutjnl-2020-321744
Author(s):  
Mathieu Castry ◽  
Anthony Cousien ◽  
Virginie Supervie ◽  
Annie Velter ◽  
Jade Ghosn ◽  
...  

Objective: Since the early 2000s, there has been an epidemic of HCV among men who have sex with men (MSM) living with HIV, mainly associated with high-risk sexual and drug-related behaviours. Early HCV diagnosis and treatment, together with behavioural risk reduction, may be effective in eliminating HCV among MSM living with HIV.
Design: We developed a deterministic dynamic compartmental model to simulate the impact of test-and-treat and risk-reduction strategies on the HCV epidemic (particularly on incidence and prevalence) among MSM living with HIV in France. We accounted for the HIV and HCV cascades of care, HCV natural history and heterogeneity in HCV risk behaviours. The model was calibrated to the primary HCV incidence observed between 2014 and 2017 among MSM living with HIV in care (ANRS CO4-French hospital database on HIV (FHDH)).
Results: With current French practices (annual HCV screening and immediate treatment), total HCV incidence would fall by 70%, from 0.82/100 person-years in 2015 to 0.24/100 person-years in 2030. It would decrease to 0.19/100 person-years in 2030 with more frequent screening, and to 0.19 (0.12)/100 person-years in 2030 with a 20% (50%) risk reduction. Combining screening every 3 months with a 50% risk reduction, HCV incidence would reach 0.11/100 person-years in 2030, close to the WHO target (a 90% reduction from 2015 to 2030). Similarly, HCV prevalence would decrease from 2.79% in 2015 to 0.48% in 2030 (vs 0.71% with current practices).
Conclusion: Combining test-and-treat and risk-reduction strategies could have a marked impact on the HCV epidemic, paving the way to HCV elimination among MSM living with HIV.
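A deterministic compartmental model boils down to a small system of differential equations integrated forward in time. The two-compartment toy below shows the mechanics: infection moves susceptibles into the infected compartment, while test-and-treat cures them back. The parameters, rates and initial prevalence are illustrative assumptions only, far simpler than the calibrated FHDH model in the abstract.

```python
def simulate_hcv(beta, tau, years, dt=0.01, s0=0.97, i0=0.03):
    """Toy deterministic compartmental model of HCV among MSM living with HIV.
    S: susceptible fraction, I: HCV-infected fraction.
    beta: transmission rate; tau: combined diagnosis-and-cure rate.
    Forward-Euler integration; returns the final prevalence I."""
    s, i = s0, i0
    steps = int(years / dt)
    for _ in range(steps):
        new_infections = beta * s * i   # mass-action transmission
        cured = tau * i                 # test-and-treat returns cured to S
        s += dt * (-new_infections + cured)
        i += dt * (new_infections - cured)
    return i
```

Raising `tau` (more frequent screening and immediate treatment) or lowering `beta` (behavioural risk reduction) both push prevalence down, which is the qualitative mechanism behind the incidence reductions reported in the abstract.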



2020 ◽  
Vol 158 (4) ◽  
pp. 313-325 ◽  
Author(s):  
A. E. Fleming ◽  
D. Dalley ◽  
R. H. Bryant ◽  
G. R. Edwards ◽  
P. Gregorini

Feeding fodder beet (FB) to dairy cows in early lactation has recently been adopted by New Zealand dairy producers despite limited definition of the feeding and grazing management practices that may prevent acute and sub-acute ruminal acidosis (SARA). This modelling study aimed to characterize changes in rumen pH, milk production and total discomfort from FB, and to define practical feeding strategies for a mixed herbage and FB diet. The deterministic, dynamic and mechanistic model MINDY was used to compare a factorial arrangement of FB allowance (FBA), herbage allowance (HA) and time of allocation. The FBA were 0, 2, 4 or 7 kg dry matter (DM)/cow/day (0FB, 2FB, 4FB and 7FB, respectively) and the HA were 18, 24 or 48 kg DM/cow/day above ground. All combinations were offered either in the morning, in the afternoon, or split across two equal meals. Milk production from the 2FB diets was similar to 0FB but declined by 4 and 16% when FB increased to 4 and 7 kg DM, respectively. MINDY predicted that 7FB would result in SARA and that rumen conditions were sub-optimal even at moderate FBA (pH < 5.6 for 160 and 90 min/day for 7FB and 4FB, respectively). Pareto front analysis identified that the best compromise between high milk production and low total discomfort was achieved by splitting the 2FB diet into two equal meals fed each day with 48 kg DM herbage. However, given the low milk response and high risk of acidosis, it is concluded that FB is a poor supplement for lactating dairy cows.
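Pareto front analysis of this kind keeps every diet that no other diet beats on both axes at once (more milk and less discomfort). A minimal dominance filter illustrates the selection step; the option names and numbers below are made up for illustration, not MINDY outputs.

```python
def pareto_front(options):
    """options: list of (name, milk_yield, discomfort) tuples.
    Keep the options not dominated by any other, i.e. those for which no
    alternative is at least as good on both axes and strictly better on one."""
    front = []
    for name, milk, disc in options:
        dominated = any(
            m2 >= milk and d2 <= disc and (m2 > milk or d2 < disc)
            for _, m2, d2 in options
        )
        if not dominated:
            front.append(name)
    return front
```

The decision-maker then chooses a compromise from the surviving options, as the study does in picking the split 2FB feeding with high herbage allowance.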


