Antagonistic One-To-N Stochastic Duel Game

Mathematics ◽  
2020 ◽  
Vol 8 (7) ◽  
pp. 1114 ◽  
Author(s):  
Song-Kyoo (Amang) Kim

This paper deals with a multiple-person game model under an antagonistic duel-type setup. The unique multiple-person duel game with a one-shot-to-kill-all condition is analytically solved, and explicit formulas are obtained to determine the time-dependent duel game model by using the first exceed theory. The model can be directly applied to real-world situations, and an analogue of the theory is designed to find the best shooting time for hitting all other players at once, which optimizes the payoff function under random time conditions. It also provides a mathematical basis for building market-entry timing strategies for both blue- and red-ocean markets.
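The timing trade-off at the heart of such a duel (shoot too early and you miss; too late and an opponent fires first) can be illustrated with a toy grid search. This is a minimal sketch: the accuracy functions p and q, the payoff p(t)·(1 − q(t))^n, and the grid search are illustrative assumptions, not the paper's first-exceed-theory formulas.

```python
# Hedged sketch: a one-to-N duel in which a single shot at time t kills all
# n opponents with probability p(t), provided no opponent has fired
# successfully before t (each independently with probability q(t)).
# The functions p, q and the payoff below are illustrative assumptions,
# not the paper's first-exceed-theory formulation.

def best_shot_time(p, q, n, steps=1000):
    grid = [i / steps for i in range(steps + 1)]
    payoff = {t: p(t) * (1.0 - q(t)) ** n for t in grid}
    t_star = max(payoff, key=payoff.get)
    return t_star, payoff[t_star]

# shooter's accuracy grows linearly; each opponent's grows quadratically
t_star, value = best_shot_time(p=lambda t: t, q=lambda t: t ** 2, n=3)
```

With these toy accuracy curves the optimal shooting time lands at t ≈ 0.378, strictly before full accuracy is reached: waiting longer raises p(t) but is more than offset by the growing risk of being hit first.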

Mathematics ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 825
Author(s):  
Song-Kyoo (Amang) Kim

This paper introduces an extended version of a stochastic game under an antagonistic duel-type setup. A highly flexible multiple-person duel game is analytically solved. Moreover, explicit formulas are derived to determine the time-dependent duel game model using the first exceed theory in multiple game stages. Unlike conventional stochastic duel games, multiple battlefields are introduced for the first time, and each battlefield becomes a shooting ground for a pair of players in a multiperson game. Each player selects different targets in different game stages. An analogue of this new theory was designed to find the best shooting time within multiple battlefields. The model is fully explained mathematically and forms a basis for applying stochastic duel-type games in various practical applications.
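The stage structure (pairwise battlefields, different targets per stage) can be sketched as a rotating pairing schedule with a per-pair duel. The rotation rule and the toy payoff t·(1 − t) are illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch of the multi-battlefield structure: in each game stage,
# players are paired onto battlefields and each pair plays a simple duel.
# The rotation-based pairing and the toy payoff t*(1 - t) (accuracy vs.
# risk of being hit first) are illustrative, not the paper's model.

def stage_pairings(players, stage):
    # rotate the player list each stage so everyone faces a new target
    rotated = players[stage:] + players[:stage]
    return list(zip(rotated[0::2], rotated[1::2]))

def best_duel_time(steps=1000):
    times = [i / steps for i in range(steps + 1)]
    return max(times, key=lambda t: t * (1 - t))

players = ["A", "B", "C", "D"]
schedule = [stage_pairings(players, s) for s in range(2)]
t_star = best_duel_time()
```

Over two stages the schedule gives each player a different opponent, and the symmetric toy payoff is maximized at t = 0.5.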


2021 ◽  
Vol 11 (6) ◽  
pp. 478
Author(s):  
Ching Chang ◽  
Chien-Hao Huang ◽  
Hsiao-Jung Tseng ◽  
Fang-Chen Yang ◽  
Rong-Nan Chien

Background: Hepatic encephalopathy (HE), a neuropsychiatric complication of decompensated cirrhosis, is associated with high mortality and a high risk of recurrence. Rifaximin add-on to lactulose for 3 to 6 months is recommended for the prevention of recurrent episodes of HE after the second episode. However, whether the combination for more than 6 months is superior to lactulose alone in the maintenance of HE remission is less evident. Therefore, the aim of this study was to evaluate the one-year efficacy of rifaximin add-on to lactulose for the maintenance of HE remission in Taiwan. Methods: We conducted a real-world single-center retrospective cohort study to compare the long-term efficacy of rifaximin add-on to lactulose (group R + L) versus lactulose alone (group L, control group). Furthermore, the treatment efficacy before and after rifaximin add-on to lactulose was also analyzed. The primary endpoint of our study was time to first HE recurrence (Conn score ≥ 2). All patients were followed up every three months until death, and censored at one year if still alive. Results and Conclusions: Twelve patients were enrolled in group R + L, and another 31 patients were stratified into group L. Sex, comorbidity, ammonia level, and ascites grade were matched, while age, HE grade, and model for end-stage liver disease (MELD) score were adjusted in the multivariable logistic regression model. Compared with group L, group R + L showed significant improvement in the maintenance of HE remission and fewer episodes and days of HE-related hospitalization. Serum ammonia levels were significantly lower at the 3rd and 6th months in group R + L. Concerning changes before and after rifaximin add-on in group R + L, mini-mental status examination (MMSE) scores, episodes of hospitalization, and variceal bleeding also improved at 6 and 12 months, and days of hospitalization and serum ammonia levels also improved at the 6th month. No patients discontinued rifaximin due to adverse events or complications, although cost remained a concern. The above results provide evidence for the one-year use of rifaximin add-on to lactulose in reducing HE recurrence and HE-related hospitalization for patients with decompensated cirrhosis.


2020 ◽  
Vol 36 (S1) ◽  
pp. 37-37
Author(s):  
Americo Cicchetti ◽  
Rossella Di Bidino ◽  
Entela Xoxi ◽  
Irene Luccarini ◽  
Alessia Brigido

Introduction: Different value frameworks (VFs) have been proposed in order to translate available evidence on the risk-benefit profiles of new treatments into Pricing & Reimbursement (P&R) decisions. However, limited evidence is available on the impact of their implementation. It is relevant to distinguish between VFs proposed by scientific societies and providers, which are usually applicable to all treatments, and VFs elaborated by regulatory agencies and health technology assessment (HTA) bodies, which focus on specific therapeutic areas. Such heterogeneity in VFs has significant implications in terms of the value dimensions considered and the criteria adopted to define or support a price decision. Methods: A literature review was conducted to identify VFs already proposed or adopted for onco-hematology treatments. Both the scientific and grey literature were investigated. Then, an ad hoc data collection was conducted for multiple myeloma; breast, prostate, and urothelial cancer; and non-small cell lung cancer (NSCLC) therapies. Pharmaceutical products authorized by the European Medicines Agency from January 2014 to December 2019 were identified. The primary sources of data were European Public Assessment Reports and P&R decisions taken by the Italian Medicines Agency (AIFA) up to September 2019. Results: The analysis allowed us to define a taxonomy to distinguish categories of VF relevant to onco-hematological treatments. We identified the "real-world" VF that emerged from past P&R decisions taken at the Italian level. Data were collected both for clinical and economic outcomes/indicators, as well as for decisions taken on the innovativeness of therapies. Relevant differences emerge between the real-world value framework and the one that should be applied given the normative framework of the Italian Health System. Conclusions: The value framework that emerged from the analysis addressed specific aspects of onco-hematological treatments identified during an ad hoc analysis of treatments authorized in the last five years. The perspective adopted to elaborate the VF was that of an HTA agency responsible for P&R decisions at the national level. Furthermore, by comparing the real-world value framework with the one based on the general criteria defined by national legislation, our analysis allowed identification of the most critical points of the current national P&R process in terms of the sustainability of current and future therapies, such as advanced therapies and tumor-agnostic therapies.


Author(s):  
Gregor Selinka ◽  
Raik Stolletz ◽  
Thomas I. Maindl

Many stochastic systems face a time-dependent demand. Especially in stochastic service systems, for example, in call centers, customers may leave the queue if their waiting time exceeds their personal patience. As discussed in the extant literature, it can be useful to use general distributions to model such customer patience. This paper analyzes the time-dependent performance of a multiserver queue with a nonhomogeneous Poisson arrival process with a time-dependent arrival rate, exponentially distributed processing times, and generally distributed time to abandon. Fast and accurate performance approximations are essential for decision support in such queueing systems, but the extant literature lacks appropriate methods for the setting we consider. To approximate time-dependent performance measures for small- and medium-sized systems, we develop a new stationary backlog-carryover (SBC) approach that allows for the analysis of underloaded and overloaded systems. Abandonments are considered in two steps of the algorithm: (i) in the approximation of the utilization as a reduced arrival stream and (ii) in the approximation of waiting-based performance measures with a stationary model for general abandonments. To improve the approximation quality, we discuss an adjustment to the interval lengths. We present a limit result that indicates convergence of the method for stationary parameters. The numerical study compares the approximation quality of different adjustments to the interval length. The new SBC approach is effective for instances with small numbers of time-dependent servers and gamma-distributed abandonment times with different coefficients of variation and for an empirical distribution of the abandonment times from real-world data obtained from a call center. A discrete-event simulation benchmark confirms that the SBC algorithm approximates the performance of the queueing system with abandonments very well for different parameter configurations. 
Summary of Contribution: The paper presents a fast and accurate numerical method to approximate the performance measures of a time-dependent queueing system with generally distributed abandonments. The presented stationary backlog-carryover approach with abandonment combines algorithmic ideas with stationary queueing models for generally distributed abandonment times. The reliability of the method is analyzed for transient systems and numerically studied with real-world data.
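The core backlog-carryover idea (piecewise-stationary analysis with unfinished work fed forward as extra arrivals) can be sketched in a few lines. This is a simplified illustration only: abandonment and the waiting-time step of the authors' algorithm are omitted, and all rates are made up.

```python
# Hedged sketch of the stationary backlog-carryover (SBC) idea: split the
# horizon into intervals, treat each interval as a stationary multiserver
# queue fed by its own arrival rate plus a "backlog" rate that carries
# unfinished work forward. Abandonment and the waiting-time step of the
# paper's algorithm are omitted; all rates here are illustrative.

def sbc_utilization(arrival_rates, mu, c, dt):
    utilizations, backlog = [], 0.0
    for lam in arrival_rates:
        lam_eff = lam + backlog / dt            # backlog re-enters as arrivals
        rho = min(lam_eff / (c * mu), 1.0)      # utilization, capped at 1
        served = rho * c * mu * dt              # work completed this interval
        backlog = max(lam_eff * dt - served, 0.0)
        utilizations.append(rho)
    return utilizations

# an overloaded middle interval pushes its excess work into the next one
util = sbc_utilization([2.0, 8.0, 3.0], mu=1.0, c=4, dt=1.0)
```

In this toy run the overloaded second interval (offered load 8 against capacity 4) saturates the servers and carries a backlog into the third interval, which therefore also runs at full utilization even though its own arrival rate is below capacity; this is exactly the transient coupling a purely stationary interval-by-interval analysis would miss.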


Author(s):  
Daniel M. Tibaduiza ◽  
Luis Barbosa Pires ◽  
Carlos Farina

Abstract In this work, we give a quantitative answer to the question: how sudden or how adiabatic is a frequency change in a quantum harmonic oscillator (HO)? We do that by studying the time evolution of an HO which is initially in its ground state and whose time-dependent frequency is controlled by a parameter (denoted by ε) that can continuously tune from a totally slow process to a completely abrupt one. We extend a solution based on algebraic methods introduced recently in the literature, which is well suited for numerical implementations, from the basis that diagonalizes the initial Hamiltonian to the one that diagonalizes the instantaneous Hamiltonian. Our results are in agreement with the adiabatic theorem, and the comparison of the descriptions in the different bases, together with the proper interpretation of this theorem, allows us to clarify a common inaccuracy present in the literature. More importantly, we obtain a simple expression that relates squeezing to the transition rate and the initial and final frequencies, from which we calculate the adiabatic limit of the transition. Analysis of these results reveals a significant difference in squeezing production between increasing and decreasing the frequency of an HO in a non-sudden way.
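The abrupt end of the ε-range has a closed-form textbook limit that is useful for orientation: a sudden jump ω₀ → ω₁ turns the ground state into a squeezed vacuum with squeezing parameter r = ½ ln(ω₁/ω₀), and its overlap with the new ground state is |⟨0′|0⟩|² = 2√(ω₀ω₁)/(ω₀ + ω₁). These are standard sudden-approximation results, not the paper's ε-dependent expressions.

```python
import math

# Hedged sketch: textbook sudden-limit results for a frequency jump
# w0 -> w1 of a harmonic oscillator initially in its ground state.
# The state becomes a squeezed vacuum with parameter r = (1/2) ln(w1/w0),
# and the probability of remaining in the (new) ground state is
# |<0'|0>|^2 = 2*sqrt(w0*w1) / (w0 + w1). These are not the paper's
# epsilon-tunable formulas, only the abrupt end of its parameter range.

def sudden_squeezing(w0, w1):
    r = 0.5 * math.log(w1 / w0)
    fidelity = 2.0 * math.sqrt(w0 * w1) / (w0 + w1)
    return r, fidelity

r, f = sudden_squeezing(w0=1.0, w1=4.0)   # quadrupling the frequency
```

Quadrupling the frequency suddenly gives r = ln 2 and an 80% probability of finding the oscillator in the new ground state; an adiabatic sweep (the other end of the ε-range) would give r → 0 and fidelity → 1.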


2019 ◽  
Author(s):  
Daniel Tang

Agent-based models are a powerful tool for studying the behaviour of complex systems that can be described in terms of multiple, interacting "agents". However, because of their inherently discrete and often highly non-linear nature, it is very difficult to reason about the relationship between the state of the model, on the one hand, and our observations of the real world on the other. In this paper we consider agents that have a discrete set of states and that, at any instant, act with a probability that may depend on the environment or the state of other agents. Given this, we show how the mathematical apparatus of quantum field theory can be used to reason probabilistically about the state and dynamics of the model, and describe an algorithm to update our belief in the state of the model in the light of new, real-world observations. Using a simple predator-prey model on a 2-dimensional spatial grid as an example, we demonstrate the assimilation of incomplete, noisy observations and show that this leads to an increase in the mutual information between the actual state of the observed system and the posterior distribution given the observations, when compared to a null model.
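The belief-update step described above reduces, for a single hidden quantity, to a plain Bayes-rule calculation. The sketch below assumes a binomial observation model (each agent detected independently with probability p_obs); this model and the uniform prior are illustrative assumptions, not the paper's field-theoretic machinery.

```python
from math import comb

# Hedged sketch of the assimilation step: maintain a discrete belief over
# a hidden count (say, predators in one grid cell) and update it by
# Bayes' rule from a noisy, incomplete observation. The binomial
# observation model is an illustrative assumption, not the paper's
# quantum-field-theoretic formulation.

def assimilate(prior, observed, p_obs):
    # likelihood of detecting `observed` agents when the true count is n
    lik = [
        comb(n, observed) * p_obs**observed * (1 - p_obs)**(n - observed)
        if n >= observed else 0.0
        for n in range(len(prior))
    ]
    post = [pr * l for pr, l in zip(prior, lik)]
    total = sum(post)
    return [p / total for p in post]

prior = [1 / 6] * 6                               # uniform over counts 0..5
post = assimilate(prior, observed=2, p_obs=0.5)   # we detected 2 agents
```

After seeing 2 agents with a 50% detection rate, counts below 2 get zero posterior mass and the belief concentrates on counts of 3-4, illustrating how an incomplete observation still sharpens the distribution over the hidden state.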


Author(s):  
Francisco Martínez Gala

This paper describes the main findings of a study performed by INSIA-UPM on improving the reconstruction process for real-world vehicle-pedestrian accidents using PC-Crash® software, aimed at developing a software tool for estimating the variability of the collision speed due to the lack of real values for some parameters required during the reconstruction task. The methodology was based on a sensitivity analysis of factor variation. A total of 9 factors were analyzed with the objective of identifying which ones were significant. Four of them (pedestrian height, collision angle, hood height, and pedestrian-road friction coefficient) were significant and were included in a full factorial experiment with the collision speed as an additional factor, in order to obtain a regression model with up to third-level interactions. Two factorial experiments with the same structure were performed to account for pedestrian gender differences. The tool was created as a collision-speed predictor based on the regression models obtained, using the 4 significant factors and the projection distance measured or estimated at the accident site. The tool was used in the analysis of real-world reconstructed accidents that occurred in the city of Madrid (Spain). The results were adequate in most cases, with less than 10% deviation between the predicted speed and the one estimated in the reconstructions.
DOI: http://dx.doi.org/10.4995/CIT2016.2016.3467
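A predictor of this kind (regression on the four significant factors plus projection distance) can be sketched with ordinary least squares on synthetic data. Everything below is illustrative: the data, the coefficients, and the purely linear form (the study's models include interactions up to third order and separate fits per pedestrian gender).

```python
import numpy as np

# Hedged sketch of a collision-speed predictor of the kind described:
# an OLS regression on the four significant factors plus projection
# distance. The data are synthetic and the coefficients are made up;
# the study's actual models are richer (third-order interactions,
# separate fits per pedestrian gender).

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(1.5, 1.9, n),     # pedestrian height (m)
    rng.uniform(0.0, 30.0, n),    # collision angle (deg)
    rng.uniform(0.7, 1.0, n),     # hood height (m)
    rng.uniform(0.4, 0.8, n),     # pedestrian-road friction coefficient
    rng.uniform(5.0, 40.0, n),    # projection distance (m)
])
true_coef = np.array([-2.0, 0.1, 5.0, 8.0, 1.2])   # made-up coefficients
speed = X @ true_coef + rng.normal(0.0, 1.0, n)    # synthetic speeds (km/h)

# ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, speed, rcond=None)

def predict_speed(height, angle, hood, friction, distance):
    return coef @ np.array([1.0, height, angle, hood, friction, distance])
```

On this synthetic data the fit recovers the projection-distance coefficient closely, which is the dominant predictor here; in the real tool the regression would instead be fitted to the PC-Crash® factorial-experiment results.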

