An Under-Dispersed Discrete Distribution and Its Application

2021 ◽  
Vol 8 (3) ◽  
pp. 205-213
Author(s):  
Jasdev Bhatti ◽  
Mohit Kumar Kakkar

Background and Aim: With increasing demands on the reliability of industrial machines whose lifetimes follow continuous or discrete distributions, one gap stands out: in previous research on systems with more than one failure, no inspection technique has been studied that distinguishes a failed unit by the type of its failure. The aim of this paper is therefore to analyze a real industrial discrete-time problem of cold standby units arranged in parallel, with a newly introduced inspection procedure that identifies the exact failure of a failed unit and communicates it to the repairman, so that only the failed part is repaired, saving time and maintenance cost. Methods: The geometric distribution and regenerative-point techniques were applied to calculate reliability measures such as the mean time to system failure, system availability, and the inspection, repair, and failure times of a unit. Results: Graphical and analytical studies were carried out to analyze the increasing/decreasing behavior of the profit function with respect to the repair and failure rates. The system responded properly in fulfilling its basic requirements. Conclusion: The calculated values of the reliability parameters are useful for studying other models based on the same concept under different environmental conditions. It is concluded that reliability increases with the repair rate and decreases with the failure rate. The evaluated results also provide better reliability-testing strategies that can help develop new techniques for increasing system effectiveness.
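As a rough illustration of the kind of system modeled here, the following sketch estimates the mean time to system failure (MTSF) of a two-unit cold-standby system with geometric failure and repair times by Monte Carlo simulation. The rates are hypothetical placeholders; the paper instead derives such measures analytically via regenerative-point techniques and adds an inspection stage not modeled in this sketch.

```python
import random

def simulate_mtsf(p_fail=0.1, p_repair=0.4, trials=20000, seed=1):
    # Monte Carlo MTSF of a two-unit cold-standby system: per step, the
    # operating unit fails with probability p_fail (geometric lifetime)
    # and the repairman finishes a pending repair with probability
    # p_repair (geometric repair time)
    random.seed(seed)
    total = 0
    for _ in range(trials):
        t, down = 0, 0            # down = number of units currently failed
        while True:
            t += 1
            if random.random() < p_fail:
                down += 1
                if down == 2:     # standby still under repair: system fails
                    break
            if down == 1 and random.random() < p_repair:
                down = 0          # repaired unit returns as cold standby
            # simplification: a repair may complete in the same step as a
            # failure; the paper treats the transitions exactly
        total += t
    return total / trials

print(round(simulate_mtsf(), 1))
```

Increasing `p_repair` (or decreasing `p_fail`) raises the estimate, matching the qualitative conclusion of the abstract.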


2021 ◽  
Vol 15 ◽  
pp. 175346662198953
Author(s):  
Chung-Shu Lee ◽  
Shih-Hong Li ◽  
Chih-Hao Chang ◽  
Fu-Tsai Chung ◽  
Li-Chung Chiu ◽  
...  

Background: Tuberculosis (TB) remains a constant threat despite an active worldwide public health campaign. Diagnosing TB pleurisy is challenging when a pleural effusion remains of unknown origin after aspiration analysis. This study was designed to demonstrate a simple image-interpretation technique for differentiating TB pleurisy from non-TB pleurisy using semi-rigid pleuroscopy. Methods: The study retrospectively enrolled 117 patients who underwent semi-rigid pleuroscopy from April 2016 to August 2018 in a tertiary hospital. We assessed the likelihood of TB pleurisy using three simple pleuroscopic image patterns. Results: Among the 117 patients, 28 (23.9%) were diagnosed with TB pleurisy. Sago-like nodules/micronodules, adhesion, and discrete distribution were noted in 20 (71.4%), 20 (71.4%), and 19 (67.9%) patients with TB pleurisy, respectively, and in six (6.7%), 37 (41.6%), and none (0.0%) of the patients with non-TB pleurisy, respectively. The positive and negative predictive values of any two of the three pleuroscopic patterns for TB pleurisy were 100.0% and 93.7%, respectively. Conclusions: The presence of any two of the three characteristic features yielded a high positive predictive value for TB pleurisy, and the absence of all three features had an excellent negative predictive value. Our diagnostic criteria reconfirm that pleuroscopic images can serve as predictors of TB pleurisy in patients with undiagnosed pleural effusion. The reviews of this paper are available via the supplementary material section.
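The diagnostic rule itself is easy to state in code. The sketch below encodes the "any two of three features" criterion together with the standard predictive-value definitions; the feature names follow the abstract, while the counts passed to `ppv_npv` in the example are purely illustrative, not the study's data.

```python
def rule_positive(sago_nodules, adhesion, discrete_distribution):
    # study's criterion: call TB pleurisy when any two of the three
    # characteristic pleuroscopic features are present
    return (bool(sago_nodules) + bool(adhesion) + bool(discrete_distribution)) >= 2

def ppv_npv(tp, fp, tn, fn):
    # standard definitions of positive/negative predictive value
    return tp / (tp + fp), tn / (tn + fn)

print(rule_positive(True, False, True))       # two of three -> positive
ppv, npv = ppv_npv(tp=9, fp=1, tn=8, fn=2)    # illustrative counts only
print(round(ppv, 2), round(npv, 2))           # 0.9 0.8
```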


2020 ◽  
Vol 26 (2) ◽  
pp. 163-169
Author(s):  
Vladimir Nekrutkin

This paper is devoted to random-bit simulation of probability densities supported on [0, 1]. The term “random-bit” means that the source of randomness for the simulation is a sequence of symmetric Bernoulli trials. In contrast to the pioneering paper [D. E. Knuth and A. C. Yao, The complexity of nonuniform random number generation, Algorithms and Complexity, Academic Press, New York, 1976, 357–428], the proposed method requires knowledge of the probability density being simulated rather than the values of the corresponding distribution function. The method is based on the so-called binary decomposition of the density and comes down to simulating a special discrete distribution to obtain several principal bits of the output, while further bits of the output are produced by “flipping a coin”. The complexity of the method is studied and several examples are presented.
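The scheme can be illustrated with a toy version: for a density that is piecewise constant on dyadic subintervals of [0, 1), a discrete draw fixes the principal bits of the sample and fair coin flips supply the remaining bits. The interval weights below are hypothetical, and the discrete draw uses plain inversion (with a floating-point uniform) rather than the paper's binary-decomposition machinery, purely for brevity.

```python
import random

def random_bit():
    # the only source of randomness the method assumes: a fair coin
    return random.getrandbits(1)

def sample_piecewise(weights, k, refine_bits=30):
    # toy version: the density is constant on the 2**k dyadic
    # subintervals of [0, 1); `weights` are their probabilities
    # discrete part: choose subinterval j with probability weights[j]
    u, acc = random.random(), 0.0
    j = 0
    for j, w in enumerate(weights):
        acc += w
        if u < acc:
            break
    x = j / 2 ** k                       # principal bits of the sample
    # refinement: every further bit is an independent fair coin flip
    for i in range(1, refine_bits + 1):
        x += random_bit() / 2 ** (k + i)
    return x

xs = [sample_piecewise([0.5, 0.25, 0.125, 0.125], k=2) for _ in range(1000)]
print(all(0.0 <= x < 1.0 for x in xs))
```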


Author(s):  
Alessandro Barbiero ◽  
Asmerilda Hitaj

In many management-science and economic applications, it is common to represent the key uncertain inputs as continuous random variables. However, when analytic techniques fail to provide a closed-form solution to a problem, or when the computational load must be reduced, it is often necessary to resort to a problem-specific approximation technique or to approximate each continuous probability distribution by a discrete one. Many discretization methods have been proposed; in this work, we review the most popular techniques, highlighting their strengths and weaknesses, and empirically investigate their performance in a comparative study applied to a well-known engineering problem formulated as a stress–strength model. The aim is to weigh up their feasibility and accuracy in recovering the value of the reliability parameter, also with reference to the number of discrete points. Overall, the results reward a recently introduced method as the best performer: it derives the discrete approximation as the numerical solution of a constrained non-linear optimization that preserves the first two moments of the original distribution, and it provides more accurate results than an ad hoc first-order approximation technique. However, it is also the most computationally demanding, and its computation time can exceed that of Monte Carlo approximation if the number of discrete points exceeds a certain threshold.
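The moment-preservation idea can be seen in its simplest closed-form case: the classical three-point rule below matches the mean and variance of any input distribution exactly (for a Gaussian it coincides with the 3-node Gauss–Hermite rule). The paper's best-performing method generalizes this by solving a constrained non-linear optimization for an arbitrary number of points.

```python
def three_point_discretization(mu, sigma):
    # classical three-point rule: support {mu - sigma*sqrt(3), mu,
    # mu + sigma*sqrt(3)} with probabilities {1/6, 2/3, 1/6} preserves
    # the mean and variance of any distribution with that mean/std
    points = [mu - sigma * 3 ** 0.5, mu, mu + sigma * 3 ** 0.5]
    probs = [1 / 6, 2 / 3, 1 / 6]
    return points, probs

pts, pr = three_point_discretization(10.0, 2.0)
mean = sum(p * x for x, p in zip(pts, pr))
var = sum(p * (x - mean) ** 2 for x, p in zip(pts, pr))
print(round(mean, 6), round(var, 6))   # moments preserved: 10.0 4.0
```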


2021 ◽  
Vol 11 (9) ◽  
pp. 3949
Author(s):  
Jiawei Sun ◽  
Nektarios Koukourakis ◽  
Jürgen W. Czarske

Wavefront shaping through a multi-core fiber (MCF) is becoming an attractive method for endoscopic imaging and optical cell manipulation on a chip. However, the discrete distribution and low number of cores induce pixelated phase modulation, an obstacle to delivering complex light-field distributions through MCFs. We demonstrate a novel phase-retrieval algorithm, named Core–Gerchberg–Saxton (Core-GS), that employs the captured core-distribution map to retrieve a tailored modulation hologram for the targeted intensity distribution at the distal far field. Complex light fields are reconstructed through MCFs with fidelities up to 96.2%. Closed-loop control with experimental feedback demonstrates the capability of the Core-GS algorithm for precise intensity manipulation of the reconstructed light field. Core-GS provides a robust way of wavefront shaping through MCFs and helps make the MCF a vital waveguide for endoscopic and lab-on-a-chip applications.
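For readers unfamiliar with the Gerchberg–Saxton family, a minimal 1-D sketch is shown below: it alternates between the hologram plane and the far field, each time keeping the computed phase and imposing the known amplitude. Core-GS additionally restricts the hologram to the measured fiber-core positions, which this toy version does not model; the source and target amplitudes are hypothetical.

```python
import cmath

def dft(x, inverse=False):
    # naive O(n^2) discrete Fourier transform (stand-in for an FFT)
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[j] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for j in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def gerchberg_saxton(source_amp, target_amp, iters=50):
    # retrieve a source-plane phase whose far-field (Fourier) amplitude
    # approximates target_amp, by alternating amplitude substitutions
    field = [complex(a) for a in source_amp]   # start from a flat phase
    for _ in range(iters):
        far = dft(field)
        # keep the far-field phase, impose the target amplitude
        far = [t * cmath.exp(1j * cmath.phase(f)) for t, f in zip(target_amp, far)]
        near = dft(far, inverse=True)
        # keep the near-field phase, impose the source amplitude
        field = [s * cmath.exp(1j * cmath.phase(z)) for s, z in zip(source_amp, near)]
    return [cmath.phase(f) for f in field]

# uniform illumination, energy-matched two-spot target (hypothetical)
src = [1.0] * 8
tgt = [2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
phases = gerchberg_saxton(src, tgt)
hologram = [s * cmath.exp(1j * p) for s, p in zip(src, phases)]
err = sum((abs(f) - t) ** 2 for f, t in zip(dft(hologram), tgt)) ** 0.5
print(round(err, 3))
```

The residual `err` is far smaller than the error of an unshaped (flat-phase) field, which is the basic mechanism the Core-GS fidelity figure quantifies.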


Author(s):  
Nils Damaschke ◽  
Volker Kühn ◽  
Holger Nobach

The prediction and correction of systematic errors in direct spectral estimation from irregularly sampled data taken from a stochastic process is investigated. Two kinds of sampling schemes leading to such irregular sampling of the observed process are considered: stochastic sampling with non-equidistant intervals drawn from a continuous distribution, and nominally equidistant sampling with missing individual samples, which yields a discrete distribution of sampling intervals. For both continuous and discrete distributions of sampling intervals, different sampling rules are investigated. On the one hand, purely random and independent sampling times are considered; this holds only when the occurrence of one sample at a certain time has no influence on the other samples in the sequence, which excludes any preferred delay intervals or external selection processes that introduce correlations between the sampling instances. On the other hand, sampling schemes with interdependency, and thus correlation between individual sampling instances, are investigated; this applies whenever the occurrence of one sample influences further sampling instances in any way, e.g., through recovery times after an instance, preferences for certain sampling intervals (including sampling jitter), or an external correlated source influencing the validity of samples. The goal of this investigation is a bias-free estimation of the spectral content of the observed random process from such irregularly sampled data.
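A direct spectral estimator of the kind analyzed here can be written straight from its definition, with no resampling onto a regular grid. The sketch below evaluates the periodogram of a 5 Hz cosine observed at purely random, independent sampling times (hypothetical data); the systematic errors that are the paper's subject are left uncorrected in this raw estimate.

```python
import cmath
import random

def direct_spectrum(times, values, freqs):
    # periodogram evaluated directly at the irregular sample times
    n = len(times)
    spec = []
    for f in freqs:
        s = sum(v * cmath.exp(-2j * cmath.pi * f * t)
                for t, v in zip(times, values))
        spec.append(abs(s) ** 2 / n)
    return spec

# purely random, independent sampling times of a 5 Hz cosine
random.seed(0)
times = sorted(random.uniform(0.0, 10.0) for _ in range(400))
values = [cmath.cos(2 * cmath.pi * 5.0 * t).real for t in times]
freqs = [0.5 * k for k in range(1, 21)]        # 0.5 Hz ... 10 Hz
spec = direct_spectrum(times, values, freqs)
print(freqs[spec.index(max(spec))])            # peak at the true frequency
```

With independent random sampling there is no aliasing, but the estimate carries a noise floor and bias that correlated sampling schemes make worse, which is exactly what the paper quantifies.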


2021 ◽  
Author(s):  
Jialiang Zhou ◽  
Qianqian Wang ◽  
Chao Jia ◽  
Mugaanire Tendo Innocent ◽  
Weinan Pan ◽  
...  

Author(s):  
Emrah Altun ◽  
Gauss M. Cordeiro ◽  
Miroslav M. Ristić

Author(s):  
Xiaolong Xu ◽  
Zijie Fang ◽  
Lianyong Qi ◽  
Xuyun Zhang ◽  
Qiang He ◽  
...  

The Internet of Vehicles (IoV) connects vehicles, roadside units (RSUs), and other intelligent objects, enabling data sharing among them and thereby improving the efficiency and safety of urban traffic. Currently, collections of multimedia content generated by multimedia surveillance equipment, vehicles, and other sources are transmitted to edge servers for processing, because edge computing is a powerful paradigm for accommodating multimedia services with low-latency resource provisioning. However, the uneven or discrete distribution of the traffic flow covered by edge servers negatively affects their service performance (e.g., overload and underload) in multimedia IoV systems. Accurately scheduling and dynamically reserving the proper amount of resources for multimedia services on edge servers therefore remains challenging. To address this challenge, a traffic-flow-prediction-driven resource reservation method, called TripRes, is developed in this article. Specifically, the city map is divided into different regions, and the edge servers in a region are treated as one “big edge server” to simplify their complex distribution. Then, future traffic flows are predicted using the deep spatiotemporal residual network (ST-ResNet) and used to estimate the amount of multimedia services each region needs to offload to the edge servers. Given the number of services to be offloaded in each region, their offloading destinations are determined through latency-sensitive transmission-path selection. Finally, the performance of TripRes is evaluated using real-world big data with over 100M multimedia surveillance records from RSUs in Nanjing, China.
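The reservation step can be sketched in a few lines: map each region's predicted flow to a number of edge-server instances via a per-vehicle load factor and an instance capacity. All names and values below are hypothetical placeholders for quantities that TripRes obtains from ST-ResNet predictions and latency-aware path selection.

```python
import math

def reserve_instances(predicted_flows, load_per_vehicle=0.2, capacity=50.0):
    # hypothetical reservation rule: expected load = flow * per-vehicle
    # demand; reserve ceil(load / instance capacity) edge-server
    # instances per region (all parameter values are placeholders)
    return {region: math.ceil(flow * load_per_vehicle / capacity)
            for region, flow in predicted_flows.items()}

# e.g. predicted flows for two regions (hypothetical numbers)
print(reserve_instances({"R1": 1200, "R2": 300}))   # {'R1': 5, 'R2': 2}
```

The ceiling keeps each region slightly over-provisioned rather than under-provisioned, matching the paper's concern with overload.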

