random behaviour
Recently Published Documents


TOTAL DOCUMENTS: 43 (five years: 10)

H-INDEX: 8 (five years: 1)

Author(s):  
Ali Gezer

Delay-related metrics are significant quality-of-service criteria for the performance evaluation of networks. Almost all delay-related measurement and analysis studies consider the reachable sources of the Internet. However, unreachable sources might also shed light on problems such as worm propagation. In this study, we carry out a delay measurement study of unreachable destinations and analyse the delay dynamics of unreachable nodes. Internet Control Message Protocol Destination Unreachable (ICMP T3) packets are considered for the delay measurement according to their code types, which indicate network unreachability, host unreachability, port unreachability, etc. Measurement results show that unreachable sources exhibit totally different delay behaviour compared to reachable IP hosts. A significant share of unreachable hosts experiences an extra 3 seconds of Round Trip Time (RTT) delay compared to accessible hosts, mostly due to host unreachability. Approximately 79% of destination unreachability is caused by host unreachability. Hurst parameter estimation results reveal that unreachable-host RTTs show a lower Hurst degree than reachable hosts, approximating random behaviour. Unreachable sources also exhibit a totally different distributional characteristic compared to accessible ones, best fitted by a Phased Bi-Exponential distribution.
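The Hurst parameter estimation mentioned in the abstract is commonly done with rescaled-range (R/S) analysis; the following is a minimal self-contained sketch of that technique (not the authors' code), applied to synthetic white noise, for which the Hurst exponent should come out near 0.5:

```python
import math
import random

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    n = len(series)
    sizes, rs_values = [], []
    size = min_chunk
    while size <= n // 2:
        rs_sum, count = 0.0, 0
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            # range of the cumulative deviations from the window mean
            cum, cums = 0.0, []
            for x in chunk:
                cum += x - mean
                cums.append(cum)
            r = max(cums) - min(cums)
            s = math.sqrt(sum((x - mean) ** 2 for x in chunk) / size)
            if s > 0:
                rs_sum += r / s
                count += 1
        if count:
            sizes.append(size)
            rs_values.append(rs_sum / count)
        size *= 2
    # least-squares slope in log-log space is the Hurst estimate
    xs = [math.log(s) for s in sizes]
    ys = [math.log(v) for v in rs_values]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

random.seed(1)
white_noise = [random.gauss(0, 1) for _ in range(2048)]
h = hurst_rs(white_noise)  # uncorrelated noise: H close to 0.5
```

An RTT series with long-range dependence would instead yield H well above 0.5, which is how the study distinguishes reachable from unreachable hosts.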


2021 ◽  
Vol 8 (10) ◽  
Author(s):  
Felix J. Nitsch ◽  
Tobias Kalenscher

Choice-consistency is considered a hallmark of rational value-based choice. However, because the cognitive apparatus supporting decision-making is imperfect, real decision-makers often show some degree of choice inconsistency. Cognitive models are necessary to complement idealized choice axioms with attention, perception and memory processes. Specifically, compelling theoretical work suggests that the (imperfect) retention of choice-relevant memories might be important for choice-consistency, but this hypothesis has not been tested directly. We used a novel multi-attribute visual choice paradigm to experimentally test the influence of memory retrieval of exemplars on choice-consistency. Our manipulation check confirmed that our retention-interval manipulation successfully reduced memory representation strength. Given this, we found strong evidence against our hypothesis that choice-consistency decreases with increasing retention time. However, quality controls indicated that the choice-consistency of our participants was indistinguishable from random behaviour. In addition, an exploratory analysis showed essentially no test–retest reliability of choice-consistency between two observations. Taken together, this suggests the presence of a floor effect in our data and, thus, data quality too low to conclusively evaluate our hypotheses. Further exploration tentatively suggested that a high difficulty of discriminating between the choice objects drove this floor effect.


2021 ◽  
Vol 18 (2) ◽  
pp. 573-584
Author(s):  
Estefanía Muñoz ◽  
Andrés Ochoa

Abstract. Solar radiation has a crucial role in photosynthesis, evapotranspiration and other biogeochemical processes. The amount of solar radiation reaching the Earth's surface is a function of astronomical geometry and atmospheric optics. While the first is deterministic, the latter has a random behaviour caused by highly variable atmospheric components such as water and aerosols. In this study, we use daily radiation data (1978–2014) from 37 FLUXNET sites distributed across the globe to inspect for climatic traits in the shape of the probability density function (PDF) of the clear-day (c) and clearness (k) indices. The analysis was made for shortwave radiation (SW) at all sites and for photosynthetically active radiation (PAR) at 28 sites. We identified three types of PDF: unimodal with low dispersion (ULD), unimodal with high dispersion (UHD) and bimodal (B), with no difference in PDF type between c and k at each site. Looking for regional patterns in the PDF type, we found that latitude, global climate zone and Köppen climate type have a weak relation, and the Holdridge life zone a stronger relation, with the c and k PDF types. The existence and relevance of a second mode in the PDF can be explained by the frequency and meteorological mechanisms of rainy days. These results are a frame to develop solar radiation stochastic models for biogeochemical and ecohydrological modelling.
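The clearness index k in the abstract is the ratio of measured surface radiation to extraterrestrial radiation at the top of the atmosphere. A minimal sketch of that computation, using the standard daily extraterrestrial-radiation formula (FAO-56 / Duffie-Beckman form; the study's exact implementation may differ):

```python
import math

def extraterrestrial_daily(lat_deg, doy):
    """Daily extraterrestrial radiation H0 in MJ m^-2 day^-1
    for a latitude (degrees) and day of year."""
    Gsc = 0.0820  # solar constant, MJ m^-2 min^-1
    # inverse relative Earth-Sun distance and solar declination (rad)
    dr = 1 + 0.033 * math.cos(2 * math.pi * doy / 365)
    delta = 0.409 * math.sin(2 * math.pi * doy / 365 - 1.39)
    phi = math.radians(lat_deg)
    # sunset hour angle, clamped for polar day/night
    ws = math.acos(max(-1.0, min(1.0, -math.tan(phi) * math.tan(delta))))
    return (24 * 60 / math.pi) * Gsc * dr * (
        ws * math.sin(phi) * math.sin(delta)
        + math.cos(phi) * math.cos(delta) * math.sin(ws)
    )

def clearness_index(measured_mj, lat_deg, doy):
    """k = measured daily surface radiation / extraterrestrial radiation."""
    return measured_mj / extraterrestrial_daily(lat_deg, doy)

# equatorial site near the March equinox: H0 is close to its ~38 MJ/m^2 peak
h0 = extraterrestrial_daily(0.0, 80)
k = clearness_index(20.0, 0.0, 80)  # a fairly cloudy day, k ~ 0.5
```

The deterministic denominator removes the astronomical signal, so the PDF of k isolates exactly the random atmospheric behaviour the study analyses.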


Author(s):  
E. Sofos

Abstract The sequence of prime numbers p for which a variety over ℚ has no p-adic point plays a fundamental role in arithmetic geometry. This sequence is deterministic; however, we prove that if we choose a typical variety from a family, then the sequence has random behaviour. We furthermore prove that this behaviour is modelled by a random walk in Brownian motion. This has several consequences, one of them being the description of the finer properties of the distribution of the primes in this sequence via the Feynman–Kac formula.


2020 ◽  
Vol 180 ◽  
pp. 03001
Author(s):  
Petru Cardei ◽  
Raluca Sfiru ◽  
Vergil Muraru

The subject of this article is the estimation of water erosion as given by different sources over a history of more than one hundred years of observations. The differences between estimates made at near times, or at appreciably different times, are examined both as approximations of the random behaviour of the factors involved in the water-erosion process and as reflections of changes over time in the intensity of those factors. So-called climate changes, characterized mainly by apparently (at the scale of a human life) nonperiodic changes of meteorological factors, also affect factors involved in water erosion that are not themselves meteorological parameters, such as soil erodibility and the geometric parameters of slopes; by extension, there are effects on vegetal cover and management parameters as well. From this point of view, the influencing factors of mathematical models for predicting water erosion should be recalculated or periodically reviewed.


Mind ◽  
2019 ◽  
Author(s):  
Thomas Icard

Abstract When does it make sense to act randomly? A persuasive argument from Bayesian decision theory legitimizes randomization essentially only in tie-breaking situations. Rational behaviour in humans, non-human animals, and artificial agents, however, often seems indeterminate, even random. Moreover, rationales for randomized acts have been offered in a number of disciplines, including game theory, experimental design, and machine learning. A common way of accommodating some of these observations is by appeal to a decision-maker’s bounded computational resources. Making this suggestion both precise and compelling is surprisingly difficult. Toward this end, I propose two fundamental rationales for randomization, drawing upon diverse ideas and results from the wider theory of computation. The first unifies common intuitions in favour of randomization from the aforementioned disciplines. The second introduces a deep connection between randomization and memory: access to a randomizing device is provably helpful for an agent burdened with a finite memory. Aside from fit with ordinary intuitions about rational action, the two rationales also make sense of empirical observations in the biological world. Indeed, random behaviour emerges more or less where it should, according to the proposal.


Monte Carlo simulation depends on the random behaviour of events. When a variable takes values at random and becomes highly unpredictable due to this randomness, the properties of random numbers can be used to predict the future values the variable may take. This applies to predicting share-price movements when past share prices exhibit random behaviour without high fluctuations. This article explains the methodology of using Monte Carlo simulation for predicting share-price movements and illustrates the process with the monthly share-price data of ITC Limited over a period of 36 months, during which the share prices moved within a narrow band. Findings of the analysis show that the approach works well and that the prediction is reasonably accurate, showing only minor deviation from the actual prices.
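The general procedure the abstract describes can be sketched as follows: estimate the distribution of historical monthly log-returns, then draw random returns from it to simulate future prices. This is a generic Monte Carlo sketch under assumed normally distributed returns, with hypothetical narrow-band prices standing in for the ITC Limited data (the article's exact procedure may differ):

```python
import math
import random

def simulate_price(history, horizon, n_paths=2000, seed=42):
    """Monte Carlo forecast of the expected share price `horizon`
    months ahead, sampling log-returns with the historical mean
    and standard deviation."""
    rets = [math.log(b / a) for a, b in zip(history, history[1:])]
    mu = sum(rets) / len(rets)
    sigma = math.sqrt(sum((r - mu) ** 2 for r in rets) / (len(rets) - 1))
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        price = history[-1]
        for _ in range(horizon):
            price *= math.exp(rng.gauss(mu, sigma))  # one random month
        finals.append(price)
    return sum(finals) / n_paths  # average over simulated paths

# hypothetical monthly closes moving in a narrow band (not ITC data)
history = [200, 204, 199, 202, 205, 201, 203, 206, 202, 204, 207, 205]
forecast = simulate_price(history, horizon=3)
```

Because the historical band is narrow, sigma is small and the simulated forecast stays close to the last observed price, which is the regime in which the article finds the method reasonably accurate.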


2019 ◽  
Vol 8 (3) ◽  
pp. 4084-4089

This paper proposes a Poisson process-based algorithm to carry out content-level deduplication for streaming data. Since Poisson processes count events happening over a period of time and space, they are appropriate for identifying duplications of data as they are streamed, based on time and space, allowing the deduplication process to be carried out in tandem. Previous research on deduplication has focused on file-level and block-level deduplication, while the focus can be brought to the content level as data are streamed live and become more dynamic. With this approach, content-level deduplication allows the data to be scanned intelligently while saving deduplication time. Streaming data also has an innate randomness, and Poisson process-based deduplication addresses this random behaviour of data transfer and works efficiently in a dynamically connected environment. The proposed method identifies the unique data to store in the database. Experimental results show that the Poisson process-based algorithm produces 0.912 Area Under Curve (AUC) accuracy on real-world streaming data; an AUC greater than 0.8 indicates good performance. The machine intelligence-based deduplication model thus produced reliable and robust deduplication on streaming data compared to existing approaches.
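The core content-level deduplication step (independently of the paper's Poisson-process scheduling, whose details are not given here) can be sketched as hashing each streamed chunk and storing only content not seen before; the class name and stream below are illustrative, not from the paper:

```python
import hashlib

class StreamDeduplicator:
    """Content-level deduplication over a stream: each chunk is
    fingerprinted with SHA-256 and only previously unseen content
    is passed through to storage."""

    def __init__(self):
        self.seen = set()    # fingerprints of content already stored
        self.stored = []     # unique chunks kept for the database

    def ingest(self, chunk: bytes) -> bool:
        """Return True if the chunk was new and stored, False if duplicate."""
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in self.seen:
            return False     # duplicate content: skip storing
        self.seen.add(digest)
        self.stored.append(chunk)
        return True

dedup = StreamDeduplicator()
stream = [b"alpha", b"beta", b"alpha", b"gamma", b"beta"]
results = [dedup.ingest(c) for c in stream]
# results -> [True, True, False, True, False]; 3 unique chunks stored
```

Counting the False results over time windows would give exactly the duplicate-arrival counts that a Poisson-process model, as in the paper, could then describe.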


2019 ◽  
Author(s):  
Benjamin James Dyson ◽  
Cecile Musgrave ◽  
Cameron Rowe ◽  
Rayman Sandhur

Abstract To examine the behavioural and neural interactions between objective and subjective performance during competitive decision-making, participants completed a Matching Pennies game where win rates were fixed within three conditions (win > lose, win = lose, win < lose) and outcomes were predicted at each trial. Using random behaviour as the hallmark of optimal performance, we observed item (heads), contingency (win-stay, lose-shift) and combinatorial (HH, HT, TH, TT) biases across all conditions. Higher-quality behaviour, represented by a reduction in combinatorial bias, was observed during high win-rate exposure. In contrast, over-optimism biases were observed only in conditions where win rates were equal to, or less than, loss rates. At a group level, a neural measure of outcome evaluation (feedback-related negativity; FRN) indexed the binary distinction between positive and negative outcome. At an individual level, increased belief in successful performance accentuated FRN amplitude differences between wins and losses. Taken together, the data suggest that objective experiences of, or subjective beliefs in, the predominance of positive outcomes are mutual attempts to self-regulate performance during competition. In this way, increased exposure to positive outcomes (real or imagined) helps to weight the output of the more diligent and analytic System 2 relative to the impulsive and intuitive System 1.


2019 ◽  
Author(s):  
Benjamin James Dyson ◽  
Ben Albert Steward ◽  
Tea Meneghetti ◽  
Lewis Forder

Abstract To understand the boundaries we set for ourselves in terms of environmental responsibility during competition, we examined a neural index of outcome valence (feedback-related negativity; FRN) in relation to earlier indices of visual attention (N1), later indices of motivational significance (P3), and eventual behaviour. In Experiment 1 (n=36), participants either were (play) or were not (observe) responsible for action selection. In Experiment 2 (n=36), opponents additionally either could (exploitable) or could not (unexploitable) be beaten. Various failures in reinforcement-learning expression were revealed, including large-scale approximations of random behaviour. Against unexploitable opponents, N1 determined the extent to which negative and positive outcomes were perceived as distinct categories by FRN. Against exploitable opponents, FRN determined the extent to which P3 generated neural gain for future events. Differential activation of the N1 – FRN – P3 processing chain provides a framework for understanding the behavioural dynamism observed during competitive decision-making.

