Assessment of the prediction error in a large-scale application of a dynamic soil acidification model

2002 ◽  
Vol 16 (4) ◽  
pp. 279-306 ◽  
Author(s):  
J. Kros ◽  
J. P. Mol-Dijkstra ◽  
E. J. Pebesma


2021 ◽  
Author(s):  
Rachit Dubey ◽  
Mark K Ho ◽  
Hermish Mehta ◽  
Tom Griffiths

Psychologists have long been fascinated with understanding the nature of Aha! moments, moments when we transition from not knowing to suddenly realizing the solution to a problem. In this work, we present a theoretical framework that explains when and why we experience Aha! moments. Our theory posits that during problem-solving, in addition to solving the problem, people also maintain a meta-cognitive model of their ability to solve the problem, as well as a prediction of how long it will take them to solve it. Aha! moments arise when we experience a positive error in this meta-cognitive prediction, i.e. when we solve a problem much faster than we expected to. We posit that this meta-cognitive error is analogous to a positive reward prediction error, thereby explaining why we feel so good after an Aha! moment. A large-scale pre-registered experiment on anagram solving supports this theory, showing that people's time prediction errors are strongly correlated with their ratings of an Aha! experience while solving anagrams. A second experiment provides further evidence for our theory by demonstrating a causal link between time prediction errors and the Aha! experience. These results highlight the importance of meta-cognitive prediction errors and deepen our understanding of human meta-reasoning.
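The core quantity in this account, the meta-cognitive time prediction error, can be written in a few lines. The Python sketch below is purely illustrative and is not the authors' analysis code; the function name aha_signal and the time units are assumptions for this example.

```python
def aha_signal(predicted_solve_time, actual_solve_time):
    """Meta-cognitive time prediction error as described in the abstract:
    positive when the problem is solved faster than predicted. The theory
    links large positive values to a strong Aha! experience, analogous to
    a positive reward prediction error. Scaling here is illustrative only.
    """
    return predicted_solve_time - actual_solve_time

# Example: predicted 60 s to solve an anagram, actually solved it in 12 s,
# giving a large positive error (48 s) that the theory associates with Aha!.
print(aha_signal(60.0, 12.0))  # 48.0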


2021 ◽  
Vol 18 (6) ◽  
pp. 825-833
Author(s):  
Qinghan Wang ◽  
Yang Liu ◽  
Cai Liu ◽  
Zhisheng Zheng

Deconvolution improves the resolution of seismic data mainly by compressing seismic wavelets, which is of great significance in high-resolution processing of seismic data. Prediction-error filtering/least-squares inverse filtering is widely used in seismic deconvolution and usually assumes that seismic data are stationary. Affected by factors such as earth filtering, actual seismic wavelets are time- and space-varying. Adaptive prediction-error filters are designed to characterise the nonstationarity of seismic data effectively by using iterative methods; however, iteration leads to slow computation and high memory cost when dealing with large-scale data. We propose an adaptive deconvolution method based on a streaming prediction-error filter. Instead of relying on slow iterations, the method analytically solves underdetermined problems with new local smoothness constraints to predict time-varying seismic wavelets. To avoid discontinuity of the deconvolution results along the space axis, both time and space constraints are used to implement multichannel adaptive deconvolution. Meanwhile, we define a time-varying prediction-step parameter that preserves the relative amplitude relationship among different reflections. The new deconvolution improves resolution along the time direction while reducing computational cost through streaming computation, making it suitable for handling nonstationary large-scale data. Tests on a synthetic model and field data show that the proposed method effectively improves the resolution of nonstationary seismic data while maintaining the lateral continuity of seismic events. Furthermore, the relative amplitude relationship of different reflections is reasonably preserved.
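As a rough illustration of the streaming idea, the sketch below updates a one-dimensional prediction-error filter sample by sample using the closed-form solution of a locally regularised least-squares problem, so no iterative solver is involved. It is a single-channel toy version under assumed notation (filter order `order`, regularisation weight `eps`), not the paper's multichannel, time- and space-constrained formulation.

```python
import numpy as np

def streaming_pef(data, order=3, eps=1.0):
    """Toy 1-D streaming prediction-error filter.

    For each new sample d[t], the coefficients a are replaced by the
    closed-form minimiser of |d[t] + a.d_prev|^2 + eps^2*|a - a_old|^2,
    so the update is analytic rather than iterative. The returned residual
    plays the role of the deconvolved (whitened) trace in this toy setting.
    """
    data = np.asarray(data, dtype=float)
    a = np.zeros(order)                       # prediction coefficients
    errors = np.zeros_like(data)
    for t in range(order, len(data)):
        d_prev = data[t - order:t][::-1]      # most recent samples first
        r = data[t] + a @ d_prev              # error with the old filter
        a = a - r * d_prev / (eps**2 + d_prev @ d_prev)  # analytic local update
        errors[t] = data[t] + a @ d_prev      # residual after the update
    return errors, a
```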


Neuron ◽  
2018 ◽  
Vol 100 (5) ◽  
pp. 1252-1266.e3 ◽  
Author(s):  
Zenas C. Chao ◽  
Kana Takaura ◽  
Liping Wang ◽  
Naotaka Fujii ◽  
Stanislas Dehaene

1995 ◽  
Vol 3 (3-4) ◽  
pp. 262-276 ◽  
Author(s):  
Bernhard Ulrich

The elasticity (nutrient storage, litter decomposition, bioturbation of soil) and diversity of central European forest ecosystems have been reduced by centuries of overutilization. Since the middle of the nineteenth century, their development has been influenced by silvicultural measures, as well as by the deposition of acids and nutrients, especially nitrogen from anthropogenic sources, i.e., by a mixture of stabilizing and destabilizing external influences. During recent decades, most forest soils have been acidified by acid deposition, resulting in low levels of nutrient cations and negative alkalinity in the soil solution. Widespread acute acidification of soil in the rooting zone is indicated by extremely high manganese (Mn) contents in leaves (fingerprint). Soil acidification has caused drastic losses of fine roots in the subsoil, indicated by denuded structural root systems in which adventitious fine root complexes exist only sporadically. Research at the organ (leaf, fine root, mycorrhiza) and cellular levels has provided much information on the effects of air pollutants and soil acidification on leaves and roots. There are considerable uncertainties, however, as to how changes in the status of leaves or roots are processed within the tree and ecosystem from one level of hierarchy to the next on an increasing spatial and time scale, and how these lead to decline symptoms such as crown thinning, stand opening (as a consequence of dieback or perturbations), and changes in species composition (soil biota, ground vegetation, tree regeneration). At the tree level, nutrient imbalances (due to cation losses from soil, changes in the acid/base status of the soil, proton buffering in leaves, and N deposition), as well as disturbances in the transport system of assimilates and water, are suspected of causing the decline symptoms. Information on the filtering mechanisms at various hierarchical levels, especially in the case of a break in the hierarchy, is missing. The null hypothesis (no effects of air pollutants on forest ecosystems) can be considered falsified. Forest ecosystems are in transition. The current state of knowledge is not sufficient to define precisely the final state that will be reached, given continuously changing environmental conditions and human impacts. The hypothesis of large-scale forest dieback in the near future, however, is not backed by data and can be discarded.
Key words: forest ecosystem, process hierarchy, air pollution, deposition, acidity, nitrogen.


Author(s):  
Tomer Lange ◽  
Joseph (Seffi) Naor ◽  
Gala Yadgar

Flash-based solid state drives (SSDs) have gained a central role in the infrastructure of large-scale datacenters, as well as in commodity servers and personal devices. The main limitation of flash media is its inability to support update-in-place: after data has been written to a physical location, it has to be erased before new data can be written to it. Moreover, SSDs support read and write operations in granularity of pages, while erasures are performed on entire blocks, which often contain hundreds of pages. When erasing a block, any valid data it stores must be rewritten to a clean location. Since an SSD eventually wears out as the number of erasures grows, the efficiency of the management algorithm has a significant impact on its endurance. In this paper we first formally define the SSD management problem. We then explore this problem from an algorithmic perspective, considering it in both offline and online settings. In the offline setting, we present a near-optimal algorithm that, given any input, performs a negligible number of rewrites (relative to the input length). We also discuss the hardness of the offline problem. In the online setting, we first consider algorithms that have no prior knowledge about the input. We prove that no deterministic algorithm outperforms the greedy algorithm in this setting, and discuss the possible benefit of randomization. We then augment our model, assuming that each request for a page arrives with a prediction of the next time the page is updated. We design an online algorithm that uses such predictions, and show that its performance improves as the prediction error decreases. We also show that the performance of our algorithm is never worse than that guaranteed by the greedy algorithm, even when the prediction error is large. We complement our theoretical findings with an empirical evaluation of our algorithms, comparing them with the state-of-the-art scheme. The results confirm that our algorithms exhibit improved performance for a wide range of input traces.
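The greedy baseline referenced in the abstract can be stated compactly: when a clean block is needed, erase the block currently holding the fewest valid pages, since that minimises the pages that must be rewritten. The sketch below uses an assumed toy representation (a `blocks` mapping from block ids to sets of valid page ids); the paper's prediction-augmented algorithm is not reproduced here.

```python
from typing import Dict, Set

def choose_victim_greedy(blocks: Dict[int, Set[int]]) -> int:
    """Greedy victim selection: pick the block with the fewest valid pages,
    so erasing it forces the fewest rewrites. Toy model of the greedy
    baseline discussed in the abstract, not the authors' full algorithm.
    """
    return min(blocks, key=lambda b: len(blocks[b]))

# Example: block 2 holds only one valid page, so it is chosen for erasure.
print(choose_victim_greedy({0: {1, 2, 3}, 1: {4, 5}, 2: {6}}))  # 2
```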


Author(s):  
Andreea Visan ◽  
Mihai Istin ◽  
Florin Pop ◽  
Valentin Cristea

The state prediction of resources in large-scale distributed systems is an important aspect of resource allocation, system evaluation, and autonomic control. The paper presents advanced techniques for resource state prediction in large-scale distributed systems, including techniques based on bio-inspired algorithms such as neural networks improved with genetic algorithms. The approach adopted in this paper consists of a new fitness function whose main goal is prediction error minimization. The proposed prediction techniques are based on monitoring data aggregated in a history database. The experimental scenarios consider the ALICE experiment, active at CERN. Compared with classical prediction algorithms based on average or random methods, the authors obtain a 73% improvement in prediction error. This improvement is important for the functionality and performance of resource management systems in large-scale distributed systems in the case of remote control or advance reservation and allocation.
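The abstract's central design choice, a fitness function aimed at prediction error minimization, can be sketched as follows. The network evaluator `predict`, the mean-squared-error measure, and the mapping from error to fitness are assumptions made for illustration; the abstract does not specify them.

```python
import numpy as np

def fitness(weights, history_inputs, history_targets, predict):
    """Illustrative GA fitness for a candidate network parameterisation:
    lower prediction error on the monitoring history means higher fitness.
    `predict(weights, x)` is an assumed hook that runs the candidate
    neural network on one sample from the history database.
    """
    preds = np.array([predict(weights, x) for x in history_inputs])
    mse = np.mean((preds - np.asarray(history_targets)) ** 2)
    return 1.0 / (1.0 + mse)   # maps error in [0, inf) to fitness in (0, 1]
```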


2021 ◽  
pp. 110-123
Author(s):  
Chris Letheby

‘Resetting the brain’ examines the hypothesis that (i) large-scale neural networks become stuck in dysfunctional configurations in pathology, and (ii) psychedelics cause therapeutic benefits by disrupting these configurations, providing an opportunity to ‘reset’ the relevant networks into a healthier state. This chapter argues that this view is correct but limited; per Chapter 5, it needs to be supplemented with an account of these networks’ cognitive functions. To this end, the chapter introduces the predictive processing (PP) theory of cognition, which views the brain as an organ for prediction error minimisation. One PP-based theory of psychedelic action claims that (i) the networks targeted by psychedelics encode high-level beliefs, and (ii) psychedelic disruption of these beliefs provides an opportunity to revise them. This is the cognitive process that corresponds to the ‘resetting’ of neural networks. The chapter concludes by proposing that the beliefs most often revised in successful psychedelic therapy are self-related beliefs.
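For readers unfamiliar with predictive processing, the precision-weighted update below gives a minimal, purely illustrative sense of what 'prediction error minimisation' and 'relaxing' a high-level belief can mean formally. It is a generic Bayesian toy example under assumed Gaussian beliefs, not a model taken from the chapter.

```python
def revise_belief(prior_mean, prior_precision, evidence, evidence_precision):
    """Precision-weighted belief update: the revised belief is pulled toward
    the evidence in proportion to the evidence's precision relative to the
    prior's. Lowering prior_precision (one reading of a 'relaxed' high-level
    belief) lets the same evidence produce a larger revision.
    """
    total = prior_precision + evidence_precision
    return (prior_precision * prior_mean + evidence_precision * evidence) / total

# A rigid prior barely moves; a relaxed prior shifts substantially.
print(revise_belief(0.0, 10.0, 1.0, 1.0))  # ~0.09
print(revise_belief(0.0, 0.5, 1.0, 1.0))   # ~0.67
```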

