Learning the marginal value of mental effort over time

2021 ◽  
Author(s):  
A Ross Otto ◽  
Senne Braem ◽  
Massimo Silvetti ◽  
Eliana Vassena

In keeping with the view that individuals invest cognitive effort in accordance with its relative costs and benefits, reward incentives typically improve performance in tasks that require cognitive effort. At the same time, increasing effort investment may confer larger or smaller performance benefits (i.e., the marginal value of effort) depending on the situation, or context. On this view, we hypothesize that the magnitude of reward-induced effort modulations should depend critically on the marginal value of effort in the given context, and furthermore, that the marginal value of effort of a context should be learned over time as a function of direct experience in that context. Using two well-characterized cognitive control tasks and simple computational models, we demonstrate that individuals appear to learn the marginal value of effort for different contexts. In a task-switching paradigm (Experiment 1), we found that participants initially exhibited reward-induced switch cost reductions across contexts (defined here by their task switch rates), but over time learned to increase effort only in contexts with a comparatively larger marginal utility of effort. Likewise, in a Flanker task (Experiment 2), we observed a similar learning effect across contexts defined by the proportion of incongruent trials. Together, these results enrich theories of cost-benefit effort decision-making by highlighting the importance of the (learned) marginal utility of cognitive effort.
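A minimal sketch of the kind of learning the abstract describes, assuming a simple delta-rule update (illustrative only; the paper's actual models are not reproduced here):

```python
class MarginalEffortLearner:
    """Toy delta-rule learner of the marginal value of effort per context.

    Illustrative sketch, not the authors' model: the agent tracks an
    estimate of the net benefit of investing effort in each context and
    scales its effort investment by that estimate.
    """

    def __init__(self, contexts, learning_rate=0.1):
        self.value = {c: 0.5 for c in contexts}  # neutral initial estimates
        self.learning_rate = learning_rate

    def effort(self, context):
        # Effort invested is proportional to the learned marginal value.
        return max(0.0, self.value[context])

    def update(self, context, reward_gain, effort_cost):
        # Delta rule: move the estimate toward the observed net benefit
        # of the effort just invested in this context.
        net_benefit = reward_gain - effort_cost
        self.value[context] += self.learning_rate * (net_benefit - self.value[context])
```

Run over many trials, the estimate grows in a context where extra effort reliably pays off and shrinks where it does not, reproducing the qualitative pattern of effort allocation diverging across contexts over time.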

Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3099
Author(s):  
V. Javier Traver ◽  
Judith Zorío ◽  
Luis A. Leiva

Temporal salience considers how visual attention varies over time. Although visual salience has been widely studied from a spatial perspective, its temporal dimension has been mostly ignored, despite arguably being of utmost importance for understanding the temporal evolution of attention on dynamic contents. To address this gap, we proposed Glimpse, a novel measure to compute temporal salience based on the observer-spatio-temporal consistency of raw gaze data. The measure is conceptually simple, training-free, and provides a semantically meaningful quantification of visual attention over time. As an extension, we explored scoring algorithms to estimate temporal salience from spatial salience maps predicted with existing computational models. However, these approaches generally fall short when compared with our proposed gaze-based measure. Glimpse could serve as the basis for several downstream tasks such as segmentation or summarization of videos. Glimpse’s software and data are publicly available.
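One way to make the measure concrete, assuming a hypothetical simplification (not the published Glimpse implementation): score each frame by how tightly the gaze positions of different observers cluster, so that moments when everyone looks at the same place receive high temporal salience.

```python
import numpy as np

def temporal_salience(gaze, eps=1e-6):
    """Per-frame salience from inter-observer gaze consistency.

    gaze: array of shape (n_observers, n_frames, 2) holding (x, y)
    gaze positions. Returns one score per frame: high when observers'
    gaze points cluster tightly, low when they scatter. Illustrative
    sketch only, not the published Glimpse code.
    """
    centroid = gaze.mean(axis=0)                      # (n_frames, 2)
    spread = np.linalg.norm(gaze - centroid, axis=2)  # (n_observers, n_frames)
    dispersion = spread.mean(axis=0)                  # mean distance per frame
    salience = 1.0 / (dispersion + eps)               # tight cluster -> high score
    return salience / salience.max()                  # normalize to [0, 1]
```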


Entropy ◽  
2021 ◽  
Vol 23 (2) ◽  
pp. 228
Author(s):  
Sze-Ying Lam ◽  
Alexandre Zénon

Previous investigations concluded that the human brain’s information processing rate remains fundamentally constant, irrespective of task demands. However, their conclusion rested on analyses of simple discrete-choice tasks. The present contribution recasts the question of human information rate within the context of visuomotor tasks, which provides a more ecologically relevant arena, albeit a more complex one. We argue that, while predictable aspects of inputs can be encoded virtually free of charge, real-time information transfer should be identified with the processing of surprises. We formalise this intuition by deriving from first principles a decomposition of the total information shared by inputs and outputs into a feedforward, predictive component and a feedback, error-correcting component. We find that the information measured by the feedback component, a proxy for the brain’s information processing rate, scales with the difficulty of the task at hand, in agreement with cost-benefit models of cognitive effort.
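A schematic form of such a decomposition, with notation assumed here for illustration (the paper's own derivation is not reproduced): writing X_t for the input, Y_t for the output, and \hat{X}_t for the prediction of the input from past observations, the chain rule for mutual information gives

```latex
I(X_t, \hat{X}_t;\, Y_t)
  \;=\; \underbrace{I(\hat{X}_t;\, Y_t)}_{\text{feedforward, predictive}}
  \;+\; \underbrace{I(X_t;\, Y_t \mid \hat{X}_t)}_{\text{feedback, error-correcting}}
```

The conditional term counts only what the output shares with the input beyond what was already predicted, i.e., the processing of surprises.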


2016 ◽  
Author(s):  
Falk Lieder ◽  
Tom Griffiths

Many contemporary accounts of human reasoning assume that the mind is equipped with multiple heuristics that could be deployed to perform a given task. This raises the question of how the mind determines when to use which heuristic. To answer this question, we developed a rational model of strategy selection, based on the theory of rational metareasoning developed in the artificial intelligence literature. According to our model, people learn to efficiently choose the strategy with the best cost-benefit tradeoff by learning a predictive model of each strategy’s performance. We found that our model can provide a unifying explanation for classic findings from domains ranging from decision-making to problem-solving and arithmetic by capturing the variability of people’s strategy choices, their dependence on task and context, and their development over time. Systematic model comparisons supported our theory, and four new experiments confirmed its distinctive predictions. Our findings suggest that people gradually learn to make increasingly rational use of fallible heuristics. This perspective reconciles the two poles of the debate about human rationality by integrating heuristics and biases with learning and rationality.
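A drastically simplified sketch of this idea (an assumed simplification: the full model learns performance predictions conditioned on task features, whereas here simple running averages stand in for them):

```python
class StrategySelector:
    """Toy rational strategy selection: learn each strategy's expected
    reward and execution time, then choose the strategy with the best
    predicted cost-benefit tradeoff. Illustrative sketch only."""

    def __init__(self, strategies, time_cost=1.0, learning_rate=0.2):
        self.reward = {s: 0.0 for s in strategies}   # predicted reward
        self.time = {s: 0.0 for s in strategies}     # predicted runtime
        self.time_cost = time_cost                   # opportunity cost per unit time
        self.learning_rate = learning_rate

    def choose(self):
        # Pick the strategy maximizing reward minus the cost of the time it takes.
        value = {s: self.reward[s] - self.time_cost * self.time[s]
                 for s in self.reward}
        return max(value, key=value.get)

    def update(self, strategy, observed_reward, observed_time):
        # Running-average updates of the performance predictions.
        self.reward[strategy] += self.learning_rate * (observed_reward - self.reward[strategy])
        self.time[strategy] += self.learning_rate * (observed_time - self.time[strategy])
```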


2021 ◽  
Vol 15 ◽  
Author(s):  
Caitlin S. Walker ◽  
Jason A. Berard ◽  
Lisa A. S. Walker

Cognitive fatigability is an objective performance decrement that occurs over time during a task requiring sustained cognitive effort. Although cognitive fatigability is a common and debilitating symptom in multiple sclerosis (MS), there is currently no standard for its quantification. The objective of this study was to validate the Paced Auditory Serial Addition Test (PASAT) discrete and regression-based normative data for quantifying performance and cognitive fatigability in an Ontario-based sample of individuals with MS. Healthy controls and individuals with MS completed the 3″ and 2″ versions of the PASAT. PASAT performance was measured with total correct, dyad, and percent dyad scores. Cognitive fatigability scores were calculated by comparing performance on the first half (or third) of the task to the last half (or third). The results revealed that the 3″ PASAT was sufficient to detect impaired performance and cognitive fatigability in individuals with MS, given the increased difficulty of the 2″ version. In addition, using halves or thirds to calculate cognitive fatigability scores was equally effective for detecting impairment. Finally, both the discrete and regression-based norms classified a similar proportion of individuals with MS as having impaired performance and cognitive fatigability. These newly validated discrete and regression-based PASAT norms provide a new tool for clinicians to document statistically significant cognitive fatigability in their patients.
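The fatigability score itself is simple arithmetic; a minimal sketch, assuming the formula implied by the description (first-block minus last-block accuracy; not the study's normative scoring code):

```python
def fatigability_score(correct_per_trial, split=2):
    """Cognitive fatigability as the drop in number correct from the
    first block of trials to the last. split=2 compares halves,
    split=3 compares thirds; positive scores indicate fatigability.
    Assumed formula, for illustration only.
    """
    block = len(correct_per_trial) // split
    first = sum(correct_per_trial[:block])   # performance early in the task
    last = sum(correct_per_trial[-block:])   # performance late in the task
    return first - last
```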


Geografie ◽  
2014 ◽  
Vol 119 (3) ◽  
pp. 218-239 ◽  
Author(s):  
Marie Štefánková ◽  
Dušan Drbohlav

The article deals with the regional and residential preferences of the Czech population. Regional and settlement preferences represent an interdisciplinary issue, relevant mostly to geography and sociology. In this article, the issue is presented under the umbrella of a broader theoretical framework in the context of Czech and foreign studies. Selected important outputs of previous research on regional and settlement preferences are discussed within this study, which makes it possible to draw a coherent picture of these issues in Czechia and their development over time. The main analysis is devoted to the current state of preferences of the Czech population and is based on a representative survey carried out in December 2010. The aim of the article is not only to compare regional and residential preferences over a period of almost 40 years, but also to juxtapose the patterns of regional preferences with the real migration movements of the Czech population.


2018 ◽  
Vol 41 (3) ◽  
pp. 278-295 ◽  
Author(s):  
Stefania Mariano

Purpose: The purpose of this study is to investigate how organizational knowledge interacts with artifacts and what determinants, driving processes and outcomes govern these interactions in organizational contexts.
Design/methodology/approach: A case study is used, with data collected from a US engineering and consulting company.
Findings: Findings suggested three major driving processes, specifically initiating, challenging and improving, along with several related determinants and outcomes that governed the interaction between organizational knowledge and artifacts over time.
Research limitations/implications: This study has limitations related to the nature and dimension of the case selected.
Practical implications: This study provides a means to explain how organizations hold existing knowledge and what determinants, driving processes and outcomes govern the interactions between knowledge and artifacts, to assist managerial practices and improve performance.
Originality/value: This paper contributes to the current debate on organizational knowledge and provides empirical evidence of how knowledge interacts with artifacts in organizational contexts.


Author(s):  
Khaled M. Elbassioni

The authors consider databases in which each attribute takes values from a partially ordered set (poset). This allows one to model a number of interesting scenarios arising in different applications, including quantitative databases, taxonomies, and databases in which each attribute is an interval representing the duration of a certain event occurring over time. A natural problem that arises in such circumstances is the following: given a database D and a threshold value t, find all collections of “generalizations” of attributes which are “supported” by fewer than t transactions from D. The authors call such collections infrequent elements. Due to monotonicity, the output size can be reduced by considering only minimal infrequent elements. The authors study the complexity of finding all minimal infrequent elements for some interesting classes of posets, and show how this problem can be applied to mining association rules in different types of databases, and to finding “sparse regions” or “holes” in quantitative data or in databases recording the time intervals during which a recurring event appears over time. Their main focus is on these applications rather than on the correctness or analysis of the given algorithms.
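A brute-force sketch of the core definitions (toy scale only; the chapter's algorithms are far more efficient, and the poset encoding here is assumed for illustration):

```python
from itertools import product

def leq(le, a, b):
    # a <= b in a poset encoded as a set of strict pairs; here "smaller"
    # means "more general" (e.g., a taxonomy ancestor sits below its
    # descendants in this ordering).
    return a == b or (a, b) in le

def support(db, g, posets_le):
    # A generalization tuple g supports transaction t when g[i] <= t[i]
    # (g[i] generalizes t[i]) for every attribute i.
    return sum(all(leq(le, gi, ti) for gi, ti, le in zip(g, t, posets_le))
               for t in db)

def minimal_infrequent(db, domains, posets_le, threshold):
    elems = list(product(*domains))
    infreq = {g for g in elems if support(db, g, posets_le) < threshold}
    # By monotonicity, the infrequent elements are upward-closed, so it
    # suffices to report those with no strictly smaller infrequent element.
    return [g for g in infreq
            if not any(h != g and
                       all(leq(le, hi, gi) for hi, gi, le in zip(h, g, posets_le))
                       for h in infreq)]
```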


2020 ◽  
Vol 31 (1) ◽  
pp. 233-247
Author(s):  
Hun S Choi ◽  
William D Marslen-Wilson ◽  
Bingjiang Lyu ◽  
Billi Randall ◽  
Lorraine K Tyler

Communication through spoken language is a central human capacity, involving a wide range of complex computations that incrementally interpret each word into meaningful sentences. However, surprisingly little is known about the spatiotemporal properties of the complex neurobiological systems that support these dynamic predictive and integrative computations. Here, we focus on prediction, a core incremental processing operation that guides the interpretation of each upcoming word with respect to its preceding context. To investigate the neurobiological basis of how semantic constraints change and evolve as the words of a sentence accumulate over time, we conducted a spoken sentence comprehension study and analyzed the multivariate patterns of neural activity recorded by source-localized electro/magnetoencephalography (EMEG), using computational models that capture the semantic constraints the prior context places on each upcoming word. Our results provide insights into predictive operations subserved by different regions within a bi-hemispheric system, which over time generate, refine, and evaluate constraints on each word as it is heard.
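One ingredient of such analyses can be sketched concretely (a hypothetical toy, not the study's models): given probabilities over candidate next words derived from the prior context, together with semantic embeddings for those candidates, the probability-weighted mean embedding summarizes the constraint, and the entropy of the distribution measures how tight it is.

```python
import numpy as np

def semantic_constraint(probs, embeddings):
    """probs: length-k probabilities over candidate continuations
    (assumed to come from a context model); embeddings: (k, d) array
    of candidate semantic vectors. Returns the constraint vector and
    its strength (lower entropy = tighter constraint). Illustrative only."""
    probs = np.asarray(probs, dtype=float)
    probs = probs / probs.sum()                        # ensure normalization
    constraint = probs @ np.asarray(embeddings)        # expected semantics
    entropy = -np.sum(probs * np.log2(probs + 1e-12))  # uncertainty in bits
    return constraint, entropy
```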


Author(s):  
Majid Molki ◽  
Avinash Deshetty ◽  
Arun Rajendran

Turbulent flow in a rectangular duct with a plate blockage attached to the lower wall was solved numerically. The computational models used were Large Eddy Simulation (LES) and two- and three-dimensional, steady and unsteady Reynolds-averaged Navier-Stokes (2D-SRANS and 3D-URANS). The fluid was air in all computations, and Reynolds numbers (Re) of 5000 and 30,000 were considered. The predictions of LES were in several ways closer to the experimental data. For the two values of Re considered in this study, the LES over-predicted the location of the maximum Nusselt number (Nu) by 24.1–24.9%, while the 3D-URANS under-predicted it by 23.7–36.8%. The best prediction for the value of maximum Nu was made by LES for Re = 5000, which was 9.3% higher than the experimental value. The LES under-predicted the maximum Nu by 24.1% for Re = 30,000. In the given range of Re, the under-predictions of 2D-SRANS and 3D-URANS for the value of maximum Nu were, respectively, 15.1–16.5% and 25.9–30.1%. As to the location of flow reattachment, the best value was predicted by the 2D-SRANS, while those of LES and 3D-URANS were close.
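For reference, the percent deviations reported above are presumably signed relative errors of the form sketched below (an assumption about the definition, stated here for clarity):

```python
def percent_deviation(predicted, experimental):
    # Signed percent deviation of a predicted quantity (e.g., peak Nu or
    # its streamwise location) from the experimental value;
    # positive = over-prediction, negative = under-prediction.
    return 100.0 * (predicted - experimental) / experimental
```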


2017 ◽  
Vol 59 (3) ◽  
Author(s):  
Tomas Karnagel ◽  
Dirk Habich

Computing hardware is constantly evolving, and database systems need to adapt to ongoing hardware changes to improve performance. The current hardware trend is heterogeneity, where multiple computing units like CPUs and GPUs are used together in one system. In this paper, we summarize our efforts to use hardware heterogeneity efficiently for query processing. We discuss different approaches to execution and investigate heterogeneous placement in detail by showing how to automatically determine operator placement decisions according to the given hardware environment and query properties.
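A minimal sketch of a cost-based placement decision of this kind (hypothetical, not the system described in the paper): estimate each device's runtime for an operator from a device-specific compute rate plus the cost of transferring non-resident operands, and place the operator on the cheapest device.

```python
def place_operator(op_cost, input_bytes, resident_on, devices):
    """devices: dict mapping device name to {'rate': ops per second,
    'bandwidth': bytes per second for transfers}. Returns the device
    with the lowest estimated runtime. Illustrative only."""
    best, best_time = None, float('inf')
    for name, spec in devices.items():
        # Pay a transfer cost only if the inputs are not already resident.
        transfer = 0.0 if resident_on == name else input_bytes / spec['bandwidth']
        runtime = op_cost / spec['rate'] + transfer
        if runtime < best_time:
            best, best_time = name, runtime
    return best

# Example: a compute-heavy operator amortizes the transfer to the GPU,
# while a cheap one would stay where its inputs already live.
placement = place_operator(
    op_cost=1e9, input_bytes=4e8, resident_on='cpu',
    devices={'cpu': {'rate': 5e8, 'bandwidth': 1e10},
             'gpu': {'rate': 5e9, 'bandwidth': 1e10}})
```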

