Monitoring information processing efficiency after stroke or head injury: A comparison of four computerised tests for use in single case experiments

1992 ◽  
Vol 2 (2) ◽  
pp. 137-149 ◽  
Author(s):  
A. Sunderland ◽  
S. H. Curry ◽  
S. Das ◽  
P. M. Enderby ◽  
C. Kinsey ◽  
...  
1998 ◽  
Vol 11 (1) ◽  
pp. 43-49 ◽  
Author(s):  
Luigi Trojano ◽  
Dario Grossi

We report on a patient affected by selective drawing disabilities. The patient could correctly reproduce and draw simple geometric figures on request, but when he tried to reproduce more complex drawings or to draw common objects he performed very poorly. To identify the cognitive impairment in this patient, we adopted two test batteries based on recent information-processing models of drawing. Results showed that the patient’s drawing disabilities were independent of visuo-perceptual and executive impairments. These findings support recent cognitive models of drawing abilities: some intermediate stages of drawing exist at which information is processed to prepare and guide motor output, and which may be selectively disrupted after discrete cerebral lesions.


2002 ◽  
Vol 19 (1) ◽  
pp. 66-87
Author(s):  
David A. Yeigh

This study investigated the effects of perceived controllability on information processing within the attributional model of learning (Weiner, 1985, 1986). Attributional style was used to identify trait patterns of controllability for 37 university students. Task-relevant feedback was then manipulated to test for differences in working memory function between participants with high versus low levels of trait controllability. Trait controllability operated differently for high-trait and low-trait types. Results supported the hypothesis that it exerts a moderating effect on the way task-relevant feedback is processed. This selective encoding of information appeared to involve limitations inherent to the working memory system that affect processing efficiency, marking an important consideration for the way in which information is presented during the learning process.


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Alexander Schlegel ◽  
Hendrik Sebastian Birkel ◽  
Evi Hartmann

Purpose
The purpose of this study is to investigate how big data analytics capabilities (BDAC) enable the implementation of integrated business planning (IBP) – the advanced form of sales and operations planning (S&OP) – by counteracting the increasing information processing requirements.

Design/methodology/approach
The research model is grounded in organizational information processing theory (OIPT). An embedded single case study on a multinational agrochemical company with multiple geographically distinguished sub-units of analysis was conducted. Data were collected in workshops, semistructured interviews and direct observations, and enriched by secondary data from internal company sources as well as publicly available sources.

Findings
The results show the relevance of establishing BDAC within an organization to apply IBP by providing empirical evidence of BDA solutions in S&OP. The study highlights how BDAC increase an organization's information processing capacity and consequently enable efficient and effective S&OP. Practical guidance toward the development of tangible, human and intangible BDAC in a particular sequence is given.

Originality/value
This study is the first theoretically grounded, empirical investigation of S&OP implementation journeys under consideration of the impact of BDAC.


1999 ◽  
Vol 66 (3) ◽  
pp. 380-385 ◽  
Author(s):  
M Doder ◽  
M Jahanshahi ◽  
N Turjanski ◽  
I F Moseley ◽  
A J Lees

2012 ◽  
Vol 12 (5&6) ◽  
pp. 395-403
Author(s):  
Jan Bouda ◽  
Matej Pivoluska ◽  
Martin Plesch

The lack of perfect randomness can cause significant problems in securing communication between two parties. McInnes and Pinkas (1991) proved that unconditionally secure encryption is impossible when the key is sampled from a weak random source. The adversary can always gain some information about the plaintext, regardless of the cryptosystem design. Most notably, the adversary can obtain full information about the plaintext if he has access to just two bits of information about the source (irrespective of the length of the key). In this paper we show that for every weak random source there is a cryptosystem with a classical plaintext, a classical key, and a quantum ciphertext that bounds the adversary's probability $p$ of correctly guessing the plaintext strictly below the McInnes-Pinkas bound, except in a single case, where it coincides with the bound. In addition, regardless of the source of randomness, the adversary's probability $p$ is strictly smaller than $1$ as long as there is some uncertainty in the key (its Shannon/min-entropy is non-zero). These results are another demonstration that quantum information processing can solve cryptographic tasks with strictly higher security than classical information processing.
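The link between key uncertainty and guessing probability invoked above rests on a standard identity: for any distribution, the optimal single-guess success probability equals $2^{-H_{\min}}$, so the probability is strictly below $1$ exactly when the min-entropy is non-zero. A minimal sketch of that identity (the function names and the example weak-key distribution are illustrative, not from the paper):

```python
from math import log2

def min_entropy(dist):
    """Min-entropy H_min = -log2(max_x p(x)) of a probability distribution."""
    return -log2(max(dist))

def best_guess_probability(dist):
    """Optimal single-guess success probability, which equals 2^(-H_min)."""
    return max(dist)

# A biased ("weak") source over 2-bit keys: one key value is far more likely.
weak_key = [0.7, 0.1, 0.1, 0.1]

h = min_entropy(weak_key)            # about 0.515 bits: some uncertainty remains
p = best_guess_probability(weak_key) # 0.7, strictly smaller than 1
```

For a uniform 2-bit key the min-entropy is the full 2 bits and the best guess succeeds with probability 0.25; only a deterministic key (min-entropy zero) lets the adversary guess with certainty.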


2020 ◽  
pp. 0000-0000
Author(s):  
Xin Cheng ◽  
Feiqi Huang ◽  
Dan Palmon ◽  
Cheng Yin

This study investigates whether information processing efficiency has an impact on public companies' investment efficiency. Using the adoption of XBRL as an exogenous shock that decreases information processing cost, we find that companies improve their investment efficiency after the adoption of XBRL. The effect is more pronounced for: 1) firms that have inferior external monitoring; 2) firms that operate in more uncertain information environments; and 3) firms that have less readable financial reporting. In addition, we find a learning curve in investors' understanding of XBRL over time. After splitting firms into over-investment and under-investment groups, we conclude that the XBRL mandate is more likely to curb managers' opportunistic over-investments. Our study extends the XBRL literature by providing empirical evidence on the effects of XBRL adoption from the perspective of managers.

