gamelike features
Recently Published Documents


TOTAL DOCUMENTS: 6 (five years: 0)

H-INDEX: 3 (five years: 0)

2017
Author(s): Jim Alexander Lumsden, Andy Skinner, David Coyle, Natalia Lawrence, Marcus Robert Munafò

The prospect of assessing cognition longitudinally is attractive to researchers, health practitioners and pharmaceutical companies alike. However, such repeated-testing regimes place a considerable burden on participants, and because cognitive tasks are typically regarded as effortful and unengaging, these studies may suffer high levels of participant attrition. One potential solution is to gamify the tasks to make them more engaging, increasing participants' willingness to take part and reducing attrition. Any such approach, however, must balance task validity against the introduction of entertaining gamelike elements.

We investigated the effects of gamelike features on participant attrition using a between-subjects, longitudinal online testing study. We used three variants of a common cognitive task, the stop signal task, each with a single gamelike feature: one variant in which points were awarded for performing optimally, another in which the task was given a graphical theme, and a third variant, a standard stop signal task, which served as a control condition. Participants completed four compulsory test sessions over four consecutive days before entering a six-day voluntary testing period, during which they faced a daily decision to drop out or continue taking part. Participants were paid for each session they completed.

We saw no evidence for an effect of gamification on attrition, with participants dropping out of each variant at equal rates. Our findings raise doubts about the ability of gamification to increase engagement with cognitive testing studies.
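For readers unfamiliar with the paradigm being gamified: in a stop signal task, participants respond quickly to go stimuli but must withhold the response when a stop signal appears after a variable stop-signal delay (SSD), which is typically adjusted by a staircase. Below is a minimal Python sketch of such a trial schedule; the stop-trial proportion, delays, and staircase step are illustrative assumptions, not the parameters used in this study.

```python
import random

# Minimal sketch of a stop-signal trial schedule with a 1-up/1-down
# staircase on the stop-signal delay (SSD). All timings and proportions
# are illustrative assumptions, not taken from the paper.

GO_KEYS = {"left": "f", "right": "j"}
STOP_PROPORTION = 0.25   # assumed fraction of stop trials
SSD_STEP_MS = 50         # assumed staircase step
SSD_START_MS = 250       # assumed starting delay

def build_trials(n_trials: int) -> list[dict]:
    """Randomly intermix go and stop trials."""
    return [
        {
            "direction": random.choice(["left", "right"]),
            "is_stop": random.random() < STOP_PROPORTION,
        }
        for _ in range(n_trials)
    ]

def update_ssd(ssd_ms: int, stopped_successfully: bool) -> int:
    """Staircase: make stopping harder after a success, easier after a failure."""
    if stopped_successfully:
        return ssd_ms + SSD_STEP_MS
    return max(0, ssd_ms - SSD_STEP_MS)

if __name__ == "__main__":
    ssd = SSD_START_MS
    for trial in build_trials(20):
        if trial["is_stop"]:
            # In a real task the stop signal appears `ssd` ms after the go
            # stimulus; here we merely simulate the outcome.
            stopped = random.random() < 0.5
            ssd = update_ssd(ssd, stopped)
            print(f"stop trial  ssd={ssd}ms  stopped={stopped}")
        else:
            print(f"go trial    respond '{GO_KEYS[trial['direction']]}'")
```

The staircase keeps successful stopping near 50%, which is what lets the task estimate stopping ability; the gamelike variants described in the abstract change the presentation around this core loop, not the loop itself.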


PeerJ, 2016, Vol 4, pp. e2184
Author(s): Jim Lumsden, Andy Skinner, Andy T. Woods, Natalia S. Lawrence, Marcus Munafò

Computerised cognitive assessments are a vital tool in the behavioural sciences, but participants often view them as effortful and unengaging. One potential solution is to add gamelike elements to these tasks to make them more intrinsically enjoyable, and some researchers have posited that a more engaging task might produce higher-quality data. This assumption, however, remains largely untested. We investigated the effects of gamelike features and test location on the data and enjoyment ratings from a simple cognitive task. We tested three gamified variants of the Go/No-Go task, delivered both in the laboratory and online. In the first version of the task participants were rewarded with points for performing optimally. The second version of the task was framed as a cowboy shootout. The third version was a standard Go/No-Go task, used as a control condition. We compared reaction time, accuracy and subjective measures of enjoyment and engagement between task variants and study locations. We found points to be a highly suitable game mechanic for gamified cognitive testing: they did not disrupt the validity of the data collected, but they increased participant enjoyment. However, we found no evidence that gamelike features could increase engagement to the point where participant performance improved. We also found that while participants enjoyed the cowboy-themed task, the difficulty of categorising the gamelike stimuli adversely affected participant performance, increasing No-Go error rates by 28% compared to the non-game control. Responses collected online had slightly longer reaction times than those collected in the laboratory but were otherwise very similar, supporting other findings that online crowdsourcing is an acceptable method of data collection for this type of research.
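The points variant rewards "performing optimally" without altering the underlying stimuli, which is plausibly why it left data validity intact. One way such a scoring rule could look is sketched below in Python; the point values, go/no-go ratio, and response probabilities are assumptions for illustration, not the paper's actual design.

```python
import random

# Minimal sketch of a points-based scoring rule for a Go/No-Go block,
# of the kind the abstract describes ("points for performing optimally").
# All values here are illustrative assumptions.

def build_block(n_trials: int, p_go: float = 0.75) -> list[str]:
    """Generate a block with an assumed 75/25 go/no-go split."""
    return ["go" if random.random() < p_go else "no-go" for _ in range(n_trials)]

def score_trial(trial_type: str, responded: bool, rt_ms: float | None) -> int:
    """Reward fast correct responding; penalise misses and No-Go errors."""
    if trial_type == "go":
        if not responded:
            return -50                         # missed go stimulus
        return max(0, 100 - int(rt_ms // 10))  # faster response -> more points
    # no-go trial: reward withholding, penalise a commission error
    return 50 if not responded else -100

if __name__ == "__main__":
    total = 0
    for trial_type in build_block(10):
        # Simulate a participant who usually responds to go stimuli and
        # occasionally fails to withhold on no-go trials.
        responded = random.random() < (0.95 if trial_type == "go" else 0.3)
        rt = random.uniform(250, 600) if responded else None
        total += score_trial(trial_type, responded, rt)
    print("block score:", total)
```

Because the rule only maps existing behaviour onto a score, it leaves reaction times and error rates interpretable, unlike the cowboy theme, where harder-to-categorise stimuli changed the behaviour itself.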


2012, Vol 45 (2), pp. 301-318
Author(s): Guy E. Hawkins, Babette Rae, Keith V. Nesbitt, Scott D. Brown