The Hypothesis-Testing Game: A Training Tool for the Graduate Interviewing Skills Course

1988 ◽  
Vol 15 (3) ◽  
pp. 139-141 ◽  
Author(s):  
Kathryn M. Rickard ◽  
Robert W. Titley

This article describes an interviewing game used in a graduate-level interviewing skills course. The goal of the game is to teach basic components of the interviewing process, such as comfort with the interview, microcounseling skills, hypothesis generation, and hypothesis testing. The instructor plays the part of a client and is interviewed by two teams of students. The roles of students in the game and the game procedure are explained. Favorable student ratings and positive feedback have been received.

2017 ◽  
Author(s):  
Brian A. Nosek ◽  
Charles R. Ebersole ◽  
Alexander Carl DeHaven ◽  
David Thomas Mellor

Progress in science relies on generating hypotheses with existing observations and testing hypotheses with new observations. This distinction between postdiction and prediction is appreciated conceptually, but is not respected in practice. Mistaking the generation of postdictions for the testing of predictions reduces the credibility of research findings. However, ordinary biases in human reasoning, such as hindsight bias, make this mistake hard to avoid. An effective solution is to define the research questions and analysis plan before observing the research outcomes, a process called preregistration. A variety of practical strategies are available to make the best possible use of preregistration in circumstances that fall short of the ideal application, such as when the data are pre-existing. Services are now available for preregistration across all disciplines, facilitating a rapid increase in the practice. Widespread adoption of preregistration will sharpen the distinction between hypothesis generation and hypothesis testing and will improve the credibility of research findings.


Author(s):  
Frédéric Vallée-Tourangeau

The Wason 2-4-6 task was embedded in a practical reasoning scenario in which number sequences had well-defined utilities in the process of achieving a goal. Reasoners’ hypothesis-testing behavior was clearly goal-driven and was significantly influenced by whether the utilities favored positive or negative sequences. In the version of the scenario where generating positive sequences had greater benefits than generating negative ones, participants performed poorly at the task, as measured by their ability to guess the correct rule and by the nature and number of triples tested before announcing a rule. In contrast, the scenario that assigned greater utility to the production of negative sequences fostered significantly more diligent and creative hypothesis-testing behavior, and participants were significantly more likely to discover the rule. These results suggest that the poor performance observed in Wason’s traditional 2-4-6 task reflects a hypothesis-testing process that by default assigns greater utility to producing sequences that conform to the initial triple and hence receive positive feedback. However, reasoners are not averse to producing negative sequences, and they understand the implications of such sequences, when their utility is made relevant to achieving goals.
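
As a concrete illustration of the task structure (not part of the study's materials), the following minimal Python sketch shows the feedback logic of Wason's original 2-4-6 task, whose hidden rule is "any strictly ascending triple"; the sample positive and negative tests and all names are illustrative assumptions rather than the stimuli used in the experiment.

```python
# Minimal sketch of the Wason 2-4-6 task feedback logic (illustrative only;
# "any strictly ascending triple" is Wason's original hidden rule).

def conforms(triple):
    """Return True if the triple fits the hidden rule: strictly ascending numbers."""
    a, b, c = triple
    return a < b < c

# A reasoner who only generates "positive" tests (triples expected to conform,
# e.g. even numbers increasing by two) receives confirming feedback every time
# and never learns that the rule is broader than the initial hypothesis.
positive_tests = [(2, 4, 6), (8, 10, 12), (20, 22, 24)]

# A reasoner who also tries "negative" tests (triples expected to violate the
# hypothesised rule) can obtain disconfirming evidence about a narrow hypothesis:
# (1, 2, 3) still fits even though it does not ascend by two.
negative_tests = [(6, 4, 2), (1, 2, 3), (3, 3, 3)]

for triple in positive_tests + negative_tests:
    print(triple, "->", "fits the rule" if conforms(triple) else "does not fit")
```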


Author(s):  
P. V. Balachandran ◽  
J. M. Rondinelli

This chapter is aimed at readers interested in the topic of informatics-based approaches for accelerated materials discovery, but who are unfamiliar with the nuances of the underlying principles and various types of powerful mathematical tools that are involved in formulating structure–property relationships. In an attempt to simplify the workflow of materials informatics, we decompose the paradigm into several core subtasks: hypothesis generation, database construction, data pre-processing, mathematical modeling, model validation, and finally hypothesis testing. We discuss each task and provide illustrative case studies, which apply these methods to various functional ceramic materials.
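
To give a sense of how these subtasks fit together, here is a minimal, hypothetical Python sketch of such a workflow using synthetic data and scikit-learn; the descriptors, the model choice, and the data are illustrative assumptions, not the chapter's actual case studies or methods.

```python
# Hypothetical sketch of an informatics workflow: database construction,
# pre-processing, mathematical modeling, and model validation on synthetic data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Database construction (here: synthetic descriptors and a synthetic property;
# columns might stand for ionic radii, electronegativities, tolerance factors, etc.).
X = rng.normal(size=(200, 4))
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=200)

# Data pre-processing and mathematical modeling combined in one pipeline.
model = make_pipeline(StandardScaler(),
                      RandomForestRegressor(n_estimators=200, random_state=0))

# Model validation via cross-validation; predictions for unmeasured compositions
# would then serve as testable hypotheses for the final hypothesis-testing step.
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.3f}")
```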


Author(s):  
John H. Aldrich ◽  
James E. Alt ◽  
Arthur Lupia

This article describes the National Science Foundation's (NSF) initiative to close the gap between theory and methods. It also deals with the Empirical Implications of Theoretical Models (EITM) approach, currently understood as a way of thinking about causal inference in service of causal reasoning. Additionally, it explores the approach's origins and the various ways in which NSF's call to EITM action has been interpreted. It briefly explains why the EITM approach emerged, why it is valuable, and how it is currently understood. It then contends that EITM has been interpreted in multiple ways, emphasizes a subset of extant interpretations, and, in the process, offers views about the most constructive way forward. The idea of EITM is to bring deduction and induction, hypothesis generation and hypothesis testing, close together.


2019 ◽  
Vol 147 ◽  
Author(s):  
V. K. Morton ◽  
M. K. Thomas ◽  
N. Ciampa ◽  
J. Cutler ◽  
M. Hurst ◽  
...  

Investigations into an outbreak of foodborne disease attempt to identify the source of illness as quickly as possible. Population-based reference values for food consumption can assist in investigations by providing comparison data for hypothesis generation and by strengthening, through hypothesis testing, the evidence associated with a food product. In 2014–2015 a national phone survey was conducted in Canada to collect data on food consumption patterns using a 3- or 7-day recall period. The resulting food consumption values over the two recall periods were compared. The majority of food products showed no significant difference in consumption between the 3-day and 7-day recall periods. However, comparing reference values from the 3-day recall period with data from an investigation of a Salmonella Infantis outbreak supported the conclusion that chicken was the source of the outbreak, whereas the reference values from the 7-day recall did not. Reference values from multiple recall periods can assist in the hypothesis generation and hypothesis testing phases of foodborne outbreak investigations.
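
As an illustration of the hypothesis-testing step described here, the following Python sketch compares the proportion of outbreak cases reporting consumption of a food item against a population-based reference value using a one-sided binomial test; all numbers, variable names, and the choice of test are hypothetical assumptions for illustration, not the study's data or methods.

```python
# Hypothetical illustration: do outbreak cases report eating a food item more
# often than the population-based reference value would predict?
from scipy.stats import binomtest

cases_exposed = 18           # cases reporting chicken consumption (hypothetical)
cases_total = 20             # total cases interviewed (hypothetical)
reference_proportion = 0.65  # population proportion reporting chicken consumption
                             # over a 3-day recall (hypothetical reference value)

# One-sided binomial test: is case exposure higher than the reference value?
result = binomtest(cases_exposed, cases_total, p=reference_proportion,
                   alternative="greater")
print(f"observed: {cases_exposed / cases_total:.2f}, "
      f"reference: {reference_proportion:.2f}, p = {result.pvalue:.4f}")
```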


Author(s):  
Helena Kraemer

“As ye sow, so shall ye reap”: For almost 100 years, researchers have been taught that the be-all and end-all in data-based research is the p-value. The resulting problems have now generated concern, often among those of us who have long taught researchers exactly that. We must bear a major responsibility for the present situation and must alter our teaching. Although the Zhang and Hughes paper is titled “Beyond p-value”, its focus remains entirely on statistical hypothesis testing studies (HTS) and p-values (1). Instead, I would propose that there are three distinct, necessary, and important phases of research: 1) Hypothesis Generation Studies (HGS) or Exploratory Research (2-4); 2) Hypothesis Testing Studies (HTS); 3) Replication and Application of Results. Of these, HTS is undoubtedly the most important, but without HGS, HTS is often weak and wasteful, and without Replication and Application, the results of HTS are often misleading.

