Fifth International Conference on Social Science Methodology, Research Committee on Logic and Methodology (RC33), International Sociological Association (ISA), Köln, 3–6 October 2000

2000, Vol 52 (1), pp. 189-189
KWALON, 2004, Vol 9 (3)
Author(s): Paul ten Have

When I learned that the sixth International Conference on Social Science Methodology would be held in Amsterdam this year, it was obvious that I should take part. The previous two, in Essex and Cologne, had suited me reasonably well. And it seemed a good idea to ensure a certain qualitative presence here as well. 'Methodology' is, of course, a somewhat odd subject for qualitative researchers. Our way of working remains closely tied to the nature of the material used, the research question, in short, the specific circumstances in which we do research. Our work is far less embedded in a framework of fixed prescriptions than is the case with the quantitative variants. Nevertheless, I have always considered it important not to avoid methodology meetings, but rather to take part in them, to show that our research too can be the subject of serious methodological reflection. And if this holds for qualitative research in general, it also matters for ethnomethodology and conversation analysis. In Essex a number of sessions were devoted to it; in Cologne that had not worked out, so I tried again for Amsterdam.


Acta Politica, 2012, Vol 47 (4), pp. 472-474
Author(s): Adrie Dassen, Kostas Gemenis

2016, Vol 21 (2), pp. 136-147
Author(s): James Nicholson, Sean McCusker

This paper is a response to Gorard's article, ‘Damaging real lives through obstinacy: re-emphasising why significance testing is wrong’, in Sociological Research Online 21(1). For many years Gorard has criticised the way hypothesis tests are used in social science, but recently he has gone much further and argued that the logical basis for hypothesis testing is flawed: that hypothesis testing does not work, even when used properly. We have sympathy with the view that hypothesis testing is often carried out in social science contexts when it should not be, and that outcomes are often described in inappropriate terms, but this does not mean that the theory of hypothesis testing, or its use, is flawed per se. There needs to be evidence to support such a contention. Gorard claims that ‘Anyone knowing the problems, as described over one hundred years, who continues to teach, use or publish significance tests is acting unethically, and knowingly risking the damage that ensues.’ This is a very strong statement which impugns the integrity, not just the competence, of a large number of highly respected academics. We argue that the evidence he puts forward in this paper does not stand up to scrutiny: the paper misrepresents what hypothesis tests claim to do, and uses a sample size far too small to reliably discriminate a 10% difference in means in the simulation he constructs. He then claims that this simulates emotive contexts in which a 10% difference would be important to detect, implicitly misrepresenting the simulation as a reasonable model of those contexts.
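
As an illustration of the power issue raised in this abstract, here is a minimal simulation sketch. It is not drawn from either paper; the mean of 50, standard deviation of 15, the two group sizes, 5,000 replications, and a two-sided alpha of 0.05 are purely illustrative assumptions. It shows how the probability that a two-sample t-test detects a 10% difference in means depends on the number of observations per group:

```python
# Hypothetical sketch (not from Gorard or Nicholson & McCusker): estimate the power
# of a two-sample t-test to detect a 10% difference in means at two sample sizes.
# Assumed for illustration: normally distributed scores, mean 50, SD 15, alpha 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def estimated_power(n_per_group, mean=50.0, sd=15.0, lift=0.10,
                    alpha=0.05, n_sims=5000):
    """Fraction of simulated studies in which the t-test rejects the null."""
    rejections = 0
    for _ in range(n_sims):
        control = rng.normal(mean, sd, n_per_group)
        treatment = rng.normal(mean * (1 + lift), sd, n_per_group)
        _, p_value = stats.ttest_ind(control, treatment)
        if p_value < alpha:
            rejections += 1
    return rejections / n_sims

for n in (15, 150):
    print(f"n per group = {n:>3}: estimated power ~ {estimated_power(n):.2f}")
```

Under these assumed values the estimated power is roughly 0.15 with 15 observations per group but around 0.8 with 150 per group, which illustrates the point at issue: a simulation that is underpowered for the effect it is meant to detect says little about whether significance testing itself works.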


2017, Vol 23 (1), pp. 285-285

In 2016 and 2017, Sociological Research Online published the following article and two subsequent responses:

Gorard S (2016) Damaging Real Lives Through Obstinacy: Re-emphasising Why Significance Testing is Wrong. Sociological Research Online 21(1): 1–14. DOI: 10.5153/sro.3857

Nicholson J and McCusker S (2016) Damaging the Case for Improving Social Science Methodology Through Misrepresentation: Re-asserting Confidence in Hypothesis Testing as a Valid Scientific Process. Sociological Research Online 21(2): 1–12. DOI: 10.5153/sro.3985

Gorard S (2017) Significance Testing is Still Wrong, and Damages Real Lives: A Brief Reply to Spreckelsen and Van Der Horst, and Nicholson and McCusker. Sociological Research Online 22(2): 1–7. DOI: 10.5153/sro.4281

An erratum has been published in the journal to clarify some corrections that had inadvertently been missed ahead of publication of the first article:

Erratum to Gorard (2016) Damaging Real Lives Through Obstinacy: Re-emphasising Why Significance Testing is Wrong. Sociological Research Online 21(1): 1–14. DOI: 10.1177/1360780417731066

Readers are advised to read the responses to the original article, particularly paragraph 4.7 in Nicholson and McCusker (2016) and paragraphs 3.1 and 3.2 in Gorard (2017), in light of the recently published Erratum. The journal apologises for any inconvenience or misunderstanding this may have caused.


2018, Vol 26 (3), pp. 338-344
Author(s): David A. M. Peterson

In this comment on Dion, Sumner, and Mitchell’s article “Gendered Citation Patterns across Political Science and Social Science Methodology Fields,” I explore the role of changes in the disparities of citations to work written by women over time. Breaking down their citation data by era, I find that some of the patterns in citations are the result of the legacy of disparity in the field. Citations to more recent work come closer to matching the distribution of the gender of authors of published work. Although the need for more equitable practices of citation remains, the overall patterns are not quite as bad as Dion, Sumner, and Mitchell conclude.

