Theoretical and Methodological Difficulties in Research on Armed Conflicts in the Social Sciences: The Example of Colombia

Author(s):  
Karolina Wereta

The article reflects on the practical, theoretical, and methodological problems faced by social science researchers investigating armed conflicts. Drawing on Colombia's ongoing internal war, the paper discusses difficulties in data collection (including source selection and information extraction), data sharing, and the limitations of the existing theoretical language. It also explores related factors such as the fragmentation of sociological research and the growth of publications on the conflict, violence, and the peace processes initiated in Colombia. The author analyzes both governmental and non-governmental reports and outlines several key cultural and political issues affecting data collection and its subsequent publication in Colombia.

Author(s):  
Ellen Winner

This book is an examination of what psychologists have discovered about how art works—what it does to us, how we experience art, how we react to it emotionally, how we judge it, and what we learn from it. The questions investigated include the following: What makes us call something art? Do we experience “real” emotions from the arts? Do aesthetic judgments have any objective truth value? Does learning to play music raise a child’s IQ? Is modern art something my kid could do? Is achieving greatness in an art form just a matter of hard work? Philosophers have grappled with these questions for centuries, and laypeople have often puzzled about them too and offered their own views. But now psychologists have begun to explore these questions empirically, and have made many fascinating discoveries using the methods of social science (interviews, experimentation, data collection, statistical analysis).


2004 ◽  
Vol 32 (1) ◽  
pp. 17-23 ◽  
Author(s):  
Karin Helweg-Larsen ◽  
Ashraf Hasan Abdel-Jabbar Al-Qadi ◽  
Jalal Al-Jabriri ◽  
Henrik Brønnum-Hansen

2011 ◽  
Vol 403-408 ◽  
pp. 1491-1494
Author(s):  
Wei Yu

The questionnaire survey is one of the most popular methods in social science research. The survey process consists of three phases: data collection, data analysis, and data presentation. In current practice, the execution of these phases remains underdeveloped, and the methods used for data analysis and statistics are still rudimentary. This paper presents the design of a Survey Analysis System based on data visualization, which not only supports conducting surveys over the internet but also brings data visualization into the analysis and presentation phases, helping users obtain and manipulate data visually and conveniently.
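The abstract does not specify how the system is implemented; as a loose, stdlib-only illustration of the analysis-and-presentation phases it describes, the following Python sketch tallies hypothetical Likert-scale questionnaire responses and renders them as a text bar chart (all data and names here are invented for the example):

```python
from collections import Counter

# Hypothetical Likert-scale responses (1 = strongly disagree ... 5 = strongly agree)
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 1, 4, 5, 3, 4]

counts = Counter(responses)
total = len(responses)

def bar(scale, width=20):
    """Render one row of a text bar chart for a given scale point."""
    share = counts.get(scale, 0) / total
    return f"{scale}: {'#' * round(share * width):<{width}} {share:5.1%}"

# One row per scale point, 1 through 5
chart = "\n".join(bar(s) for s in range(1, 6))
print(chart)
```

A real system like the one described would presumably replace the text chart with interactive graphics, but the tally-then-render pipeline is the same idea.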


2016 ◽  
Vol 21 (3) ◽  
pp. 95-105
Author(s):  
Thees F Spreckelsen ◽  
Mariska Van Der Horst

Significance testing is widely used in social science research. It has long been criticised both on statistical grounds and for problems in research practice. This paper is an applied researchers’ response to Gorard's (2016) ‘Damaging real lives through obstinacy: re-emphasising why significance testing is wrong’ in Sociological Research Online 21(1). Gorard concludes from the issues raised that the use and teaching of significance testing should cease immediately. In doing so, he goes beyond a mere ban on significance testing and claims that researchers who continue the practice are acting unethically. We argue that his attack on applied scientists is unlikely to improve social science research, and we believe he does not sufficiently substantiate his claims. In particular, we are concerned that, with a narrow focus on statistical significance, Gorard misses alternative, if not more important, explanations for the often-lamented problems in social science research. Instead, we argue that it is important to take into account the full research process, not just the data analysis step, to get a better idea of the best evidence regarding a hypothesis.
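One of the long-standing statistical criticisms alluded to in this debate is that, with a large enough sample, even a practically negligible difference becomes “statistically significant”. As a hedged illustration (not drawn from any of the papers themselves), the following Python sketch simulates two groups whose true means differ by only 0.02 standard deviations; a t-test on a million observations per group still yields a tiny p-value, while the standardised effect size remains negligible:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000

# Two simulated groups whose true means differ by a trivial 0.02 SD
a = rng.normal(0.00, 1.0, n)
b = rng.normal(0.02, 1.0, n)

t, p = stats.ttest_ind(a, b)

# Cohen's d: standardised mean difference, a conventional effect-size measure
d = (b.mean() - a.mean()) / np.sqrt((a.var() + b.var()) / 2)
print(f"p = {p:.2e}, Cohen's d = {d:.3f}")  # highly 'significant', trivially small effect
```

This is why critics on both sides of the exchange stress reporting effect sizes alongside (or instead of) p-values.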


2016 ◽  
Vol 21 (2) ◽  
pp. 136-147 ◽  
Author(s):  
James Nicholson ◽  
Sean McCusker

This paper is a response to Gorard's article, ‘Damaging real lives through obstinacy: re-emphasising why significance testing is wrong’ in Sociological Research Online 21(1). For many years Gorard has criticised the way hypothesis tests are used in social science, but recently he has gone much further and argued that the logical basis for hypothesis testing is flawed: that hypothesis testing does not work, even when used properly. We have sympathy with the view that hypothesis testing is often carried out in social science contexts when it should not be, and that outcomes are often described in inappropriate terms, but this does not mean the theory of hypothesis testing, or its use, is flawed per se. There needs to be evidence to support such a contention. Gorard claims that: ‘Anyone knowing the problems, as described over one hundred years, who continues to teach, use or publish significance tests is acting unethically, and knowingly risking the damage that ensues.’ This is a very strong statement which impugns the integrity, not just the competence, of a large number of highly respected academics. We argue that the evidence he puts forward in this paper does not stand up to scrutiny: that the paper misrepresents what hypothesis tests claim to do, and uses a sample size which is far too small to properly detect a 10% difference in means in a simulation he constructs. He then claims that this simulates emotive contexts in which a 10% difference would be important to detect, implicitly misrepresenting the simulation as a reasonable model of those contexts.
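The sample-size objection can be checked directly. As an illustrative sketch (the parameter values below are invented for the example, not taken from Gorard's actual simulation), the Monte Carlo estimate shows that a two-sample t-test with a small per-group n detects a 10% difference in means (100 vs. 110, SD 30) only rarely, whereas a tenfold larger sample detects it most of the time:

```python
import numpy as np
from scipy import stats

def power_sim(n, mu1=100.0, mu2=110.0, sd=30.0, reps=1000, seed=42):
    """Monte Carlo estimate of the power of a two-sample t-test (alpha = 0.05)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        a = rng.normal(mu1, sd, n)
        b = rng.normal(mu2, sd, n)
        if stats.ttest_ind(a, b).pvalue < 0.05:
            hits += 1
    return hits / reps

print(power_sim(15))   # low power: a 10% difference is usually missed at n = 15 per group
print(power_sim(150))  # much higher power at n = 150 per group
```

A simulation run at the smaller sample size will therefore mostly fail to flag the difference, regardless of how important that difference would be in the real-world context being modelled, which is the crux of the authors' objection.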


2017 ◽  
Vol 23 (1) ◽  
pp. 285-285

In 2016 and 2017, Sociological Research Online published the following article and two subsequent responses:

Gorard S (2016) Damaging Real Lives Through Obstinacy: Re-emphasising Why Significance Testing is Wrong. Sociological Research Online 21(1): 1–14. DOI: 10.5153/sro.3857

Nicholson J and McCusker S (2016) Damaging the Case for Improving Social Science Methodology Through Misrepresentation: Re-asserting Confidence in Hypothesis Testing as a Valid Scientific Process. Sociological Research Online 21(2): 1–12. DOI: 10.5153/sro.3985

Gorard S (2017) Significance Testing is Still Wrong, and Damages Real Lives: A Brief Reply to Spreckelsen and Van Der Horst, and Nicholson and McCusker. Sociological Research Online 22(2): 1–7. DOI: 10.5153/sro.4281

An erratum has been published in the journal to clarify some corrections that had inadvertently been missed ahead of publication of the first article:

Erratum to Gorard (2016) Damaging Real Lives Through Obstinacy: Re-emphasising Why Significance Testing is Wrong. Sociological Research Online 21(1): 1–14. DOI: 10.1177/1360780417731066

Readers are advised to read the responses to the original article, particularly paragraph 4.7 in Nicholson and McCusker (2016) and paragraphs 3.1 and 3.2 in Gorard (2017), in light of the recently published erratum. The journal apologises for any inconvenience or misunderstanding this may have caused.


2021 ◽  
Vol 11 (1) ◽  
pp. 19-28
Author(s):  
Lars Bo Henriksen

In this paper I investigate the problems of data collection, data analysis, and the final communication of results when doing social science research that we ourselves are part of. Central to this are the concepts of the life-world, language games, and stories and narratives. How do we collect stories and narratives in the field, and how do we construct scientific narratives that are both reliable and valid? How do we, as researchers, present a newly constructed narrative to a—hopefully—interested audience? That is, how do you, as a consumer of scientific narratives, read what I have written? Finally, I discuss the problem of handing research results back to the people we are doing research with. All of this is done within the framework of a pragmatic constructivist paradigm.


2021 ◽  
Author(s):  
Justin Reich

Preregistration and registered reports are two promising open science practices for increasing transparency in the scientific process. In particular, they create transparency around one of the most consequential distinctions in research design: the analytic decisions made before data collection versus the post-hoc decisions made afterwards. Preregistration involves publishing a time-stamped record of a study design before data collection or analysis. Registered reports are a publishing approach that facilitates the evaluation of research without regard for the direction or magnitude of findings. In this paper, I evaluate opportunities and challenges for these open science methods, offer initial guidelines for their use, explore relevant tensions around new practices, and illustrate examples from educational psychology and social science. This paper was accepted for publication in Educational Psychologist, volume 56, issue 2, scheduled for April 2021, as part of a special issue titled “Educational psychology in the open science era.” This preprint has been peer reviewed, but not copy edited by the journal, and may differ from the final published version. The DOI of the final published version is: [insert preprint DOI number]. Once the article is published online, it will be available at the following permanent link: [insert doi link]

