Participant recruitment and data collection through Facebook: the role of personality factors

2017 ◽  
Author(s):  
Sean Chandler Rife ◽  
Kelly L. Cate ◽  
Michal Kosinski ◽  
David Stillwell

As participant recruitment and data collection over the Internet have become more common, numerous observers have expressed concern regarding the validity of research conducted in this fashion. One increasingly popular method of conducting research over the Internet involves recruiting participants and administering questionnaires through Facebook, the world’s largest social networking service. If Facebook is to be considered a viable platform for social research, it is necessary to demonstrate that Facebook users are sufficiently heterogeneous and that research conducted through Facebook is likely to produce results that can be generalized to a larger population. The present study examines these questions by comparing demographic and personality data collected through Facebook with data collected through a standalone website and data collected from college undergraduates at two universities. Results indicate that statistically significant differences exist between the Facebook data and the comparison datasets, but because 80% of the analyses yielded partial η² < .05, these differences are small in magnitude and of limited practical significance. We conclude that Facebook is a viable research platform, and that recruiting Facebook users for research purposes is a promising avenue that offers numerous advantages over traditional samples.
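For context, partial eta squared is the effect-size measure behind that threshold. As an aside not drawn from the paper itself, the standard definition is

\[ \eta_p^2 = \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}} \]

so a value below .05 means the sample source (Facebook versus a comparison sample) accounts for less than 5% of the variance in a given outcome after other effects are partialled out, which sits below the commonly cited benchmark of roughly .06 for a medium effect.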

2016 ◽  
Vol 14 (2) ◽  
pp. 139-151 ◽  
Author(s):  
Sara Mannheimer ◽  
Scott W.H. Young ◽  
Doralyn Rossmann

Purpose: In this paper, faculty librarians at an academic institution explore the ethical dimensions of conducting research with user-generated social networking service (SNS) data. In an effort to guide librarian-researchers, the paper first offers a background discussion of privacy ethics across disciplines and then proposes a library-specific ethical framework for conducting SNS research.
Design/methodology/approach: By surveying the literature of other disciplines, three key considerations are identified that can inform ethical practice in the field of library science: context, expectation, and value analysis. For each of these considerations, the framework is tailored to ethical issues as they relate to libraries and to our practice as librarian-researchers.
Findings: The unique role of the librarian-researcher demands an ethical framework specific to that practice. This paper proposes such a framework.
Practical implications: Librarian-researchers are at a unique point in our history. In exploring SNSs as a source of data to conduct research and improve services, we are challenged by the conflicting and equally cherished values of patron privacy and information access. By evaluating research according to context, expectation, and value, this framework provides an ethical path forward for research using SNS data.
Originality/value: As of this paper’s publication, there is no existing ethical framework for conducting SNS research in libraries. The proposed framework is informed both by library values and by broader research values, and therefore provides unique guidelines for the librarian-researcher.


2021 ◽  
Vol 16 (1) ◽  
pp. 147-161
Author(s):  
Chandra Mandal Pratap

Companies cannot make decisions without proper information, so marketers need up-to-date information about the target market. They obtain it both through traditional data collection methods and through online marketing research. The study discusses the various aspects of conducting online marketing research, the role of the internet in online marketing research, the strategies companies follow for online marketing research, the ways in which companies communicate and act on the information generated from the research, and the advantages and disadvantages of conducting online marketing research. The study suggests that companies should conduct online marketing research not only to collect information about customers but also to utilize the insights generated to formulate better strategies and to develop better relationships with customers.


Author(s):  
Valerio Onida

The intervention starts from the observation that technology is a tool for the transformation of reality that is in itself “neutral”, i.e. usable for different and even opposite purposes, whereas law is a tool to orient, condition and govern human behaviour in relation to social ends, that is, to what is right or better for society. Hence the intrinsic “finalism” of law, and the fundamental difference between the “power” of technology and juridical power, which is exercised for social purposes and to settle conflicts of interest between individuals and between communities. The intervention then examines the potential and the limits of using computer techniques to perform legal acts; the new role of law in the face of the growth of technological “powers” of data collection and use; the need to adapt the regulation of relationships between individuals (such as labour relationships) to technological changes in reality; the problems of “relocation” of the law arising from the development of the Internet; and the new demands on legal powers to regulate phenomena such as the genetic manipulation of human beings and the use of artificial intelligence, with a view to safeguarding the essence of being “human”.


2022 ◽  
pp. 88-111
Author(s):  
Sergio Mauceri ◽  
Maria Paola Faggiano ◽  
Luca Di Censi

The authors reconstruct the system of advantages and limits of e-mail data collection and the web survey technique in social research. For this purpose, they examine in detail a set of studies that stimulate multiple reflections, both on the overall value of survey research and on the role of the web in the social sciences. The subject of each selected research design is a complex social problem that involves the internet as both the focus of observation and a tool for research: voting intentions, the social effects of the pandemic, the quality of university life, and technology addiction. In each research experience, for different reasons (above all the lack of a single, self-sufficient data collection mode), the authors favor the integration of research strategies: 1) mixed modes of data collection, 2) follow-up panel web surveys, 3) mixed methods research, 4) the introduction of a preliminary pilot study, and 5) multilevel surveys.


2005 ◽  
Vol 10 (1) ◽  
pp. 115-130 ◽  
Author(s):  
Cha Yeow Siah

The speed, ease and low cost of conducting an internet-based study have attracted an increasingly large number of researchers to the medium for data collection. The lure of conducting research on the internet warrants heightened awareness of the practical problems one may encounter in the course of design and data collection. Researchers should also be attuned to the various threats to reliability and validity that may affect the quality of their data. This article surveys the past literature and identifies four main areas of concern in internet-based research: (1) sampling error and generalizability; (2) subject fraud; (3) measurement errors resulting from extraneous factors; and (4) the ethics of conducting research on the internet. Before carrying out research on the internet, researchers should carefully weigh the sometimes hidden costs against the obvious benefits to consider whether the results obtained will be seriously compromised by the problems that currently exist with this relatively new medium. A more productive approach, however, recognizes that this research method is here to stay, and that greater attention therefore needs to be given to refining it and clearing the hurdles that internet-based researchers currently face.


AI & Society ◽  
2020 ◽  
Author(s):  
Nicolas Malevé

Computer vision aims to produce an understanding of digital images’ content and to generate or transform images through software. Today, a significant proportion of computer vision algorithms rely on machine learning techniques that require large amounts of data assembled into collections, or data sets. To build these data sets, a large population of precarious workers labels and classifies photographs around the clock at high speed. For computers to learn how to see, a scale articulates macro and micro dimensions: the millions of images culled from the internet and the few milliseconds given to workers to perform a task for which they are paid a few cents. This paper engages in detail with the production of this scale and the labour it relies on: its elaboration. This elaboration does not only require hands and retinas; it also crucially mobilises the photographic apparatus. To understand the specific character of the scale created by computer vision scientists, the paper compares it with a previous enterprise of scaling, Malraux’s Le Musée Imaginaire, where photography was used as a device to undo the boundaries of the museum’s collection and open it to unlimited access to the world’s visual production. Drawing on Douglas Crimp’s argument that the “musée imaginaire”, a hyperbole of the museum, relied simultaneously on the active role of the photographic apparatus for its existence and on its negation, the paper identifies a similar problem in computer vision’s understanding of photography. The double dismissal of the role played by the workers and of the agency of the photographic apparatus in the elaboration of computer vision foregrounds the inherent fragility of the edifice of machine vision and calls for a necessary rethinking of its scale.


2004 ◽  
Vol 37 (4) ◽  
pp. 555-564 ◽  
Author(s):  
Michele Cianci ◽  
John R. Helliwell ◽  
David Moorcroft ◽  
Andrzej Olczak ◽  
James Raftery ◽  
...  

Synchrotron radiation beamlines offer automated data collection with faster and larger detectors, a choice of wavelength(s), intense beams and fine collimation. An increasing output of protein crystal structures sustains an interest in streamlining data collection protocols. Thus, more and more investigators are looking into the use of the anomalous signal from sulfur to obtain initial phase information for medium-size proteins. This type of experiment ideally requires the use of synchrotron radiation, softer X-rays and cryocooling of the sample. Here the results are reported of an investigation into locating the weak, i.e. sulfur, anomalous scatterers in lysozyme using rotating-anode or synchrotron radiation data recorded at room temperature. It was indeed possible to locate the sulfur atoms from a lysozyme crystal at room temperature. Accurate selection of images during scaling was needed where radiation damage effects were detected. Most interestingly, comparisons are provided of high-redundancy data sets recorded with synchrotron radiation at λ = 2.0 and 1.488 Å, and with Cu Kα and Mo Kα radiation. Apocrustacyanin A1 was also investigated; from the results of a very high redundancy data collection using softer synchrotron X-rays and a cryo-cooled crystal, it was possible to find the sulfur atoms.
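As a rough guide to why softer X-rays help with sulfur phasing (a textbook rule-of-thumb estimate, not taken from this study), the expected Bijvoet ratio can be approximated as

\[ \frac{\langle \Delta F_{\pm} \rangle}{\langle F \rangle} \approx \sqrt{\frac{N_A}{2\,N_P}} \cdot \frac{2\,f''_A}{Z_{\mathrm{eff}}} \]

where N_A is the number of anomalous scatterers (the sulfur atoms of lysozyme, ten in hen egg-white lysozyme), N_P the number of non-hydrogen protein atoms (roughly 1000 for lysozyme), f''_A the imaginary component of the sulfur scattering factor at the chosen wavelength, and Z_eff ≈ 6.7 the effective atomic number of an average protein atom. Because f'' of sulfur increases with wavelength on the short-wavelength side of the sulfur K-edge, moving from Cu Kα or 1.488 Å to λ = 2.0 Å roughly doubles the expected anomalous signal, from about 1% to about 2%, which is why high redundancy and careful treatment of radiation damage remain essential.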


2021 ◽  
Vol 2 (2) ◽  
pp. 34-41
Author(s):  
Rizky Ariyanto

This study aims to: (1) describe and analyze the background and life history of Sultan Mahmud Badaruddin II; (2) describe and analyze the Palembang War of 1819-1821; (3) describe and analyze the role of Sultan Mahmud Badaruddin II in the Palembang War of 1819-1821; and (4) describe and analyze the impact of the Palembang War of 1819-1821. This research uses the literature-study method. Data were collected from written sources through literature studies of books, journals, theses and the internet. The research steps are as follows: heuristics, source criticism, interpretation, and historiography. The results of this study indicate that: (1) Sultan Mahmud Badaruddin II was a sultan of the Palembang Darussalam Sultanate who was wise in carrying out his leadership; (2) the Palembang War of 1819-1821 was divided into three periods, two in 1819 and one in 1821; (3) Sultan Mahmud Badaruddin II resisted the Dutch, who were far superior in weaponry, and was able to win the Palembang War twice in 1819; (4) the impacts of the war included the blockade of the Sunsang estuary and the abolition of the sultanate, which was replaced by a residency.


Author(s):  
Rizkan Zulyadi

The main issue in this paper is the judge's role in court in eradicating corruption according to Law Number 20 of 2001 (a study of Decision 16/PID.SUS.K/2011/PN.MDN). This type of research is normative (normative juridical) or library legal research, which can be interpreted as legal research that examines library materials and secondary materials. The nature of this study is descriptive analytical. The research was carried out at the Medan District Court, located at Court Road No. 8 Medan, Medan City, North Sumatera Province, starting in December 2018 until completion. The data collection techniques used in writing this essay are library research techniques, assisted by electronic media, namely the internet; the method of data analysis used by the author is a normative legal approach that examines secondary data. The results show that the role of the judge in the effort to eradicate corruption at the Medan District Court, as reflected in Decision 16/PID.SUS.K/2011/PN.MDN, is to try corruption cases and impose penalties, in this case imprisonment for 2 (two) years and 6 (six) months and a fine of Rp. 50,000,000 (fifty million rupiahs), with the provision that if the fine is not paid it is replaced by imprisonment for 2 (two) months.

