methodological issue
Recently Published Documents


TOTAL DOCUMENTS: 257 (five years: 87)
H-INDEX: 19 (five years: 4)

2022, Vol 23 (1)
Author(s): James J. Yang, Xi Luo, Elisa M. Trucco, Anne Buu

Abstract
Background/aim: The polygenic risk score (PRS) shows promise as a potentially effective approach to summarizing genetic risk for complex diseases such as alcohol use disorder, which is influenced by a combination of multiple variants, each of which has a very small effect. Yet conventional PRS methods tend to over-adjust for confounding factors in the discovery sample and thus have low power to predict the phenotype in the target sample. This study aims to address this important methodological issue.
Methods: This study proposed a new method to construct the PRS by (1) approximating the polygenic model using a few principal components, selected based on eigen-correlation in the discovery data; and (2) conducting principal component projection on the target data. Secondary data analysis was conducted on two large-scale databases, the Study of Addiction: Genetics and Environment (SAGE; discovery data) and the National Longitudinal Study of Adolescent to Adult Health (Add Health; target data), to compare the performance of the conventional and proposed methods.
Results and conclusion: The results show that the proposed method has higher predictive power and can handle participants from different ancestry backgrounds. We also provide practical recommendations for setting the linkage disequilibrium (LD) and p-value thresholds.
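The two-step construction described in the Methods (principal-component approximation in the discovery data, then PC projection onto the target data) can be sketched roughly as follows. This is only a toy illustration on synthetic genotypes: in particular, the eigen-correlation selection rule is approximated here by ranking PCs on their correlation with the discovery phenotype, which is an assumption for illustration rather than the authors' exact criterion.

```python
# Toy sketch of a PC-projection polygenic score (synthetic data throughout).
import numpy as np

rng = np.random.default_rng(0)
n_disc, n_targ, n_snp = 200, 100, 50

G_disc = rng.integers(0, 3, (n_disc, n_snp)).astype(float)   # discovery genotypes (0/1/2)
beta_true = rng.normal(0, 0.1, n_snp)
y_disc = G_disc @ beta_true + rng.normal(0, 1, n_disc)       # discovery phenotype
G_targ = rng.integers(0, 3, (n_targ, n_snp)).astype(float)   # target genotypes

# (1) PCA on the centered discovery genotype matrix
mu = G_disc.mean(axis=0)
Xc = G_disc - mu
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores_disc = Xc @ Vt.T                                      # PC scores in discovery

# keep the few PCs most correlated with the phenotype (stand-in for the
# paper's eigen-correlation rule)
corr = np.array([abs(np.corrcoef(scores_disc[:, k], y_disc)[0, 1])
                 for k in range(len(s))])
keep = np.argsort(corr)[::-1][:5]

# regress the phenotype on the selected PCs (ordinary least squares)
A = scores_disc[:, keep]
w, *_ = np.linalg.lstsq(A, y_disc, rcond=None)

# (2) project the target genotypes onto the same PCs to obtain the score
prs_target = (G_targ - mu) @ Vt.T[:, keep] @ w
```

Because the target sample is only projected onto components learned in the discovery data, no phenotype information from the target sample is used, which is the property the projection step relies on.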


2021, Vol 0 (0)
Author(s): Rafael González-Val, Javier Silvestre

Abstract This paper examines the effect of the Spanish Civil War (1936–1939) shock on city shares of population, applying the methodology proposed by Davis, D. R., and D. E. Weinstein. 2002. “Bones, Bombs, and Break Points: The Geography of Economic Activity.” The American Economic Review 92 (5): 1269–89. We make use of a previously unexploited long-term historical dataset of populations disaggregated at the city level. Our instruments, a key methodological issue, are based on data on the dead and wounded collected by historians. We show that the effect of the Spanish Civil War on capital cities was temporary, and argue that the locational fundamentals theory is the principal explanation.


2021, pp. 88-112
Author(s): Cathal O'Donoghue

One of the most significant determinants of the level of redistribution, or the capacity to change inequality, within a tax-benefit system is the structure of the taxation system. In this chapter, we add income taxation and social-insurance contributions to the analysis of social transfers in the previous chapter. The chapter describes the theoretical structure of personal income taxes and introduces the concept of joint taxation. It also addresses a methodological issue common to many microsimulation models and the creation of their base datasets, namely the inversion of data from net to gross. From a validation point of view, concepts associated with using external validation sources are introduced. From a measurement point of view, measures that aim to quantify the degree of progressivity and redistribution in tax systems are described. The chapter concludes with a redistributive analysis of a theoretical tax system and of the implications of a joint taxation system.
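The net-to-gross inversion mentioned above is, computationally, a root-finding problem: given an observed net income and a tax-benefit function, solve for the gross income that produces it. A minimal sketch, assuming a made-up flat-rate-with-allowance schedule (not the chapter's actual tax system), using bisection because net income is monotone increasing in gross income:

```python
def net_income(gross, rate=0.3, allowance=10_000.0):
    """Stylised tax: nothing due on the first `allowance`, a flat rate above it."""
    tax = max(0.0, gross - allowance) * rate
    return gross - tax

def gross_from_net(net, lo=0.0, hi=1e7, tol=1e-6):
    """Invert net_income by bisection (valid because it is monotone in gross)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if net_income(mid) < net:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

g = gross_from_net(38_000.0)   # round trip: net_income(g) recovers 38,000
```

Real microsimulation systems invert far more complicated schedules (with benefits, contributions, and kinks), but as long as the net function stays monotone the same bracketing idea applies.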


2021, pp. 073112142110465
Author(s): Claude Fischer, Xavier Durham

Deciding whether Americans have become less involved in group life entails a methodological issue: does the standard question about the associations to which respondents belong, asked for decades by the General Social Survey (GSS) and many others, miss newer and more diverse forms of group involvement? Following on Paxton and Rap, we mine a recent panel survey, UCNets, that provides several different means of allowing respondents to describe their group involvement. We observe more, and much more varied, kinds of group involvement than those elicited by the last GSS administration of the standard question in 2004. (Analyses of a few additional surveys, in the Supplement, confirm this diversity.) These results lead to suggestions for how to better measure involvement in groups, in particular by being more sensitive to the many axes of difference in the general population. The results have implications for the larger debate as well.


Author(s): Petro M. Rabinovych, Serhii P. Rabinovych, Oleh Z. Pankevych

The relevance of the study stems from the pluralisation of the ideological, philosophical, and methodological foundations of legal science, and from attempts, against this background, to theoretically overcome the competition between “positivist” and “natural” approaches to understanding law within an integrative legal understanding. The purpose of the study is to identify the epistemological difficulties in constructing integral concepts of legal understanding, to suggest solutions to them, and to justify a variant of integrative understanding of law based on a combination of dialectical and need-based methodological approaches. Main research methods: based on dialectical logic, the essence of integrative legal understanding is presented as an attempt to synthesise contradictory approaches to understanding law; the process of integrating legal understanding is interpreted as the removal of contradictions in the development of legal phenomena; and integration appears as the inclusion of individual moments of such development in a dynamic whole. Based on the need-based approach, the study justifies a criterion for understanding certain phenomena as legal. Importance of the present study: it is proved that integrating different legal understandings is a task that can be accomplished on the basis of dialectical rather than formal logic, while preserving the differences and contradictions between the combined conceptual elements. The study shows that, in the course of satisfying needs, the properties of certain phenomena are integrated into human existence, acquiring the status of vital, and therefore normatively significant, components of that existence. The rule of law thus becomes the result of the activity-practical integration of the phenomena serving as necessary components of human life in society.


Author(s): K. Karimov, G. Karimov

The work is devoted to methodological problems of the basic computer training of bachelors, with a focus on the use of unconventional methods and forms of education. Unconventional methods and forms of the basic computer training of future bachelors are shaped by a number of factors, chief among them today the introduction of a competence-based approach in higher education and the transition to a blended (full-time/distance) form of education. In the process of basic computer training, the information, computer, and technological competences of bachelors of technical specialties can be formed using traditional methods and forms of training. For the formation of procedural-activity competence, unconventional approaches are proposed, based on the use of quasi-professional tasks and on orienting future bachelors towards such general competences as the capacity for abstract thinking, analysis, and synthesis; the ability to make informed decisions; and the ability to apply knowledge in practical situations. A corresponding structure for the main discipline of basic computer training is proposed, with two modules, the main topics of each module, and their approximate volumes. To ensure readiness for the educational process in emergency situations, it is advisable to use the adaptive full-distance learning scheme (AFDLS). For the basic computer training of higher-education students, the main methodological issue is determining rational proportions for combining classroom and distance work within particular types of educational work, and establishing criteria and procedures for the transition from one form of education to the other.
To make it possible to change the calendar plan promptly, and to eliminate the need for independent study of complex material, it is proposed that the working programs of computer disciplines be divided into blocks of as small a volume as possible that are only loosely connected with each other.


PLoS ONE, 2021, Vol 16 (8), pp. e0255695
Author(s): Volker Krutsch, Werner Krutsch, Jonas Härtl, Hendrik Bloch, Volker Alt, ...

Background: Video analysis is one of the most commonly applied methods for analysing football injuries.
Purpose: The objective of this study was to assess the accuracy of video analysis for recording head injuries in professional football, based on official matches in the four highest men’s professional football leagues in Germany.
Methods: In this cohort study, head injuries detected by means of video analysis of all official matches over one season (2017–18) were compared with head injuries registered with the German statutory accident insurance.
Results: Our video analysis yielded 359 head injuries in 287 players. The comparison of head injuries found in our video analysis with those registered with the accident insurance yielded a match in only 23.1% of cases (n = 83), a rather low verification rate. Verification rates varied between the leagues (7.0–30.8%). All injuries documented in the accident insurance registry were found in the video analysis (100%). The types of head injury most often verified by the accident insurance registry (n = 83) were contusions (43.4%), bone fractures (19.3%), and skin lacerations (18.1%). Only 66 of the 359 head injuries (18.4%) resulted in absence from at least one training session, with a mean time loss of 18.5 days (1–87 days).
Conclusion: The mismatch between the number of head injuries found in the video analysis and those registered with the accident insurance is an important methodological issue in scientific research. The low verification rate seems to be due to the unclear relationship between the severity and the clinical consequences of head injuries detected by means of video analysis, and to the failure of football clubs to register minor head injuries with the accident insurance.


2021, Vol 12
Author(s): Cameron Mura, Saskia Preissner, Robert Preissner, Philip E. Bourne

This Perspective examines a recent surge of information regarding the potential benefits of acid-suppression drugs in the context of COVID-19, with a particular eye on the great variability (and, thus, confusion) that has arisen across the reported findings, at least as regards the popular antacid famotidine. The degree of inconsistency and discordance reflects contradictory conclusions from independent, clinically based studies that took roughly similar approaches, in terms of both experimental design (retrospective, observational, cohort-based, etc.) and statistical analysis workflow (propensity-score matching, stratification into sub-cohorts, etc.). The contradictions and potential confusion have ramifications for clinicians faced with choosing therapeutically optimal courses of intervention: e.g., do any potential benefits of famotidine suggest its use in a particular COVID-19 case? (If so, what administration route, dosage regimen, duration, etc. are likely optimal?) As succinctly put this March in Freedberg et al. (2021), “…several retrospective studies show relationships between famotidine and outcomes in COVID-19 and several do not.” Beyond the pressing issue of possible therapeutic indications, the conflicting data and conclusions related to famotidine must be resolved before its inclusion in ontological and knowledge-graph (KG)–based frameworks, which in turn are useful for drug discovery and repurposing. As a broader methodological issue, reconciling the inconsistencies would bolster the validity of meta-analyses that draw upon the relevant data sources. Most broadly, developing a system for treating inconsistencies would improve the quality of both (1) retrospective, real-world-evidence studies and (2) prospective, placebo-controlled, randomized multi-center clinical trials. In other words, a systematic approach to reconciling the two types of studies would inherently improve the quality and utility of each type of study individually.
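The propensity-score-matching workflow mentioned in this abstract can be illustrated on synthetic data. The hand-rolled logistic fit and 1:1 nearest-neighbour match below is only a toy sketch of the general technique, not what any of the cited famotidine studies actually ran; all variable names and the simulated effect size are made up.

```python
# Toy propensity-score matching on synthetic data (true treatment effect = 0.5).
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))                        # covariates (e.g. age, severity, ...)
logit = 0.8 * X[:, 0] - 0.5 * X[:, 1]              # confounded treatment assignment
treated = rng.random(n) < 1 / (1 + np.exp(-logit))
y = X[:, 0] + 0.5 * treated + rng.normal(size=n)   # outcome depends on X and treatment

# fit a logistic propensity model by plain gradient ascent
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.01 * X.T @ (treated - p) / n
ps = 1 / (1 + np.exp(-X @ w))                      # estimated propensity scores

# 1:1 nearest-neighbour matching (with replacement) on the propensity score
t_idx = np.where(treated)[0]
c_idx = np.where(~treated)[0]
matches = [c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))] for i in t_idx]

att = y[t_idx].mean() - y[matches].mean()          # matched estimate of the effect
```

Even on clean simulated data like this, the estimate moves with the matching caliper, replacement policy, and covariate set, which is one mechanism by which similarly designed retrospective studies can reach different conclusions.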


2021, Vol 46 (3), pp. 902-909
Author(s): John H. Arnold

Abstract This engagement with Christopher Tomlins’s In the Matter of Nat Turner (2020) focuses on a key methodological issue faced by the author: how one reads and positions the “authentic voice” of a past subaltern subject, known to us only through a hostile written source. This challenge is well known to social historians of the European Middle Ages, and this essay suggests various ways in which Tomlins’s monograph contributes to the existing debate, regarding both method and how one culturally situates and interprets the voice(s) thus identified, particularly with regard to the politics of apocalypticism.

