scientific reporting
Recently Published Documents


Total documents: 65 (five years: 16)
H-index: 6 (five years: 2)

Author(s): Johannes Nordsteien Svensøy, Helene Nilsson, Rune Rimstad

Abstract

Introduction and Objective: Scientific reporting on major incidents, mass-casualty incidents (MCIs), and disasters is challenging and made difficult by the nature of the medical response. Many obstacles might explain why the available published articles are few and largely non-heterogeneous. This study examines the process of scientific reporting through the first-hand experiences of authors of published reports. It aims to identify learning points and challenges that must be addressed to improve scientific reporting after major incidents.

Methods: This was a qualitative study using semi-structured interviews. Participants were selected on the basis of a comprehensive literature search. Ten researchers of both genders, from eight countries on three continents, who had published reports on major incidents, MCIs, or disasters between 2013 and 2018, were included. The researchers had reported on large fires, terrorist attacks, shootings, complex road accidents, transportation accidents, and earthquakes.

Results: The interviews were themed around initiation, workload, data collection, guidelines/templates, and motivating factors for reporting. The most challenging aspects of the reporting process proved to be a lack of dedicated time, difficulties concerning data collection, and structuring the report. Most researchers had no prior experience in reporting on major incidents. Guidelines and templates were often chosen based on how accessible and user-friendly they were.

Conclusion and Relevance: Few articles present first-hand experience of the process of scientific reporting on major incidents, MCIs, and disasters. This study presents motivating factors, challenges encountered during reporting, and factors that affected the researchers' choice of reporting tools such as guidelines and templates. It shows that the structural tools available for gathering data and writing scientific reports need to be more widely promoted to improve systematic reporting in Emergency and Disaster Medicine. By gathering, comparing, and analyzing data, knowledge can be acquired to strengthen and improve responses to future major incidents. The study also indicates that transparency and willingness to share information are prerequisites for a successful scientific report.


2021, Vol 69 (4)
Author(s): Claudia Stöllberger, Josef Finsterer, Birke Schneider
Keyword(s):  

2021
Author(s): Daniel Lüdecke, Indrajeet Patil, Mattan S. Ben-Shachar, Brenton M. Wiernik, Philip Waggoner, ...

The see package is embedded in the easystats ecosystem, a collection of R packages that operate in synergy to provide a consistent and intuitive syntax when working with statistical models in the R programming language (R Core Team, 2021). Most easystats packages return comprehensive numeric summaries of model parameters and performance. The see package complements these numeric summaries with a host of functions and tools to produce a range of publication-ready visualizations for model parameters, predictions, and performance diagnostics. As a core pillar of easystats, the see package helps users to utilize visualization for more informative, communicable, and well-rounded scientific reporting.
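As a brief illustration of the workflow described above (a minimal sketch, assuming the usual easystats pattern in which see supplies plot() methods for objects returned by companion packages such as parameters and performance; the model and data are illustrative, not taken from the paper):

```r
# Minimal sketch of the easystats/see workflow (assumes the see, parameters,
# and performance packages are installed; model and data are illustrative).
library(see)
library(parameters)
library(performance)

model <- lm(mpg ~ wt + cyl, data = mtcars)

# Numeric summary of the coefficients, turned into a publication-ready plot by see
params <- model_parameters(model)
plot(params)

# Visual diagnostics of model assumptions (drawn via see when printed)
check_model(model)
```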


4OR, 2021
Author(s): Gilbert Laporte, Paolo Toth
Keyword(s):  

PLoS ONE, 2021, Vol 16 (4), pp. e0248753
Author(s): Reinie G. Gerrits, Michael J. van den Berg, Anton E. Kunst, Niek S. Klazinga, Dionne S. Kringos

Introduction: Little is known about the accuracy of societal publications (e.g., press releases, internet postings, or articles in professional journals) that are based on scientific work. This study investigates a) inconsistencies between scientific peer-reviewed health services research (HSR) publications and non-scientific societal publications, and b) the replication of reporting inadequacies from these scientific publications in the corresponding societal publications.

Methods: A sample of HSR publications was drawn from 116 publications authored in 2016 by thirteen Dutch HSR institutions. Societal publications corresponding to the scientific publications were identified through a systematic internet search. We conducted a qualitative, directed content analysis of the societal publications derived from the scientific publications to assess reporting inadequacies and identify inconsistencies. Descriptive frequencies were calculated for all variables. Odds ratios were used to investigate whether inconsistencies in societal publications were less likely when the first scientific author was involved.

Results: We identified 43 scientific and 156 societal publications. Ninety-four societal publications (60.3%), associated with 32 scientific publications (74.4%), contained messages that were inconsistent with the scientific work. We found reporting inadequacies in 22 scientific publications (51.2%), and these inadequacies were replicated in 45 societal publications (28.9%). The likelihood of inconsistencies between scientific and societal publications did not differ when the societal publication explicitly involved the first scientific author (OR = 1.44, CI: 0.76–2.74), was published on the institute's or funder's website (OR = 1.32, CI: 0.57–3.06), or was published without the involvement of a scientific author (OR = 0.52, CI: 0.25–1.07).

Conclusion: To improve societal publications, attention should be paid both to their consistency with the underlying scientific publications and to preventing the replication of scientific reporting inadequacies. HSR institutions, funders, and scientific and societal publication platforms should invest in a supportive publication culture to further incentivise the responsible and skilled involvement of researchers in writing both scientific and societal publications.
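For readers unfamiliar with the statistic reported above, the short R sketch below (with hypothetical counts, not the study's data) shows how an odds ratio of this kind and its 95% Wald confidence interval are computed from a 2x2 table:

```r
# Minimal sketch: odds ratio with a 95% Wald confidence interval,
# using hypothetical 2x2 counts (illustrative only, not the study's data).
n11 <- 30; n12 <- 25   # first author involved: inconsistent / consistent
n21 <- 64; n22 <- 37   # first author not involved: inconsistent / consistent

or <- (n11 * n22) / (n12 * n21)              # odds ratio
se <- sqrt(1/n11 + 1/n12 + 1/n21 + 1/n22)    # standard error of log(OR)
ci <- exp(log(or) + c(-1, 1) * 1.96 * se)    # 95% confidence interval

cat(sprintf("OR = %.2f, 95%% CI: %.2f-%.2f\n", or, ci[1], ci[2]))
```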


2021, Vol 4 (4), pp. 13-14
Author(s): Uriel Halbreich
Keyword(s):  

Author(s): Govindasamy Agoramoorthy, Minna J Hsu, Pochuen Shieh
Keyword(s):  

Author(s): Leslie McIntosh

While technology advances, its application within the scientific publishing ecosystem has lagged. There has never been a better time to increase the speed and accuracy of scientific reporting. Researchers are under immense pressure to conduct rigorous science, and the publishing industry continues to act as a facilitator. Yet inefficiencies slow the communication of research and undermine its consistency. This chapter proposes automating quality checks as a means to scale science. The author also explores the publishing process and identifies points where machine learning and natural language processing could enhance the quality, and thus the rigor, of reporting scientific research.
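The chapter argues the case conceptually rather than prescribing an implementation. As a minimal, hypothetical sketch of the kind of automated quality check it envisions, the R snippet below screens a manuscript's text for a few common reporting elements; a production system would rely on trained NLP models rather than simple patterns, and the function name and patterns here are illustrative only:

```r
# Hypothetical sketch (not the chapter's implementation): a rule-based screen
# for common reporting elements in a manuscript's text.
check_reporting <- function(text) {
  checks <- c(
    sample_size         = "\\bn\\s*=\\s*\\d+",
    confidence_interval = "95%\\s*(CI|confidence interval)",
    p_value             = "\\bp\\s*[<=>]\\s*0?\\.\\d+",
    data_availability   = "data (are|is) available|data availability"
  )
  # TRUE/FALSE for each element, preserving the element names
  sapply(checks, function(pattern) grepl(pattern, text, ignore.case = TRUE))
}

# Example: flag which elements are present in an abstract-like snippet
check_reporting("We enrolled n = 120 patients; OR = 1.4 (95% CI: 0.8-2.7), p = 0.21.")
```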


Author(s): Claudia Stöllberger, Josef Finsterer, Birke Schneider
Keyword(s):  
