Standards for reporting randomized controlled trials in neurosurgery

2011 ◽  
Vol 114 (2) ◽  
pp. 280-285 ◽  
Author(s):  
Erin N. Kiehna ◽  
Robert M. Starke ◽  
Nader Pouratian ◽  
Aaron S. Dumont

Object The Consolidated Standards of Reporting Trials (CONSORT) criteria were published in 1996 to standardize the reporting and improve the quality of clinical trials. Although these criteria have been endorsed by major medical journals and shown to improve the quality of reported trials, neurosurgical journals have yet to formally adopt them. The purpose of this study is to evaluate the quality and reporting of randomized controlled trials (RCTs) in neurosurgery and the factors that may affect the quality of reported trials. Methods The authors evaluated all neurosurgical RCTs published in 2006 and 2007 in the principal neurosurgical journals (Journal of Neurosurgery; Neurosurgery; Surgical Neurology; Journal of Neurology, Neurosurgery, and Psychiatry; and Acta Neurochirurgica) and in 3 leading general medical journals (Journal of the American Medical Association, Lancet, and the New England Journal of Medicine). Randomized controlled trials that addressed operative decision making or the treatment of neurosurgical patients were included in this analysis. The RCT quality was evaluated using the Jadad score and the CONSORT checklist. Results In 2006 and 2007, 27 RCTs relevant to intracranial neurosurgery were reported. Of these trials, only 59% had a Jadad score ≥ 3. The 3 major medical journals all endorsed the CONSORT guidelines, whereas none of the neurosurgical journals had adopted them. Randomized controlled trials published in the 3 major medical journals had a significantly higher mean CONSORT score (mean 41, range 39–44) than those published in neurosurgical journals (mean 26.4, range 17–38; p < 0.0001). Jadad scores were also significantly higher for the major medical journals (mean 3.42, range 2–5) than for the neurosurgical journals (mean 2.45, range 1–5; p = 0.05). Conclusions Despite the growing volume of RCTs in neurosurgery, the quality of reporting of these trials remains suboptimal, especially in the neurosurgical journals. Improved awareness of the CONSORT guidelines among journal editors, reviewers, and authors could improve the methodology and reporting of RCTs in neurosurgery.
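The between-journal comparison reported above can be illustrated with a short sketch. The abstract does not state which statistical test was used, so a Mann-Whitney U test on hypothetical CONSORT totals is assumed here purely for illustration; the values are placeholders within the reported ranges, not the study's data.

```python
# Hypothetical sketch of comparing CONSORT checklist totals between journal groups.
# The test choice (Mann-Whitney U) and the scores below are assumptions, not the study's analysis.
from scipy.stats import mannwhitneyu

consort_general_medical = [39, 40, 41, 42, 44]      # placeholders within the reported range 39-44
consort_neurosurgical = [17, 22, 26, 28, 30, 38]    # placeholders within the reported range 17-38

stat, p_value = mannwhitneyu(consort_general_medical, consort_neurosurgical, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```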

2017 ◽  
Vol 54 (2) ◽  
pp. 142-152 ◽  
Author(s):  
Joseph Hardwicke ◽  
Mohammad Nassimizadeh ◽  
Bruce Richard

Objectives Reviews of the quality of reporting of randomized controlled trials (RCTs) have recently been conducted in different surgical specialties. In this review of RCTs relating to cleft lip, cleft palate, and cleft lip and palate (CL/P), we investigate the quality of reporting against the Consolidated Standards of Reporting Trials (CONSORT) checklist. Design A systematic review of CL/P RCTs published from 2004 to 2013, with the included articles scored against the CONSORT checklist. Patients, Participants The literature search identified 174 articles. Studies were selected if participants with CL/P were involved in an RCT with prospective data collection that was reported in a full journal article. A total of 6352 participants were included from 65 CL/P RCTs during the study period. Main Outcome Measures The methodological quality of RCTs was assessed using the CONSORT checklist and the Jadad scale. Results The mean CONSORT score was 15.8, and the mean Jadad score was 3.3. There was a significant positive correlation between the CONSORT and Jadad scores (P < .0001, ρ = .47). The only other significant association was that both the CONSORT and the Jadad scores increased with an increasing number of authors. Conclusion This analysis has shown that there are deficiencies in the transparent reporting of factors such as randomization implementation, blinding, and participant flow. Interventions, outcomes, and the interpretation of results are well presented. We recommend that RCTs be conceived and undertaken using the CONSORT checklist and reported in a clear and reproducible manner.
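The correlation reported above is given as ρ, consistent with a Spearman rank correlation between the CONSORT and Jadad scores. A minimal sketch of that calculation, on invented paired scores rather than the study's data:

```python
# Minimal sketch of a Spearman rank correlation between CONSORT and Jadad scores.
# The paired scores are invented placeholders for illustration only.
from scipy.stats import spearmanr

consort_scores = [10, 12, 13, 15, 16, 18, 20, 22]   # hypothetical CONSORT totals
jadad_scores = [1, 2, 3, 3, 3, 4, 4, 5]             # hypothetical paired Jadad scores

rho, p_value = spearmanr(consort_scores, jadad_scores)
print(f"rho = {rho:.2f}, p = {p_value:.4f}")
```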


2018 ◽  
Vol 25 (1) ◽  
pp. 107327481878130
Author(s):  
Huiyun Zhu ◽  
Si Chen ◽  
Pei Xie ◽  
Geliang Yang ◽  
Zhenqiang Zhong ◽  
...  

Randomized controlled trials (RCTs) are important for evidence-based medicine; however, their quality of reporting remains to be evaluated. The aim of this study was to assess the quality of reporting of RCTs concerning solid tumor medications. Articles were searched in PubMed to identify all oncology phase III RCTs published from 2011 to 2015, and the results were classified manually using EndNote X7.0 software. Registration rate, primary end point (PEP) consistency, positive result rate, enrollment time point, outcome feedback in the registry, and time to publication were extracted and assessed. The overall registration rate was higher than in earlier years; nevertheless, a proportion of trials showed PEP discrepancies or enrolled patients before registration, in both journal categories. Trials published in the top 5 general medical journals paid more attention to results feedback on the registration websites and were published more promptly after study completion. Our data suggest that general medical journals may be more rigorous than oncology journals, but we also identified a preference for positive results. On the whole, RCTs published between 2011 and 2015 seemed fairly standardized. Registry surveillance and outcome feedback still need to be strengthened to ensure the stringency and reliability of clinical trials of solid tumor medications.
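As a rough illustration of how the indicators above could be tallied once extracted, the following sketch computes a registration rate and PEP consistency from hypothetical trial records; the field names and values are assumptions, not the study's dataset.

```python
# Hypothetical tally of registration rate and primary end point (PEP) consistency
# from manually extracted trial records. Field names and values are illustrative only.
trials = [
    {"registered": True, "pep_consistent": True},
    {"registered": True, "pep_consistent": False},
    {"registered": True, "pep_consistent": True},
    {"registered": False, "pep_consistent": False},
]

n_registered = sum(t["registered"] for t in trials)
registration_rate = n_registered / len(trials)
pep_consistency = sum(t["pep_consistent"] for t in trials if t["registered"]) / n_registered
print(f"registration rate = {registration_rate:.0%}, PEP consistency among registered trials = {pep_consistency:.0%}")
```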


2003 ◽  
Vol 93 (5) ◽  
pp. 392-398 ◽  
Author(s):  
Michael A. Turlik ◽  
Donald Kushner ◽  
Dina Stock

The purposes of this study were to develop an instrument to assess the validity of randomized controlled trials and to report on the differences in the validity of randomized controlled trials between two podiatric medical journals and a mainstream medical journal. The study demonstrated that after adequate training, there can be agreement among reviewers evaluating the quality of published randomized controlled trials using an established instrument and guidelines. The results of the study indicate that randomized controlled trials published in podiatric medical journals are less credible than those published in a mainstream medical journal. (J Am Podiatr Med Assoc 93(5): 392-398, 2003)
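The reviewer agreement mentioned above is often quantified with a chance-corrected statistic such as Cohen's kappa; the abstract does not specify the agreement measure used, so the following sketch, with made-up ratings, is only an assumed illustration.

```python
# Assumed illustration of inter-reviewer agreement using Cohen's kappa; the ratings
# are invented, and the study's actual agreement statistic is not specified.
from sklearn.metrics import cohen_kappa_score

reviewer_a = [3, 2, 4, 1, 5, 3, 2, 4]   # hypothetical quality ratings from reviewer A
reviewer_b = [3, 2, 4, 2, 5, 3, 2, 5]   # hypothetical quality ratings from reviewer B

print(f"kappa = {cohen_kappa_score(reviewer_a, reviewer_b):.2f}")
```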


Surgery ◽  
2019 ◽  
Vol 165 (5) ◽  
pp. 965-969 ◽  
Author(s):  
Wenwen Chen ◽  
Jiajie Yu ◽  
Longhao Zhang ◽  
Guanyue Su ◽  
Wen Wang ◽  
...  

2013 ◽  
Vol 39 (8) ◽  
pp. 1386-1395 ◽  
Author(s):  
Nicola Latronico ◽  
Marta Metelli ◽  
Maddalena Turin ◽  
Simone Piva ◽  
Frank A. Rasulo ◽  
...  

2019 ◽  
Vol 10 (2) ◽  
pp. 79
Author(s):  
Melvin George ◽  
Luxitaa Goenka ◽  
Suramya Rajendran ◽  
Kalaiselvi Arumugam ◽  
Jamuna Rani

2018 ◽  
Vol 43 (8) ◽  
pp. 801-807 ◽  
Author(s):  
Chao Long ◽  
Heather E. desJardins-Park ◽  
Rita Popat ◽  
Paige M. Fox

We assessed the quantity, quality, and trends of randomized controlled trials comparing hand surgical interventions. Study characteristics were collected for 125 randomized controlled trials comparing hand surgical interventions. The Jadad score (0–5), which assesses the methodological quality of trials, was calculated for each study. Logistic regressions were conducted to determine associations with the Jadad score. The studies were published between 1981 and 2015, with an increase over time, most often in the Journal of Hand Surgery (European). The mean study size was 68 patients. The mean Jadad score was 2.1, without improvement over time. Thirty percent of studies conducted a power analysis and 8% an intention-to-treat analysis. Studies conducted in the United Kingdom, as well as those with smaller sample sizes, a power analysis, and an intention-to-treat analysis, were associated with a higher Jadad score. The quantity of trials has increased over time while methodological quality has remained low, indicating a need to improve the quality of trials in the hand surgery literature.
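A hedged sketch of the kind of logistic regression described above: modelling whether a trial reaches a high Jadad score (a cut-off of ≥ 3 is assumed here) from study features. The predictors, threshold, and data below are illustrative assumptions rather than the study's analysis.

```python
# Illustrative logistic regression of study features against a binary "high Jadad" outcome.
# The >= 3 cut-off, predictors, and all values are assumptions, not the study's data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: sample size, power analysis reported (0/1), intention-to-treat analysis reported (0/1)
X = np.array([
    [30, 1, 1], [120, 0, 0], [25, 1, 0], [200, 0, 0], [40, 0, 1],
    [90, 1, 0], [35, 1, 0], [150, 0, 1], [60, 0, 1], [80, 0, 0],
])
y = np.array([1, 0, 1, 0, 1, 1, 0, 0, 1, 0])   # 1 = Jadad score >= 3 (assumed threshold)

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_[0], "intercept:", model.intercept_[0])
```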


2019 ◽  
Vol 6 (2) ◽  
pp. 177-179
Author(s):  
Muhammad Shahzeb Khan ◽  
Rohan Kumar Ochani ◽  
Asim Shaikh ◽  
Muthiah Vaduganathan ◽  
Safi U Khan ◽  
...  
