Reserve Officer Training Corps
Recently Published Documents


TOTAL DOCUMENTS

36
(FIVE YEARS 9)

H-INDEX

6
(FIVE YEARS 0)

2022 ◽  
Author(s):  
Bryan Terlizzi ◽  
T Cade Abrams ◽  
Ryan S Sacko ◽  
Amy F Hand ◽  
Kyle Silvey ◽  
...  

ABSTRACT

Introduction: The development of functional motor competence (FMC; i.e., the neuromuscular coordination and control required to meet a wide range of movement goals) is critical to the long-term development of health- and performance-related physical capacities (e.g., muscular strength and power, muscular endurance, and aerobic endurance). The secular decline in FMC among U.S. children and adolescents presents current and future challenges for recruiting prospective military personnel able to perform the physical demands of military duty. The purpose of the current study was to examine the relationship between FMC and physical military readiness (PMR) in a group of Cadets enrolled in an Army Reserve Officer Training Corps program.

Materials and Methods: Ninety Army Reserve Officer Training Corps Cadets from a southeastern university and a military college in the southeast (females = 22; Mage = 19.5 ± 2.5) volunteered to participate in the study. Cadets performed a battery of eight FMC assessments consisting of locomotor, object projection, and functional coordination tasks. To assess PMR, Cadets performed the Army Combat Fitness Test (ACFT). Values from all FMC assessments were standardized based on the sample and summed to create a composite FMC score. ACFT scores were assigned to Cadets based on ACFT scoring standards. We used Pearson correlations to assess the relationships between individual FMC assessment raw scores, FMC composite scores, and total ACFT points. We also evaluated the potential impact of FMC on ACFT performance in the entire sample and within each gender subgroup using hierarchical linear regression. Finally, we implemented a 3 × 2 chi-squared analysis to evaluate the predictive utility of FMC level on ACFT pass/fail results by categorizing Cadets' composite FMC scores into high (≥75th percentile), moderate (≥25th and <75th percentile), and low (<25th percentile) groups based on percentile ranks within the sample. ACFT pass/fail results were determined using ACFT standards, requiring a minimum of 60 points on each of the ACFT subtests.

Results: FMC composite scores correlated strongly with total ACFT performance (r = 0.762), with individual FMC tests demonstrating weak-to-strong relationships with ACFT performance (r = 0.200–0.769). Above and beyond the impact of age, FMC uniquely accounted for 15% (95% CI: −0.07 to 0.36) of the variance in ACFT scores in females (R2 = 0.516, F2,19 = 10.11, P < 0.001) and 26% (95% CI: 0.09–0.43) in males (R2 = 0.385, F2,65 = 20.37, P < 0.001). The 3 × 2 chi-squared analysis demonstrated that 74% of Cadets with low FMC, 28% with moderate FMC, and 17% with high FMC failed the ACFT (χ2 [1, N = 90] = 27.717, V = 0.555, P < 0.001).

Conclusion: FMC composite scores are strongly correlated with ACFT scores, and low FMC was a strong predictor of ACFT failure. These data support the hypothesis that the development of sufficient FMC in childhood and adolescence may be a critical antecedent of PMR. Efforts to improve FMC in children and adolescents may increase the PMR of future military recruits.
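The standardize-and-sum construction of the composite FMC score, and the quartile-based grouping fed into the chi-squared analysis, can be sketched as follows. This is a minimal illustration of the described procedure, not the study's code; the function names and the simple quartile cuts are ours.

```python
import statistics

def z_scores(values):
    """Standardize raw scores against the sample mean and sample SD."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def composite_fmc(assessments):
    """Sum each subject's standardized scores across all assessments.

    assessments: one list of raw scores per FMC test, all lists in the
    same subject order.
    """
    standardized = [z_scores(test) for test in assessments]
    return [sum(subject) for subject in zip(*standardized)]

def fmc_level(composites):
    """Bin composite scores into low/moderate/high by sample quartiles."""
    ordered = sorted(composites)
    q1 = ordered[len(ordered) // 4]        # ~25th-percentile cut
    q3 = ordered[3 * len(ordered) // 4]    # ~75th-percentile cut
    return ["low" if c < q1 else "high" if c >= q3 else "moderate"
            for c in composites]
```

Standardizing each test before summing keeps any one assessment's measurement scale from dominating the composite.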


2021 ◽  
pp. 002234332110108
Author(s):  
Andrew Bell

Can armed groups socialize combatants to norms of restraint – in essence, train soldiers to adopt norms of international humanitarian law on the battlefield? How can social scientists accurately measure such socialization? Despite being the central focus of organizational and ideational theories of conflict, studies to date have not engaged in systematic, survey-based examination of this central socialization mechanism theorized to influence military conduct. This study advances scholarly understanding by providing the first comparative, survey-based examination of combatant socialization to norms of restraint, using surveys and interviews with US Army cadets at the US Military Academy (USMA), Army Reserve Officer Training Corps (ROTC), and active duty Army combatants. Additionally, to better understand ‘restraint’ from combatants’ perspective, this study introduces the concept of the ‘combatant’s trilemma’ under which combatants conceptualize civilian protection as part of a costly trade-off with the values of military advantage and force protection. Survey results hold both positive and negative implications for socialization to law of war norms: military socialization can shift combatants’ preferences for battlefield conduct. However, intensive norm socialization may be required to shift combatants’ preferences from force protection to civilian protection norms. Study findings hold significant implications for understanding violence against civilians in conflict and for policies to disseminate civilian protection norms in armed groups worldwide.


SLEEP ◽  
2021 ◽  
Vol 44 (Supplement_2) ◽  
pp. A121-A122
Author(s):  
Kajsa Carlsson ◽  
Carolyn Mickelson ◽  
Jake Choynowski ◽  
Janna Mantua ◽  
Jaime Devine ◽  
...  

Abstract

Introduction: U.S. Army Reserve Officer Training Corps (ROTC) Advanced Camp (AC) is a 29-day training course that assesses military skills and leadership potential in college students training to become Commissioned Officers (i.e., Cadets). Military trainings are widely known to disrupt normative sleep, and operational sleep disruption is linked to performance decrements. This study examined the ability of objective and subjective sleep measures during ROTC AC to predict Cadet performance.

Methods: One hundred fifty-nine ROTC Cadets (age 22.06 ± 2.49 years; 76.1% male) wore an actiwatch device continuously for the 29 days of AC. Paper surveys administered at the end of AC captured subjective sleep metrics during the training. ROTC instructors evaluated Cadet performance and provided scores for overall class rank and summary performance. Multiple and ordinal linear regressions assessed the predictive utility of subjective (sleep duration [SD]; Global score [Global] from the Pittsburgh Sleep Quality Index) and objective (Total Sleep Time [TST], Sleep Efficiency [SE], Sleep Onset Latency [SOL], and Wake After Sleep Onset [WASO] from actigraphy) sleep measures on performance.

Results: The interaction of SD and Global, when controlling for age and gender, significantly predicted Cadet rank, F(4,153) = 3.09, p = 0.018. Models testing the prediction of summary performance score from SD and Global were non-significant. Further, regressing both Cadet rank and summary performance individually on objective sleep metrics, controlling for age and gender, yielded non-significant findings.

Conclusion: Neither subjective nor objective sleep metrics showed significant individual predictive utility for performance. However, the combined subjective model significantly predicted that Cadets who slept worse (lower SD; higher Global) during AC received a lower rank at the end of the training. These findings suggest a unique combined predictive utility of subjective sleep on performance compared with the predictive power of individual variables; subjective sleep may therefore be better than objective sleep for predicting operational performance. Future analyses will refine these models and examine how performance on individual AC events may be influenced by sleep. Support for this study came from the Military Operational Medicine Research Program (MOMRP) of the United States Army Medical Research and Development Command (USAMRDC).
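The objective actigraphy metrics named above (TST, SE, SOL, WASO) are conventionally derived from scored sleep/wake epochs within a rest interval. A minimal sketch of those standard definitions, assuming 1-minute epochs (the function name is ours, not from the study):

```python
def sleep_metrics(epochs):
    """Derive actigraphy-style sleep metrics from 1-minute epochs.

    epochs: list of 0 (wake) / 1 (sleep) values covering the rest
    interval, starting at lights-out.
    """
    if 1 not in epochs:
        raise ValueError("no sleep detected in rest interval")
    sol = epochs.index(1)            # Sleep Onset Latency: minutes to first sleep
    tst = sum(epochs)                # Total Sleep Time: minutes scored as sleep
    waso = epochs[sol:].count(0)     # Wake After Sleep Onset: wake minutes post-onset
    se = 100 * tst / len(epochs)     # Sleep Efficiency: % of interval asleep
    return {"SOL": sol, "TST": tst, "WASO": waso, "SE": se}
```

For example, an 8-minute interval scored wake-wake-sleep-sleep-wake-sleep-sleep-sleep yields SOL = 2 min, TST = 5 min, WASO = 1 min, and SE = 62.5%.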


2020 ◽  
Vol 185 (7-8) ◽  
pp. e937-e943
Author(s):  
Cathryn Draicchio ◽  
Joel R Martin ◽  
Marcie B Fyock-Martin ◽  
Justin J Merrigan

ABSTRACT

Introduction: Because of the physical fitness requirements of Military Occupational Specialties (MOSs) within the US Army, fitness testing batteries have been developed. The Occupational Physical Assessment Test (OPAT) has been used to determine occupation assignment and is meant to assess upper and lower body muscular power, muscular strength, and aerobic capacity. The Army Physical Fitness Test (APFT) is a general fitness assessment meant to test upper and lower body muscular endurance and aerobic capacity. Comparisons of the two testing batteries, as well as evaluation of potential sex differences, are missing from the current literature. Therefore, the purpose of this study was to investigate sex differences in APFT and OPAT performance, as well as the relationships between the individual APFT and OPAT test events.

Materials and Methods: A retrospective analysis, approved by the university's institutional review board, was conducted on a sample of 90 Reserve Officer Training Corps (ROTC) cadets (men, n = 72, age = 19.7 ± 2.2 years, height = 1.79 ± 0.21 m, mass = 75.18 ± 12.38 kg; women, n = 18, age = 19.8 ± 2.2 years, height = 1.63 ± 0.09 m, mass = 65.56 ± 9.03 kg) from the Fall 2018 semester. The cadets completed the APFT (maximal push-ups, maximal sit-ups, 2-mile run) and OPAT (standing long jump, seated power throw, strength deadlift, and interval aerobic run) protocols per Army standards. Analysis of variance assessed sex differences, and correlation coefficients were computed to examine the strength of relationships between individual test events (p < 0.05).

Results: All APFT and OPAT event performances were lower in women than in men (p < 0.05), except the sit-up test (p = 0.382). The seated power throw (Z = 2.285; p = 0.011), 2-mile run (Z = 1.97; p = 0.024), and strength deadlift (Z = 1.783; p = 0.037) were more strongly correlated with the push-up than with the aerobic interval run. The standing long jump (Z = 1.741; p = 0.041), power throw (Z = 3.168; p = 0.001), strength deadlift (Z = 2.989; p = 0.001), and shuttle run (Z = 2.933; p = 0.002) were less correlated with the sit-up than was the 2-mile run. The interval aerobic run was more related to the 2-mile run than was the power throw (Z = 1.809, p = 0.035). Compared with the aerobic interval run, the standing long jump (Z = 2.969, p = 0.001) and strength deadlift (Z = 4.237, p < 0.001) were more related to the seated power throw.

Conclusions: Sex differences and varied relationships among individual events on two common military fitness test batteries were observed. Lower performance on the APFT and OPAT by women may suggest the need to evaluate potential training methods to assist women in reaching their desired MOS. Further, individual OPAT events displayed weaker relationships with one another than did individual APFT events, suggesting a greater degree of redundancy among the APFT events. Therefore, combining the APFT and OPAT may offer a greater opportunity to measure physical fitness capabilities related to various military job performance tasks.
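The Z statistics above compare dependent correlations that share one variable (e.g., two events correlated with the same push-up score). One standard approach for this is the test of Meng, Rosenthal, and Rubin (1992), sketched here as an illustration; we are assuming an equivalent procedure was used, and the function names and exact variant are ours:

```python
import math

def fisher_z(r):
    """Fisher r-to-z transformation."""
    return 0.5 * math.log((1 + r) / (1 - r))

def meng_z(r_jk, r_jh, r_kh, n):
    """Z test comparing two dependent correlations r_jk and r_jh that
    share variable j, given r_kh (the correlation between the other
    two variables) and sample size n (Meng, Rosenthal & Rubin, 1992).
    """
    r2_bar = (r_jk ** 2 + r_jh ** 2) / 2
    f = min((1 - r_kh) / (2 * (1 - r2_bar)), 1.0)  # f is capped at 1
    h = (1 - f * r2_bar) / (1 - r2_bar)
    return ((fisher_z(r_jk) - fisher_z(r_jh))
            * math.sqrt((n - 3) / (2 * (1 - r_kh) * h)))
```

Positive Z indicates the first correlation is the stronger one; with n = 90 as in this sample, a Z near 2 corresponds to the one-tailed p values in the 0.02–0.04 range reported above.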


2020 ◽  
Vol 185 (Supplement_1) ◽  
pp. 610-616
Author(s):  
John P Barrett ◽  
Irene M Rosen ◽  
Louis R Stout ◽  
Stephanie E Rosen

ABSTRACT

Introduction: This study evaluates a large cohort of college students after the 2009–2010 pandemic H1N1 influenza season. The objective was to assess influenza vaccination status, influenza-like illnesses (ILIs), and other characteristics associated with attaining immunizations.

Methods: This study was conducted during the summer 2010 Reserve Officer Training Corps Leadership Development and Assessment Course and involved 6272 college students. A voluntary, anonymous questionnaire was administered to assess the study objectives.

Results: Vaccination rates were 39.9% for pandemic H1N1, 40.6% for seasonal influenza, and 32.6% for receiving both vaccinations. Age less than 25 and having a Reserve Officer Training Corps scholarship were associated with lower odds of receiving vaccinations, whereas entering the nursing field and simultaneous membership in the Army reserve forces were associated with higher odds of vaccination. Overall, 11.2% of respondents reported having an ILI, including 4.3% with severe ILI. There were 4184 reasons indicated for not attaining influenza vaccinations, which are listed in categorical groupings.

Conclusions: A historical anchor for vaccination rates and ILI is provided in a large cohort of college students following the 2009 H1N1 influenza pandemic. Influenza immunization locations were determined, as were self-reported obstacles to receiving vaccinations. These are important results for public health leaders seeking to increase vaccination rates during future influenza seasons.


2019 ◽  
Vol 4 (1) ◽  
pp. e000192
Author(s):  
Michael Melton ◽  
Jayanthi Kandiah

Objective: Assess the effects of varying levels and durations of dietary nitrate supplementation from beetroot juice (BR) on sprint performance in Army ROTC cadets.

Methods: Army Reserve Officer Training Corps (ROTC) cadets were randomly assigned to one of three treatment groups: control (CON), low beetroot juice dose (BR1), and high beetroot juice dose (BR2). On days 0, 6, and 15, nitrate consumption was as follows: CON received one 16.9 oz. bottle of apple juice (0 mg NO3−); BR1 received one can of BR (300 mg, 4.84 mmol NO3−); and BR2 received two cans (16.8 oz.) of BR (600 mg, 9.68 mmol NO3−). One week prior to the study, each cadet completed body composition measurements, predicted aerobic capacity measurements, and nutritional analysis via two 24-hour dietary recalls. Differences in the primary measure (distance covered in the Yo-Yo IR1) were analyzed with two-way repeated-measures ANOVA tests both between groups (CON, BR1, BR2) and within groups (days 0, 6, and 15). Descriptive statistics and frequency counts were run on all remaining variables with a one-way ANOVA or t-test, including maximal heart rate during the Yo-Yo IR1, dietary compliance, daily blood pressure, juice compliance, and conditioning workouts.

Results: A dose-related enhancement with BR was observed; the data trended towards significance even in this small sample. A t-test revealed a significant difference in sprint performance between males and females overall at days 0, 6, and 15 (p = 0.025, p = 0.005, p = 0.004, respectively).

Conclusion: A single (300 mg, 4.84 mmol NO3−) or double (600 mg, 9.68 mmol NO3−) daily dose of BR appears to benefit the athletic performance of ROTC cadets. Daily consumption of BR benefitted ROTC males more than females. Results suggest BR supplementation could be advantageous for sprint performance when administered for a longer duration (>15 days).
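The mg-to-mmol dose equivalences quoted above follow from the molar mass of the nitrate ion (NO3−, about 62 g/mol). A one-line check of that arithmetic (the function name is ours):

```python
def nitrate_mg_to_mmol(mg, molar_mass_g_per_mol=62.0):
    """Convert a nitrate (NO3-) dose from mg to mmol.

    Molar mass of NO3- is approximately 14.0 (N) + 3 * 16.0 (O) = 62.0 g/mol.
    """
    return mg / molar_mass_g_per_mol
```

300 mg works out to roughly 4.84 mmol and 600 mg to roughly 9.68 mmol, matching the doses reported in the abstract.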

