Context: Prevention of a lower extremity sprain or strain requires some basis for predicting that an individual athlete will sustain such an injury unless a modifiable risk factor is addressed. Objective: To assess the possible existence of an association between reaction time measured during completion of a computerized neurocognitive test battery and subsequent occurrence of a lower extremity sprain or strain. Design: Prospective cohort study. Setting: Preparticipation screening conducted in a computer laboratory on the day before initiation of preseason practice sessions. Participants: 76 NCAA Division I-FCS football players. Main Outcome Measures: Lower extremity sprains and strains sustained between initiation of preseason practice sessions and the end of an 11-game season. Receiver operating characteristic analysis identified the optimal reaction-time cut-point for discrimination between injured and noninjured status. Stratified analyses were performed to evaluate any differential influence of reaction time on injury incidence between starters and nonstarters. Results: A total of 29 lower extremity sprains and strains were sustained by 23 of the 76 players. A reaction-time cut-point of ≥0.545 s provided good discrimination between injured and noninjured cases: 74% sensitivity, 51% specificity, relative risk = 2.17 (90% CI: 1.10, 4.30), and odds ratio = 2.94 (90% CI: 1.19, 7.25). Conclusions: Neurocognitive reaction time appears to be an indicator of elevated risk for lower extremity sprains and strains among college football players, and this risk may be modifiable through exercises designed to accelerate neurocognitive processing of visual input.
Slow visuomotor reaction time (VMRT) appears to be an important and modifiable injury risk factor for college football players. More research is needed to refine visuomotor reaction-time screening and training methods and to determine the extent to which improved performance values can reduce injury incidence.
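For readers who want to verify the discrimination statistics reported above, they can be reproduced from a simple 2 × 2 contingency table. The following Python sketch uses illustrative cell counts back-calculated from the reported sample sizes (23 injured of 76 players), sensitivity, and specificity; the study did not publish the raw table, so the counts are inferred rather than quoted.

```python
# Illustrative 2x2 table back-calculated from the reported values
# (23 injured, 53 uninjured; 74% sensitivity, 51% specificity).
# a = slow RT & injured, b = slow RT & uninjured,
# c = fast RT & injured, d = fast RT & uninjured.
a, b, c, d = 17, 26, 6, 27

sensitivity = a / (a + c)              # injured players flagged as slow
specificity = d / (b + d)              # uninjured players flagged as fast
risk_slow = a / (a + b)                # injury risk in the slow-RT group
risk_fast = c / (c + d)                # injury risk in the fast-RT group
relative_risk = risk_slow / risk_fast
odds_ratio = (a * d) / (b * c)

print(f"sensitivity   = {sensitivity:.2f}")    # ~0.74
print(f"specificity   = {specificity:.2f}")    # ~0.51
print(f"relative risk = {relative_risk:.2f}")  # ~2.17
print(f"odds ratio    = {odds_ratio:.2f}")     # ~2.94
```

Running the sketch recovers all four published statistics, which suggests the inferred table is consistent with the reported results.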
Context: Poor core stability is believed to increase vulnerability to uncontrolled joint displacements throughout the kinetic chain between the foot and the lumbar spine. Objective: To assess the value of preparticipation measurements as predictors of core or lower extremity strains or sprains in collegiate football players. Design: Cohort study. Setting: National Collegiate Athletic Association Division I Football Championship Subdivision football program. Patients or Other Participants: All team members who were present for a mandatory physical examination on the day before preseason practice sessions began (n = 83). Main Outcome Measure(s): Preparticipation administration of surveys to assess low back, knee, and ankle function; documentation of knee and ankle injury history; determination of body mass index; 4 different assessments of core muscle endurance; and measurement of step-test recovery heart rate. All injuries were documented throughout the preseason practice period and 11-game season. Receiver operating characteristic analysis and logistic regression analysis were used to identify dichotomized predictive factors that best discriminated injured from uninjured status. The 75th and 50th percentiles were evaluated as alternative cutpoints for dichotomization of injury predictors. Results: Players with ≥2 of 3 potentially modifiable risk factors related to core function had 2 times greater risk for injury than those with <2 factors (95% confidence interval = 1.27, 4.22), and adding a high level of exposure to game conditions increased the injury risk to 3 times greater (95% confidence interval = 1.95, 4.98). Prediction models that used the 75th and 50th percentile cutpoints yielded results very similar to those for the model that used cutpoints derived from receiver operating characteristic analysis. Conclusions: Low back dysfunction and suboptimal endurance of the core musculature appear to be important modifiable football injury risk factors that can be identified on preparticipation screening. These predictors need to be assessed in a prospective manner with a larger sample of collegiate football players. Key Words: clinical prediction rule, injury prevention, injury risk, core stability. Key Points: Low back dysfunction and poor endurance of the core musculature appear to be modifiable risk factors for core and lower extremity injuries in collegiate football players. These factors can be identified by preparticipation screening, and individualized core stability training regimens can be implemented to potentially reduce the risk of core and lower extremity injuries in this population.
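The abstract does not specify how the receiver operating characteristic (ROC) cutpoints were derived; maximizing Youden's J statistic is one common method, and the sketch below illustrates it alongside the 75th- and 50th-percentile alternatives the study evaluated. The function name and the higher-score-equals-higher-risk convention are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def youden_cutpoint(scores, injured):
    """Return the cutpoint maximizing Youden's J = sensitivity + specificity - 1.
    Assumes higher scores indicate higher risk (illustrative convention)."""
    best_j, best_cut = -1.0, None
    for cut in np.unique(scores):
        flagged = scores >= cut                # dichotomize at this cutpoint
        sens = np.mean(flagged[injured == 1])  # true-positive rate
        spec = np.mean(~flagged[injured == 0]) # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Percentile-based alternatives evaluated as simpler cutpoints:
# cut_75 = np.percentile(scores, 75)
# cut_50 = np.percentile(scores, 50)  # the median
```

A percentile cutpoint has the practical advantage that it can be set before any injuries are observed, which may explain why the study compared it against the ROC-derived value.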
Context: Excessive fat mass clearly has adverse effects on metabolic processes that can ultimately lead to the development of chronic disease. Early identification of high-risk status may facilitate referral for definitive diagnostic tests and implementation of interventions to reduce cardiometabolic risk. Objective: To document the prevalence of metabolic syndrome among collegiate football players and to develop a clinical prediction rule that does not require blood analysis to identify players who may possess a high level of cardiometabolic risk. Design: Cross-sectional cohort study. Setting: University athletic training research laboratory. Patients or Other Participants: Sixty-two National Collegiate Athletic Association Division I Football Championship Subdivision football players (age = 19.9 ± 1.2 years, height = 182.6 ± 6.1 cm, mass = 97.4 ± 18.3 kg). Main Outcome Measure(s): Anthropometric characteristics associated with body fat, isokinetic quadriceps strength, and biometric indicators associated with metabolic syndrome were measured. Participants were classified as high risk or low risk for future development of type 2 diabetes and cardiovascular disease. Results: The prevalence of metabolic syndrome in the cohort was 19% (12 of 62), and 79% (49 of 62) of the players exceeded the threshold for 1 or more of its 5 components. A 4-factor clinical prediction rule that classified individuals on the basis of waist circumference, blood pressure, quadriceps strength, and ethnic category had 92% sensitivity (95% confidence interval = 65%, 99%) and 76% specificity (95% confidence interval = 63%, 86%) for discrimination of high-risk from low-risk status. Conclusions: The risk for developing type 2 diabetes and cardiovascular disease appears to be exceptionally high among collegiate football players. A lack of race-specific criteria for the diagnosis of metabolic syndrome almost certainly contributes to an underestimation of the true level of cardiometabolic risk for African American collegiate football players. Key Words: metabolic syndrome, insulin resistance, abdominal fat. Key Points: In this Division I football team, metabolic syndrome was found in 19% of players overall, 46% of the linemen, and 14% of the nonlinemen; the cardiometabolic risk in the African American players was almost certainly underestimated. For identifying obesity-related health risk, waist circumference was a better discriminator than either body fat percentage or body mass index. A quadriceps peak torque/body mass ratio of less than 2.93 (peak torque/body weight less than 0.98) was the optimal cut point for identifying players with metabolic syndrome. The clinical prediction rule identified 92% of players with metabolic syndrome on the basis of waist circumference, systolic or diastolic blood pressure, quadriceps peak torque/body mass ratio, and white ethnicity.
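The abstract reports 95% confidence intervals for the prediction rule's sensitivity and specificity without naming the interval method. A Wilson score interval, a common choice for binomial proportions, reproduces the reported bounds when applied to counts back-calculated from the cohort (11 of 12 high-risk players flagged; 38 of 50 low-risk players correctly classified); those counts are inferred for illustration, not taken from the published data.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Counts back-calculated from the reported percentages (illustrative):
# sensitivity 92% ~ 11 of 12 players with metabolic syndrome flagged,
# specificity 76% ~ 38 of 50 players without it correctly classified.
print(wilson_ci(11, 12))  # ~(0.65, 0.99), matching the reported 65%, 99%
print(wilson_ci(38, 50))  # ~(0.63, 0.86), matching the reported 63%, 86%
```

The wide sensitivity interval (65% to 99%) reflects the small number of true metabolic syndrome cases (12), a useful caution when interpreting the rule's 92% point estimate.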