Popular running magazines and running shoe companies suggest that imprints of the bottom of the feet (plantar shape) can be used as an indication of the height of the medial longitudinal foot arch and that this can be used to select individually appropriate types of running shoes. This study examined whether this selection technique influenced injury risk during United States Army Basic Combat Training (BCT). After foot examinations, BCT recruits in an experimental group (E: n = 1,079 men and 451 women) selected motion control, stability, or cushioned shoes for plantar shapes judged to represent low, medium, or high foot arches, respectively. A control group (C: n = 1,068 men and 464 women) received a stability shoe regardless of plantar shape. Injuries during BCT were determined from outpatient medical records. Other previously known injury risk factors (e.g., age, fitness, and smoking) were obtained from a questionnaire and existing databases. Multivariate Cox regression controlling for other injury risk factors showed little difference in injury risk between the E and C groups among men (risk ratio (E/C) = 1.01; 95% confidence interval = 0.88-1.16; p = 0.87) or women (risk ratio (E/C) = 1.07; 95% confidence interval = 0.91-1.25; p = 0.44). In practical application, this prospective study demonstrated that selecting shoes based on plantar shape had little influence on injury risk in BCT. Thus, if the goal is injury prevention, this selection technique is not necessary in BCT.
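The adjusted risk ratios above come from multivariate Cox regression with time to first injury as the outcome. Below is a minimal sketch of that style of analysis, assuming the Python lifelines package; the column names, covariates, and simulated data are illustrative stand-ins, not the study's data.

```python
# Sketch of an adjusted Cox regression for time to injury during BCT.
# All data are simulated under a null shoe-group effect (illustrative only).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "experimental": rng.integers(0, 2, n),  # 1 = shoe matched to plantar shape
    "age": rng.normal(21.0, 3.0, n),        # hypothetical covariates
    "smoker": rng.integers(0, 2, n),
})
t = rng.exponential(20.0, n)                # latent time to injury, in weeks
df["weeks"] = np.minimum(t, 9.0)            # follow-up censored at 9-week BCT
df["injured"] = (t <= 9.0).astype(int)      # 1 = injury observed, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="injured")
cph.print_summary()  # exp(coef) for "experimental" is the adjusted risk ratio
```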
OBJECTIVES: We set out to review the efficacy of Community Health Worker (CHW) interventions to improve glycemia in people with diabetes. METHODS: Data sources included the Cochrane Central Register of Controlled Trials, Medline, clinicaltrials.gov, Google Scholar, and reference lists of previous publications. We reviewed randomized controlled trials (RCTs) that assessed the efficacy of CHW interventions, as compared to usual care, in lowering hemoglobin A1c (A1c). Two investigators independently reviewed the RCTs and assessed their quality. Only RCTs with a follow-up of at least 12 months were meta-analyzed. A random-effects model was used to estimate, from unadjusted within-group mean reductions, the standardized mean difference (SMD) in A1c achieved by the CHW intervention beyond usual care. RESULTS: Thirteen RCTs were included in the narrative review, and nine of them, which had at least 12 months of follow-up, were included in the meta-analysis. Publication bias could not be ruled out due to the small number of trials. Outcome heterogeneity was moderate (I² = 37%). The SMD in A1c (95% confidence interval) was 0.21 (0.11-0.32). Meta-regression showed an association between higher baseline A1c and a larger effect size. CONCLUSIONS: CHW interventions showed a modest reduction in A1c compared with usual care. The A1c reduction was larger in studies with higher mean baseline A1c. Caution is warranted, given the small number of studies.
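The pooled SMD and I² above are standard outputs of a random-effects meta-analysis. As a rough sketch of one common estimator (DerSimonian-Laird) implemented from scratch, with invented per-trial SMDs and variances standing in for the nine meta-analyzed RCTs:

```python
# DerSimonian-Laird random-effects pooling of standardized mean differences.
# The per-trial values below are invented for illustration, not the review's data.
import numpy as np

y = np.array([0.10, 0.25, 0.30, 0.05, 0.40, 0.20, 0.15, 0.35, 0.22])  # trial SMDs
v = np.array([0.02, 0.03, 0.01, 0.04, 0.02, 0.03, 0.02, 0.05, 0.03])  # trial variances

w = 1.0 / v                                       # fixed-effect weights
mu_fixed = np.sum(w * y) / w.sum()
q = np.sum(w * (y - mu_fixed) ** 2)               # Cochran's Q
k = len(y)
tau2 = max(0.0, (q - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))  # DL tau^2
ws = 1.0 / (v + tau2)                             # random-effects weights
pooled = np.sum(ws * y) / ws.sum()
se = np.sqrt(1.0 / ws.sum())
i2 = max(0.0, (q - (k - 1)) / q) * 100.0          # heterogeneity I^2, in %
print(f"SMD = {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}), "
      f"I^2 = {i2:.0f}%")
```

DerSimonian-Laird is only one choice of estimator; with as few as nine trials, the between-study variance τ² is estimated imprecisely, which is consistent with the caution the abstract urges.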
Three systematic reviews were conducted on: (i) the history of mouthguard use in sports; (ii) mouthguard material and construction; and (iii) the effectiveness of mouthguards in preventing orofacial injuries and concussions. Retrieval databases and bibliographies were explored to find studies using specific key words for each topic. The first recorded use of mouthguards was by boxers, and in the 1920s professional boxing became the first sport to require mouthguards. Advocacy by the American Dental Association led to the mandating of mouthguards for US high school football in the 1962 season. Currently, the US National Collegiate Athletic Association requires mouthguards for four sports (ice hockey, lacrosse, field hockey and football). However, the American Dental Association recommends the use of mouthguards in 29 sports/exercise activities. Mouthguard properties measured in various studies included shock-absorbing capability, hardness, stiffness (indicative of protective capability), tensile strength, tear strength (indicative of durability) and water absorption. Materials used for mouthguards included: (i) polyvinylacetate-polyethylene or ethylene vinyl acetate (EVA) copolymer; (ii) polyvinylchloride; (iii) latex rubber; (iv) acrylic resin; and (v) polyurethane. Latex rubber was a popular material used in early mouthguards but it has lower shock absorbency, lower hardness and less tear and tensile strength than EVA or polyurethane. Among the more modern materials, none seems to stand out as superior to another since the characteristics of all the modern materials can be manipulated to provide a range of favourable characteristics. Impact studies have shown that compared with no mouthguard, mouthguards composed of many types of materials reduce the number of fractured teeth and head acceleration. In mouthguard design, consideration must be given to the nature of the collision (hard or soft objects) and characteristics of the mouth (e.g. brittle incisors, more rugged occlusal surfaces of molars, soft gingiva). Laminates with different shock absorbing and stress distributing (stiffness) capability may be one way to accommodate these factors. Studies comparing mouthguard users with nonusers have examined different sports, employed a variety of study designs and used widely varying injury case definitions. Prior to the 1980s, most studies exhibited relatively low methodological quality. Despite these issues, meta-analyses indicated that the risk of an orofacial sports injury was 1.6-1.9 times higher when a mouthguard was not worn. However, the evidence that mouthguards protect against concussion was inconsistent, and no conclusion regarding the effectiveness of mouthguards in preventing concussion can be drawn at present. Mouthguards should continue to be used in sport activities where there is significant risk of orofacial injury.
A 9-month deployment to Afghanistan negatively affected aerobic capacity, upper body power, and body composition. The predeployment-to-postdeployment changes were not large and are unlikely to present a major health or fitness concern. If deployments continue to be extended and the time between deployments decreases, the effects may be magnified, and further study would be warranted.
This article defines physical fitness and then reviews the literature on temporal trends in the physical fitness of new US Army recruits. Nineteen papers were found that met the review criteria and had published recruit fitness data from 1975 to 2003. The limited data on recruit muscle strength suggested an increase from 1978 to 1998 (20-year period). Data on push-ups and sit-ups suggested no change in muscular endurance between 1984 and 2003 (19-year period). Limited data suggested that maximal oxygen uptake (VO2max; mL/kg/min) of male recruits did not change from 1975 to 1998 (23-year period), while there was some indication of a small increase in female recruit VO2max in the same period. On the other hand, slower times on 1-mile (1.6 km) and 2-mile (3.2 km) runs indicate declines in aerobic performance from 1987 to 2003 (16-year period). The apparent discrepancy between the VO2max and endurance running data may indicate that recruits are not as proficient at applying their aerobic capability to performance tasks, such as timed runs, possibly because of factors such as increased bodyweight, reduced experience with running, lower motivation and/or environmental factors. Recruit height, weight and body mass index have progressively increased between 1978 and 2003 (25-year period). Both the body fat and fat-free mass of male recruits increased from 1978 to 1998 (20-year period); however, body composition data on female recruits did not show a consistent trend. In this same time period, the literature contained little data on youth physical activity but there was some suggestion that caloric consumption increased. This article indicates that temporal trends in recruit fitness differ depending on the fitness component measured. The very limited comparable data on civilian populations showed trends similar to the recruit data.
Recruits arriving for basic combat training (BCT) between October 1999 and May 2004 were administered an entry-level physical fitness test at the reception station. If they failed the test, then they entered the Fitness Assessment Program (FAP), where they physically trained until they passed the test and subsequently entered BCT. The effectiveness of the FAP was evaluated by examining fitness, injury, and training outcomes. Recruits who failed the test, trained in the FAP, and entered BCT after passing the test were designated the preconditioning (PC) group (64 men and 94 women). Recruits who failed the test but were allowed to enter BCT without going into the FAP were called the no preconditioning (NPC) group (32 men and 73 women). Recruits who passed the test and directly entered BCT were designated the no need of preconditioning (NNPC) group (1,078 men and 731 women). Army Physical Fitness Test (APFT) scores and training outcomes were obtained from a company-level database, and injured recruits were identified from cases documented in medical records. The proportions of NPC, PC, and NNPC recruits who completed the 9-week BCT cycle were 59%, 83%, and 87% for men (p < 0.01) and 52%, 69%, and 78% for women (p < 0.01), respectively. Because of attrition, only 63% of the NPC group took the week 7 APFT, compared with 84% and 86% of the PC and NNPC groups, respectively. The proportions of NPC, PC, and NNPC recruits who passed the final APFT after all retakes were 88%, 92%, and 98% for men (p < 0.01) and 89%, 92%, and 97% for women (p < 0.01), respectively. Compared with NNPC men, injury risk was 1.5 (95% confidence interval, 1.0-2.2) and 1.7 (95% confidence interval, 1.0-3.1) times higher for PC and NPC men, respectively. Compared with NNPC women, injury risk was 1.2 (95% confidence interval, 0.9-1.6) and 1.5 (95% confidence interval, 1.1-2.1) times higher for PC and NPC women, respectively. This program evaluation showed that low-fit recruits who preconditioned before BCT had reduced attrition and tended to have lower injury risk, compared with recruits of similar low fitness who did not precondition.
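The risk ratios and 95% confidence intervals quoted above follow the standard log-normal approximation for a ratio of two injury proportions. A minimal sketch of the unadjusted calculation, with hypothetical counts chosen only to illustrate the arithmetic (the paper's underlying counts are not reproduced here):

```python
# Unadjusted risk ratio for two groups with a 95% CI
# via the log-normal approximation. Counts are hypothetical.
import math

def risk_ratio(inj_a, n_a, inj_b, n_b):
    """Risk ratio of group A vs. group B with a 95% confidence interval."""
    rr = (inj_a / n_a) / (inj_b / n_b)
    se = math.sqrt(1 / inj_a - 1 / n_a + 1 / inj_b - 1 / n_b)  # SE of ln(RR)
    lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
    return rr, lo, hi

# e.g., hypothetical injured counts for PC men (of 64) vs. NNPC men (of 1,078)
rr, lo, hi = risk_ratio(24, 64, 270, 1078)
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```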
This paper reviews the rationale and evaluations of Physical Readiness Training (PRT), the new U.S. Army physical training doctrine designed to improve soldiers' physical capability for military operations. The purposes of PRT are to improve physical fitness, prevent injuries, progressively train soldiers, and develop soldiers' self-confidence and discipline. PRT follows the principles of progressive overload, regularity, specificity, precision, variety, and balance. Specificity was achieved by examining the standard list of military (warrior) tasks and determining 1) the physical requirements, 2) the fitness components involved, and 3) the training activities most likely to improve performance of those tasks. Injury-prevention features include reduced running mileage, exercise variety (cross-training), and gradual, progressive training. In 3 military field studies, the overall adjusted risk of injury was 1.5-1.8 times higher in groups of soldiers performing traditional military physical training programs than in groups using a PRT program. Scores on the Army Physical Fitness Test were similar or higher in groups using PRT programs. In an 8-week laboratory study comparing PRT with a weightlifting/running program, both programs resulted in major improvements in militarily relevant tasks (e.g., 3.2-km walk/run with a 32-kg load, 400-m run with an 18-kg load, 5- to 30-second rushes to and from the prone position, 80-kg casualty drag, obstacle course). Compared with traditional military physical training programs, PRT consistently resulted in fewer injuries and in equal or greater improvements in fitness and military task performance.
Electronic pedometers were used to quantify locomotor physical activity during an entire 9-week United States Army Basic Combat Training (BCT) cycle. Pedometers were worn on the hips of 4 trainees in each of 10 BCT companies during all BCT activities. Investigators obtained pedometer readings (steps) daily, and estimated travel distances were obtained by multiplying steps by the average individual step length. A short questionnaire was administered daily to ensure that trainees wore the pedometers and trained with their companies all day. Trainees performed an average ± SD of 16,311 ± 5,826 steps/day and traveled an estimated 11.7 ± 4.4 km/day. The highest daily locomotor activity occurred during the field training exercise, in which trainees took an average ± SD of 22,372 ± 12,517 steps/day, traveling an estimated 16.2 ± 9.7 km/day. Differences among the 10 companies ranged from 14,720 ± 6,649 steps/day to 18,729 ± 6,328 steps/day. This survey provided the first examination of locomotor physical activity during an entire BCT cycle.
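The distance estimates above are simple arithmetic: daily steps multiplied by the average individual step length. A tiny sketch, assuming a step length of roughly 0.72 m (the study used each trainee's own measured step length; this value is illustrative):

```python
# Steps-to-distance conversion as described in the survey.
step_length_m = 0.72        # assumed average step length (illustrative)
steps_per_day = 16_311      # reported mean daily steps over the full cycle
km_per_day = steps_per_day * step_length_m / 1000.0
print(f"{km_per_day:.1f} km/day")  # ~11.7 km/day, matching the reported mean
```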