Background
In this Position Statement, the International Society of Sports Nutrition (ISSN) provides an objective and critical review of the literature pertinent to nutritional considerations for training and racing in single-stage ultra-marathon.

Recommendations for Training
i) Ultra-marathon runners should aim to meet the caloric demands of training by following an individualized and periodized strategy, comprising a varied, food-first approach;
ii) Athletes should plan and implement their nutrition strategy with sufficient time to permit adaptations that enhance fat oxidative capacity;
iii) The evidence overwhelmingly supports the inclusion of a moderate-to-high carbohydrate diet (i.e., ~60% of energy intake, 5–8 g·kg−1·d−1) to mitigate the negative effects of chronic, training-induced glycogen depletion;
iv) Limiting carbohydrate intake before selected low-intensity sessions, and/or moderating daily carbohydrate intake, may enhance mitochondrial function and fat oxidative capacity; nevertheless, this approach may compromise performance during high-intensity efforts;
v) Protein intakes of ~1.6 g·kg−1·d−1 are necessary to maintain lean mass and support recovery from training, but amounts up to 2.5 g·kg−1·d−1 may be warranted during demanding training when calorie requirements are greater.

Recommendations for Racing
vi) To attenuate caloric deficits, runners should aim to consume 150–400 kcal·h−1 (carbohydrate, 30–50 g·h−1; protein, 5–10 g·h−1) from a variety of calorie-dense foods. Consideration must be given to food palatability, individual tolerance, and the increased preference for savory foods in longer races;
vii) Fluid volumes of 450–750 mL·h−1 (~150–250 mL every 20 min) are recommended during racing. To minimize the likelihood of hyponatraemia, electrolytes (mainly sodium) may be needed in concentrations greater than those provided by most commercial products (i.e., >575 mg·L−1 sodium). Fluid and electrolyte requirements will be elevated when running in hot and/or humid conditions;
viii) Evidence supports progressive gut-training and/or low-FODMAP (fermentable oligosaccharide, disaccharide, monosaccharide and polyol) diets to alleviate symptoms of gastrointestinal distress during racing;
ix) The evidence in support of ketogenic diets and/or ketone esters to improve ultra-marathon performance is lacking, and further research is warranted;
x) Evidence supports the strategic use of caffeine to sustain performance in the latter stages of racing, particularly when sleep deprivation may compromise athlete safety.
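As an illustration only (not part of the position statement), the per-hour racing targets above can be scaled into whole-race intake ranges; the function name and interface below are hypothetical:

```python
# Hypothetical fueling calculator built from the per-hour ranges quoted in the
# ISSN position statement (150-400 kcal, 30-50 g CHO, 5-10 g protein,
# 450-750 mL fluid per hour). Function name and structure are illustrative.
def race_fueling_plan(expected_hours: float) -> dict:
    """Return (low, high) whole-race totals for a given expected duration."""
    hourly_ranges = {
        "energy_kcal": (150, 400),
        "carbohydrate_g": (30, 50),
        "protein_g": (5, 10),
        "fluid_mL": (450, 750),
    }
    return {name: (lo * expected_hours, hi * expected_hours)
            for name, (lo, hi) in hourly_ranges.items()}

plan = race_fueling_plan(10)   # e.g. a 10-hour race
print(plan["energy_kcal"])     # (1500, 4000)
print(plan["fluid_mL"])        # (4500, 7500)
```

In practice these totals would be individualized further for heat, gut tolerance, and race intensity, as the statement itself stresses.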
Introduction
Until recently, women were excluded from British combat roles. Their risk for musculoskeletal injury during basic training is two to three times higher than that of men. To better understand the musculoskeletal injury risk of women in British Army infantry basic training, we compared injury incidence between (1) men in standard entry training and men in infantry training, to assess the risk of infantry training; and (2) men and women in both standard entry and officer basic training, to assess the risk in women compared with men.

Methods
The incidence of musculoskeletal injury was determined from defence medical records for all men entering infantry training, and for all men and women entering standard entry and officer training, between April 2015 and March 2016.

Results
7390 men (standard entry, n=4229; infantry, n=2683; officer, n=478) and 696 women (standard entry, n=626; officer, n=70) entered basic training. Men in infantry training had a lower incidence of musculoskeletal injury (391 vs 417 per 1000 personnel, OR 0.90 (95% CI 0.81 to 0.99), p=0.028) and a higher incidence of stress fracture (14 vs 5 per 1000 personnel, OR 2.80 (95% CI 1.64 to 4.80), p<0.001) than men in standard entry training. Women had a higher incidence of musculoskeletal injury than men in standard entry training (522 vs 417 per 1000 personnel, OR 1.53 (95% CI 1.29 to 1.81), p<0.001) and a higher incidence of stress fracture than men in officer training (114 vs 19 per 1000 personnel, OR 6.72 (95% CI 2.50 to 18.07), p<0.001).

Conclusion
Women in infantry training may be at similar risk for musculoskeletal injury, but at higher risk for stress fracture, compared with their non-infantry counterparts. Women in infantry training may be at higher risk for musculoskeletal injury and stress fracture compared with men in infantry training.
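The odds ratios reported above follow directly from the per-1000 incidence figures (odds = cases / non-cases in each group). A minimal sketch, with an illustrative function name:

```python
def odds_ratio(cases_a: float, cases_b: float, per: int = 1000) -> float:
    """Odds ratio for group A vs group B, given incidence per `per` personnel."""
    odds_a = cases_a / (per - cases_a)  # odds of injury in group A
    odds_b = cases_b / (per - cases_b)  # odds of injury in group B
    return odds_a / odds_b

# Women vs men, standard entry (522 vs 417 per 1000): reported OR 1.53
print(round(odds_ratio(522, 417), 2))  # 1.53
# Men in infantry vs standard entry (391 vs 417 per 1000): reported OR 0.90
print(round(odds_ratio(391, 417), 2))  # 0.9
```

Note the confidence intervals cannot be recovered this way; they require the underlying raw counts, not the rounded per-1000 rates.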
Background
British Army Phase One training exposes men and women to challenging daily distances of 13.5 km·d−1 and 11.8 km·d−1 and energy expenditures of ~4000 kcal·d−1 and ~3000 kcal·d−1, respectively. As such, it is essential that adequate nutrition is provided to support training demands. However, to date, there is a paucity of data on the habitual dietary intake of British Army recruits. The aims of this study were to: (i) compare the habitual dietary intake of British Army recruits undergoing Phase One training with Military Dietary Reference Values (MDRVs), and (ii) establish whether there was a relative sex difference in dietary intake between men and women.

Methods
Researcher-led weighed food records and food diaries were used to assess dietary intake in twenty-eight women (age: 21.4 ± 3.0 yrs, height: 163.7 ± 5.0 cm, body mass: 65.0 ± 6.7 kg) and seventeen men (age: 20.4 ± 2.3 yrs, height: 178.0 ± 7.9 cm, body mass: 74.6 ± 8.1 kg) at the Army Training Centre, Pirbright, for 8 days in week ten of training. Macro- and micronutrient content were estimated using dietary analysis software (Nutritics, Dublin), and independent-samples t-tests were used to establish whether there was a sex difference in daily energy, macro- or micronutrient intakes.

Results
Estimated daily energy intake was less than the MDRV for both men and women, with men consuming more energy than women (2846 ± 573 vs. 2207 ± 585 kcal·day−1, p < 0.001). Both sexes under-consumed carbohydrate (CHO) when data were expressed relative to body mass, with men consuming a greater amount than women (4.8 ± 1.3 vs. 3.8 ± 1.4 g·kg−1·day−1, p = 0.025, ES = 0.74). Both sexes also failed to meet MDRVs for protein intake, with men consuming more than women (1.5 ± 0.3 vs. 1.3 ± 0.3 g·kg−1·day−1, p = 0.030, ES = 0.67). There were no differences in dietary fat intake between men and women (1.5 ± 0.2 vs. 1.5 ± 0.5 g·kg−1·day−1, p = 0.483, ES = 0.00).

Conclusions
Daily energy intake in men and women in Phase One training does not meet MDRVs. Interventions to increase macronutrient intakes should be considered, along with research investigating the potential benefits of increasing different macronutrient intakes on training adaptations.
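The effect sizes quoted above are consistent with a pooled-SD Cohen's d computed from the reported group means and SDs (rounding of the published means shifts the last digit slightly). A sketch, assuming the standard independent-samples formula:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# CHO intake, men (n=17) vs women (n=28), g per kg per day.
# The study reports ES = 0.74; the rounded means reproduce it to within ~0.01.
d = cohens_d(4.8, 1.3, 17, 3.8, 1.4, 28)
print(round(d, 2))
```

The same function applied to the energy-intake means gives a large effect (d > 1), consistent with the highly significant sex difference reported.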
We evaluated the impact of protein supplementation on adaptations to arduous concurrent training in healthy adults, with potential applications to individuals undergoing military training. Peer-reviewed papers published in English that met the population, intervention, comparison and outcome criteria were included. Database searches were completed in PubMed, Web of Science and SPORTDiscus. Study quality was evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Of the 11 studies included, nine focused on performance, six on body composition and four on muscle recovery. Cohen's d effect sizes showed that protein supplementation improved performance outcomes in response to concurrent training (ES = 0.89, 95% CI = 0.08–1.70). When analysed separately, improvements in muscle strength (SMD = +4.92 kg, 95% CI = −2.70 to 12.54 kg) were found, but not in aerobic endurance. Gains in fat-free mass (SMD = +0.75 kg, 95% CI = 0.44–1.06 kg) and reductions in fat mass (SMD = −0.99 kg, 95% CI = −1.43 to 0.23 kg) were greater with protein supplementation. Most studies did not report protein turnover, nitrogen balance and/or total daily protein intake; therefore, further research is warranted. However, our findings suggest that protein supplementation may support lean-mass accretion and strength gains during arduous concurrent training in physically active populations, including military recruits.
We assessed dietary intake and nitrogen balance during 14 weeks of Basic Training (BT) in British Army Infantry recruits. Nineteen men (mean ± SD: age 19.9 ± 2.6 years, height 175.7 ± 6.5 cm, body mass 80.3 ± 10.1 kg) at the Infantry Training Centre, Catterick (ITC(C)) volunteered. Nutrient intakes and 24-h urinary nitrogen balance were assessed in weeks 2, 6 and 11 of BT. Nutrient intake was assessed using researcher-led weighed food records and food diaries, and Nutritics professional dietary software. Data were compared between weeks using a repeated-measures analysis of variance (ANOVA), with statistical significance set at p ≤ 0.05. There was a significant difference in protein intake between weeks 2 and 11 of BT (115 ± 18 vs. 91 ± 20 g, p = 0.02, ES = 1.26). There was no significant difference in mean absolute daily energy (p = 0.44), fat (p = 0.79) or carbohydrate (CHO) intake (p = 0.06) between weeks. Nitrogen balance was maintained in weeks 2, 6 and 11, but declined throughout BT (week 2: 4.6 ± 4.1 g; week 6: 1.6 ± 4.5 g; week 11: −0.2 ± 5.5 g; p = 0.07). A protein intake of 1.5 g·kg−1·d−1 may be sufficient in the early stages of BT, but higher intakes may be needed for some individuals later in BT.
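Nitrogen balance is conventionally derived by converting protein intake to nitrogen with the 6.25 g-protein-per-g-nitrogen factor and subtracting measured losses. A minimal sketch; the loss figure used below is back-calculated for illustration and is not reported in the study:

```python
def nitrogen_balance(protein_g_per_day: float, nitrogen_losses_g: float) -> float:
    """Nitrogen balance (g/day): dietary nitrogen in minus total nitrogen out.

    Dietary protein is assumed to be ~16% nitrogen, i.e. the conventional
    6.25 protein-to-nitrogen conversion factor.
    """
    nitrogen_in = protein_g_per_day / 6.25
    return nitrogen_in - nitrogen_losses_g

# Week 2 of BT: 115 g/day protein supplies 18.4 g nitrogen, so the reported
# mean balance of +4.6 g/day implies total losses of roughly 13.8 g/day
# (an illustrative back-calculation, not a measured value).
print(round(nitrogen_balance(115, 13.8), 1))  # 4.6
```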
Exertional heat stroke (EHS) is a life-threatening illness and an enduring problem among athletes, military servicemen and -women, and occupational labourers who regularly perform strenuous activity, often under hot and humid conditions or when wearing personal protective equipment. Risk factors for EHS and mitigation strategies have generally focused on the environment, health status, clothing, heat acclimatization and aerobic conditioning, but the potential role of nutrition is largely underexplored. Various nutritional and dietary strategies have shown beneficial effects on exercise performance and health and are widely used by athletes and other physically active populations. There is also evidence that some of these practices may dampen the pathophysiological features of EHS, suggesting possible protection or abatement of injury severity. Promising candidates include carbohydrate ingestion, appropriate fluid intake and glutamine supplementation. Conversely, some nutritional factors and low energy availability may facilitate the development of EHS, and individuals should be cognizant of these. Therefore, the aims of this review are to present an overview of EHS along with its mechanisms and pathophysiology, and to discuss how selected nutritional considerations may influence EHS risk, focusing on their impact.
Introduction
It is important to collate the literature that has assessed dietary intake within military settings to establish which methods are commonly used and which are valid, so that accurate nutrition recommendations can be made. This scoping review aims to identify which methods are typically used to assess dietary intake in military settings and which of these have been validated. This review also aims to provide a recommendation as to which method(s) should be used in military settings.

Methods
This scoping review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews. Searches were conducted in PubMed, Web of Science and SPORTDiscus, with the most recent search executed on 12th June 2020. Eligible studies had to report original data, assess and quantify dietary intake, and have been published in peer-reviewed academic journals. The reporting bias was calculated for each study where possible.

Results
Twenty-eight studies used a single method to assess dietary intake and seven studies used a combination of methods. The most commonly used methods were the gold-standard food intake/waste method, Food Frequency Questionnaire (FFQ) or a food diary (FD). The only method to date that has been validated in military settings is weighed food records (WFR).

Conclusions
The food intake/waste method or WFR should be used where feasible. Where this is not practical, the FFQ or FD should be considered, with control measures applied. There is currently insufficient evidence to state that using multiple methods together improves validity.
Cardiorespiratory fitness is a key component of health-related fitness. It is a necessary focus of improvement, especially for those who have poor fitness and are classed as untrained. However, much research has shown that individuals respond differentially to identical training programs, suggesting the involvement of a genetic component in individual exercise responses. Previous research has focused predominantly on a relatively low number of candidate genes and their overall influence on exercise responsiveness. However, examination of gene-specific alleles may provide a greater level of understanding. Accordingly, this study aimed to investigate the associations between cardiorespiratory fitness and an individual's genotype following a field-based endurance program within a previously untrained population. Participants (age: 29 ± 7 years, height: 175 ± 9 cm, mass: 79 ± 21 kg, body mass index: 26 ± 7 kg/m²) were randomly assigned to either a training (n = 21) or control group (n = 24). The training group completed a periodized running program for 8 weeks (duration: 20–30 min per session; intensity: 6–7 on the Borg Category-Ratio-10 scale; frequency: 3 sessions per week). Both groups completed a Cooper 12-minute run test to estimate cardiorespiratory fitness at baseline, mid-study, and post-study. One thousand single nucleotide polymorphisms (SNPs) were assessed via saliva samples. Cooper run distance showed a significant improvement in the training group (0.23 ± 0.17 km [11.51 ± 9.09%], p < 0.001, ES = 0.48 [95% CI: 0.16–0.32]) following the 8-week program, whilst controls displayed no significant change (0.03 ± 0.15 km [1.55 ± 6.98%], p = 0.346, ES = 0.08 [95% CI: −0.35–0.95]). A significant portion of the inter-individual variation in Cooper scores could be explained by the number of positive alleles a participant possessed (r = 0.92, R² = 0.85, p < 0.001).
These findings demonstrate the relative influence of key allele variants on an individual’s responsiveness to endurance training.
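Cooper 12-minute run distance is commonly converted to an estimated VO2max using Cooper's regression, (distance in metres − 504.9) / 44.73. A sketch showing what the mean 0.23 km training-group improvement implies (the example distances are illustrative, not taken from the study):

```python
def cooper_vo2max(distance_km: float) -> float:
    """Estimated VO2max (mL·kg−1·min−1) from 12-minute run distance,
    using Cooper's (1968) regression: (distance_m − 504.9) / 44.73."""
    return (distance_km * 1000 - 504.9) / 44.73

# Because the regression is linear, a 0.23 km gain maps to the same VO2max
# change regardless of starting distance: 230 / 44.73 ≈ 5.1 mL·kg−1·min−1.
gain = cooper_vo2max(2.40) - cooper_vo2max(2.17)
print(round(gain, 1))  # 5.1
```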