Functional performance of lower limb muscles and contractile properties of chemically skinned single muscle fibers were evaluated before and after 8 wk of maximal effort stretch-shortening cycle (SSC) exercise training. Muscle biopsies were obtained from the vastus lateralis of eight men before and after the training period. Fibers were evaluated for their mechanical properties and subsequently classified according to their myosin heavy chain content (SDS-PAGE). After training, maximal leg extensor muscle force and vertical jump performance improved by 12% (P<0.01) and 13% (P<0.001), respectively. Single-fiber cross-sectional area increased 23% in type I (P<0.01), 22% in type IIa (P<0.001), and 30% in type IIa/IIx fibers (P<0.001). Peak force increased 19% in type I (P<0.01), 15% in type IIa (P<0.001), and 16% in type IIa/IIx fibers (P<0.001). When peak force was normalized to cross-sectional area, no changes were found for any fiber type. Maximal shortening velocity increased by 18%, 29%, and 22% in type I, IIa, and hybrid IIa/IIx fibers, respectively (P<0.001). Peak power was enhanced in all fiber types, and normalized peak power improved 9% in type IIa fibers (P<0.05). Fiber tension on passive stretch increased in IIa/IIx fibers only (P<0.05). In conclusion, short-term SSC exercise training enhanced single-fiber contractile performance by increasing both force and contraction velocity in type I, IIa, and IIa/IIx fibers. These results suggest that SSC exercises are an effective training approach to improve fiber force, contraction velocity, and therefore power.
In elderly populations, frailty is associated with higher mortality risk. Although many frailty scores (FS) have been proposed, no single score is considered the gold standard. We aimed to evaluate the agreement between a wide range of FS in the English Longitudinal Study of Ageing (ELSA). Through a literature search, we identified 35 FS that could be calculated in ELSA wave 2 (2004–2005). We examined agreement between each FS and the mean of all 35 FS, using a modified Bland-Altman model and Cohen's kappa (κ). Missing data were imputed. Data from 5,377 participants (aged ≥60 years) were analyzed (44.7% men, 55.3% women). FS showed widely differing degrees of agreement with the mean of all scores and between each pair of scores. Frailty classification also showed a very wide range of agreement (Cohen's κ = 0.10–0.83). Agreement was highest among “accumulation of deficits”-type FS, while accuracy was highest for multidimensional FS. There is marked heterogeneity in the degree to which various FS estimate frailty and in the identification of particular individuals as frail. Different FS are based on different concepts of frailty, and most pairs cannot be assumed to be interchangeable. Research results based on different FS cannot be compared or pooled.
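Cohen's kappa, the agreement statistic used in the study above, corrects observed agreement for agreement expected by chance from each rater's marginal frequencies. A minimal sketch (the participant classifications below are invented for illustration, not the ELSA data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is chance agreement from the marginal frequencies."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # Observed proportion of cases on which the two classifications agree
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of marginal proportions, summed over labels
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(count_a[lab] * count_b[lab] for lab in labels) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical frailty scores classifying 10 participants (1 = frail)
fs1 = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
fs2 = [1, 0, 0, 0, 1, 0, 0, 1, 1, 0]
print(round(cohens_kappa(fs1, fs2), 2))  # moderate agreement despite 8/10 raw matches
```

Note how two scores that agree on 8 of 10 participants can still yield only moderate kappa, which is exactly why the reported range of κ = 0.10–0.83 signals substantial disagreement between frailty instruments.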
Midsole hardness of modern cushioned running shoes does not seem to influence running-related injury (RRI) risk.
The effect of a runner's training load on running-related injury is influenced by body mass index and previous injury. These results underscore the importance of distinguishing between confounding and effect-measure modification in running-related injury research.
Background/aim: This randomised controlled trial investigated whether running shoes with a motion control system modify injury risk in regular leisure-time runners compared with standard shoes, and whether this influence depends on foot morphology. Methods: Recreational runners (n=372) were given either the motion control or the standard version of a regular running shoe model and were followed up for 6 months regarding running activity and injury. Foot morphology was analysed using the Foot Posture Index method. Cox regression analyses were used to compare injury risk between the two groups, based on HRs and their 95% CIs, controlling for potential confounders. Stratified analyses were conducted to evaluate the effect of the motion control system in runners with supinated, neutral and pronated feet. Results: The overall injury risk was lower among the participants who had received motion control shoes (HR=0.55; 95% CI 0.36 to 0.85) compared with those receiving standard shoes. This positive effect was only observed in the stratum of runners with pronated feet (n=94; HR=0.34; 95% CI 0.13 to 0.84); there was no difference in runners with neutral (n=218; HR=0.78; 95% CI 0.44 to 1.37) or supinated feet (n=60; HR=0.59; 95% CI 0.20 to 1.73). Runners with pronated feet using standard shoes had a higher injury risk compared with those with neutral feet (HR=1.80; 95% CI 1.01 to 3.22). Conclusions: The overall injury risk was lower in participants who had received motion control shoes. Based on secondary analysis, those with pronated feet may benefit most from this shoe type.
The etiological mechanism underpinning any sports-related injury is complex and multifactorial. Frequently, athletes perceive "excessive training" as the principal factor in their injury, an observation that is biologically plausible yet somewhat ambiguous. If the applied training load is suddenly increased, this may increase the risk of sports injury development, irrespective of the absolute amount of training. Indeed, little to no rigorous scientific evidence exists to support the hypothesis that fluctuations in training load, compared with absolute training load, are more important in explaining sports injury development. One reason for this could be that prospective data from scientific studies should be analyzed in a different manner. Time-to-event analysis is a useful statistical tool with which to analyze the influence of changing exposures on injury risk. However, the potential of time-to-event analysis remains insufficiently exploited in sports injury research. Therefore, the purpose of the present article was to present and discuss measures of association used in time-to-event analyses and to present the advanced concept of time-varying exposures and outcomes. In the paper, different measures of association, such as cumulative relative risk, cumulative risk difference, and the classical hazard rate ratio, are presented in a nontechnical manner, and suggestions for interpretation of study results are provided. To summarize, time-to-event analysis complements the statistical arsenal of sports injury prevention researchers, because it enables them to analyze the complex and highly dynamic reality of injury etiology, injury recurrence, and time to recovery across a range of sporting contexts.
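The three measures of association named above can be illustrated on toy follow-up data. A minimal sketch, assuming a simple cohort with no censoring before injury or end of follow-up (the runner data and group labels are invented for illustration):

```python
# Toy follow-up data: (weeks observed, injured?) per runner in two groups,
# e.g. runners with sudden load spikes vs. runners with steady load.
high_spike = [(4, True), (10, True), (26, False), (8, True), (26, False)]
steady     = [(26, False), (12, True), (26, False), (26, False), (26, False)]

def cumulative_risk(group):
    """Cumulative injury risk over follow-up: injured runners / all runners."""
    return sum(injured for _, injured in group) / len(group)

def hazard_rate(group):
    """Incidence rate: injuries per person-week actually at risk."""
    events = sum(injured for _, injured in group)
    person_time = sum(weeks for weeks, _ in group)
    return events / person_time

rr  = cumulative_risk(high_spike) / cumulative_risk(steady)   # cumulative relative risk
rd  = cumulative_risk(high_spike) - cumulative_risk(steady)   # cumulative risk difference
hrr = hazard_rate(high_spike) / hazard_rate(steady)           # (constant-)hazard rate ratio
print(rr, rd, hrr)
```

The hazard rate ratio differs from the cumulative relative risk because it weights each injury by the time a runner was actually exposed, which is what makes time-to-event analysis suitable for time-varying training loads.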
Background: Shoe cushioning is expected to protect runners against repetitive loading of the musculoskeletal system and therefore running-related injuries. Also, it is a common belief that heavier runners should use footwear with increased shock absorption properties to prevent injuries. Purpose: The aim of this study was to determine if shoe cushioning influences the injury risk in recreational runners and whether the association depends on the runner’s body mass. Study Design: Randomized controlled trial; Level of evidence, 1. Methods: Healthy runners (n = 848) randomly received 1 of 2 shoe prototypes that only differed in their cushioning properties. Global stiffness was 61.3 ± 2.7 and 94.9 ± 5.9 N/mm in the soft and hard versions, respectively. Participants were classified as light or heavy according to their body mass using the median as a cut-off (78.2 and 62.8 kg in male and female runners, respectively). They were followed over 6 months regarding running activity and injury (any physical complaint reducing/interrupting running activity for at least 7 days). Data were analyzed through time-to-event models with the subhazard rate ratio (SHR) and their 95% confidence interval (CI) as measures of association. A stratified analysis was conducted to investigate the effect of shoe cushioning on the injury risk in lighter and heavier runners. Results: The runners who had received the hard shoes had a higher injury risk (SHR, 1.52 [95% CI, 1.07-2.16]), while body mass was not associated with the injury risk (SHR, 1.00 [95% CI, 0.99-1.01]). However, after stratification according to body mass, results showed that lighter runners had a higher injury risk in hard shoes (SHR, 1.80 [95% CI, 1.09-2.98]) while heavier runners did not (SHR, 1.23 [95% CI, 0.75-2.03]). Conclusion: The injury risk was higher in participants running in the hard shoes compared with those using the soft shoes. However, the relative protective effect of greater shoe cushioning was found only in lighter runners. Registration: NCT03115437 (ClinicalTrials.gov identifier)
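The stratified analysis above is an example of effect-measure modification: the crude association hides effects that differ between strata. A minimal sketch of the idea using simple cumulative risks rather than subhazard models (all counts are invented for illustration; they are not the trial's data):

```python
# Hypothetical 6-month injury counts by body-mass stratum and shoe version:
# (injured, total) per cell. Numbers are made up to mimic the reported pattern.
strata = {
    "light": {"hard": (30, 100), "soft": (17, 100)},
    "heavy": {"hard": (25, 100), "soft": (21, 100)},
}

def risk(injured, total):
    """Cumulative injury risk in one cell."""
    return injured / total

# Crude risk ratio pools both strata and can mask stratum-specific effects
crude_hard = sum(s["hard"][0] for s in strata.values()) / sum(s["hard"][1] for s in strata.values())
crude_soft = sum(s["soft"][0] for s in strata.values()) / sum(s["soft"][1] for s in strata.values())
print(f"crude RR: {crude_hard / crude_soft:.2f}")

# Stratum-specific risk ratios reveal modification of the effect by body mass
for name, s in strata.items():
    stratum_rr = risk(*s["hard"]) / risk(*s["soft"])
    print(f"{name}: RR = {stratum_rr:.2f}")
```

With these invented counts, the crude ratio sits between the two stratum-specific ratios; reporting only the crude estimate would obscure that the harm of hard shoes is concentrated in lighter runners, which is the distinction from confounding stressed earlier in this collection.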