Background: Adequate nutrient intake is important to support training and to optimise the performance of elite athletes. Nutritional knowledge has been shown to play an important role in adopting optimal nutrition practices. The aim of the present study was to investigate the relationship between the level of nutritional knowledge and dietary habits in elite English rugby league players using the eatwell plate food categories.

Method: General nutritional knowledge questionnaires were collected during the Super League competitive season from the first-team squad of 21 professional rugby league players (mean age 25 ± 5 yrs, BMI 27 ± 2.4 kg/m2, experience in the game 6 ± 4 yrs). According to their nutritional knowledge scores, the players were assigned to either a good or a poor nutritional knowledge group (n = 11 and n = 10, respectively). Their dietary habits were assessed using a food frequency questionnaire.

Results: The findings revealed that nutritional knowledge was adequate (mean 72.82%) in this group of athletes, with the highest scores in the dietary advice section (85.71%), followed by food groups (71.24%) and food choice (69.52%). The majority of athletes were not aware of current carbohydrate recommendations. This translated into their dietary habits, as many starchy and fibrous foods were consumed only occasionally by the poor nutritional knowledge group. In terms of eating habits, the good nutritional knowledge group consumed significantly more fruit and vegetables, and starchy foods (p < .05). Nutritional knowledge was positively correlated with fruit and vegetable consumption (rs = .52, p < .05) but not with any other eatwell plate category.

Conclusions: The study identified adequate general nutritional knowledge in professional rugby league players, with the exception of recommendations for starchy and fibrous foods. Players who scored higher on the nutritional knowledge test were more likely to consume more fruit, vegetables and carbohydrate-rich foods.
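The Spearman rank correlation reported above (rs = .52 between knowledge score and fruit and vegetable consumption) can be sketched as follows. The data are invented for illustration, not the study's; only the method (rank both variables with tie-averaged ranks, then take the Pearson correlation of the ranks) matches the abstract.

```python
def average_ranks(values):
    """Rank values from 1..n, giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rs: Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative values: knowledge test score (%) vs daily fruit/veg portions
knowledge_pct = [55, 60, 62, 68, 70, 73, 75, 78, 82, 85, 90]
fruit_veg = [2, 3, 2, 4, 3, 5, 4, 5, 6, 5, 7]
print(f"rs = {spearman(knowledge_pct, fruit_veg):.2f}")
```

In practice a library routine such as `scipy.stats.spearmanr` would be used, which also returns the p-value the abstract reports alongside rs.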
Iron is a functional component of oxygen transport and energy production in humans and is therefore a critically important micronutrient for sport and exercise performance. Athletes, particularly female athletes participating in endurance sport, are at increased risk of compromised iron status due to heightened iron losses through menstruation and the exercise-induced mechanisms associated with endurance activity. Conventionally, oral iron supplementation is used in the prevention and/or treatment of iron deficiency. However, this approach has been criticised because of the side effects and the increased risk of iron toxicity associated with the use of supplements. Thus, more recently there has been growing interest in using dietary modification, rather than supplements, to improve the iron status of athletes. Dietary iron treatment methods include the prescription of an iron-rich and/or haem-iron-based diet, dietary counselling, and the inclusion of novel iron-rich products in the daily diet. Although studies using dietary modification are still scarce, the current literature suggests that dietary iron interventions can assist in maintaining iron status in female athletes, especially during intensive training and competition. Future research should focus on the most efficient method(s) of dietary modification for improving iron status and on whether these approaches can have a favourable impact on sport and exercise performance.
Ulcerative colitis (UC) is an inflammatory bowel disease that causes gastrointestinal lesions, bleeding, diarrhoea and nutritional complications. Insufficient nutrient intake can further deteriorate nutritional status. The present cross-sectional study aimed to determine whether UC patients adhere to national dietary guidelines and to assess their dietary habits. An online questionnaire (n 93) was used to assess health-related conditions, current nutritional knowledge, professional dietary guidance and food avoidance. A 24 h dietary recall (n 81) was used to assess nutrient intakes, which were then compared with the national recommended intake values. The results showed that the nutritional knowledge of participants was limited, with unofficial sources, including websites, being used. Numerous food groups, predominantly fibre-rich foods and fruit and vegetables, were largely avoided by the participants. Almost half of the study population eliminated foods such as dairy products to alleviate symptoms, possibly unnecessarily. Energy intakes were significantly (P < 0·05) lower than the national recommended intake values in women aged 18-65 years and men aged 18-60 years. Fat intake exceeded the national recommended intake values (P < 0·0001), at the expense of carbohydrate and fibre intakes, which were significantly (P < 0·005) lower than the national recommended intake values. Protein intake was significantly higher in women aged 19-50 years (P < 0·00) and men aged 19-50 years (P < 0·005). Vitamin C, vitamin B12 and Ca intake levels were exceeded by all participants (P < 0·001), while women aged 19-50 years did not achieve their dietary Fe reference nutrient intake (P < 0·001). Osteopenia, osteoporosis and anaemia were reported by 12, 6 and 31 % of the participants, respectively. The findings indicate that food avoidance may contribute to nutrient deficiencies in UC patients. Low intakes of these food groups, especially during remission, prevent patients from adhering to dietary guidelines.
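The intake-versus-recommended-value comparisons above are one-sample tests: each group's recorded intakes are tested against a fixed national reference. A minimal sketch of the underlying t statistic is below; the intakes and the 2000 kcal reference are invented for illustration, not the study's data or the actual UK reference value.

```python
import math
import statistics

def one_sample_t(sample, mu):
    """t statistic for H0: population mean == mu (df = n - 1)."""
    n = len(sample)
    return (statistics.mean(sample) - mu) / (statistics.stdev(sample) / math.sqrt(n))

# Illustrative daily energy intakes (kcal) tested against an assumed
# reference of 2000 kcal/day
energy_kcal = [1650, 1720, 1500, 1810, 1690, 1580, 1750, 1630]
t = one_sample_t(energy_kcal, 2000)
print(f"t = {t:.2f} (df = {len(energy_kcal) - 1})")
```

A strongly negative t (compared against the t distribution with n - 1 degrees of freedom) corresponds to the "significantly lower than the recommended value" findings reported in the abstract.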
Background: Adequate nutrient intake is critically important for achieving optimal sports performance. Like all athletes, female runners require a nutritionally balanced diet to maintain daily activities and a successful training regime. This study investigated the effects of a cereal-product-based dietary iron intervention on the iron status of recreational female runners (n = 11; 32 ± 7 yr; 239 ± 153 minutes exercise/week, of which 161 ± 150 minutes running activity/week; VO2max 38 ± 4 ml/kg/min).

Methods: Participants completed a 6-week dietary intervention study. They were asked to replace their usual bread with iron-rich teff bread as part of their daily diet. During this period, their dietary habits were assessed by multiple-pass 24-hr recalls; iron status was determined by venous blood analysis for serum transferrin, serum transferrin receptor, serum ferritin, total iron-binding capacity and transferrin receptor/ferritin log index.

Results: Pre-intervention, the cohort of 11 female runners reported an inadequate daily dietary iron intake of 10.7 ± 2.7 mg/day, which was associated with overall compromised iron status. Over a third of all participants showed depleted bodily iron stores (serum ferritin <12 μg/L). Pre-intervention macronutrient assessment revealed adequate energy, protein and fibre intakes, whilst total fat and saturated fat intakes were above the recommendations at the expense of carbohydrate intake. The 6-week dietary intervention resulted in significantly higher total iron intakes (18.5 mg/day, P < 0.05) and improved tissue iron supply, but not enlarged iron stores. Improvements in haematological indices were associated with compromised baseline iron status, a prolonged intervention period and an increase in dietary iron intake.

Conclusion: Dietary iron interventions using a staple cereal product offer an alternative way of improving dietary iron intake and favourably affecting overall iron status in physically active females.
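The "transferrin receptor/ferritin log index" named in the Methods is conventionally computed as soluble transferrin receptor divided by the log10 of serum ferritin. The sketch below assumes the common units (sTfR in mg/L, ferritin in μg/L); cut-offs for interpreting the index vary by assay, and the example values are illustrative, not participant data.

```python
import math

def stfr_log_ferritin_index(stfr_mg_l, ferritin_ug_l):
    """sTfR/log ferritin index: higher values suggest tissue iron deficit."""
    return stfr_mg_l / math.log10(ferritin_ug_l)

# Example: sTfR 4.2 mg/L with ferritin 10 ug/L, i.e. below the abstract's
# <12 ug/L criterion for depleted iron stores
print(f"index = {stfr_log_ferritin_index(4.2, 10):.2f}")
```

Because ferritin enters through a logarithm, the index is much more sensitive to the sTfR numerator than to moderate ferritin changes, which is one reason it is used alongside (rather than instead of) ferritin itself.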
Adequate nutritional status can be obtained by following a balanced diet. Hence, the use of dietary supplements in populations not at risk of nutrient deficiencies is not recommended(1). Nonetheless, the dietary supplement industry is ever expanding and is estimated to be worth around £385 million in the UK(2). There is growing concern that the increase in supplementation of the diet directly increases the incidence of improper use and accidental overdose of these nutrients(3). The aim of this study was to determine whether the use of supplements provides nutritional adequacy in young healthy adults and to explore the reasons for taking supplements.

After obtaining ethical approval, participants (n 238, M 104, F 134) aged 18-25 from universities in the North West region were recruited via convenience and snowball sampling. Frequency, quantity and reasons for supplement consumption were assessed using a validated questionnaire(3). Nutrient intake was assessed by a validated 3-day food diary and analysed using the dietary assessment software Microdiet. Normal distribution was investigated using the Shapiro-Wilk test of normality. Statistical analyses comparing supplement users with non-users were conducted using SPSS 22: questionnaires were analysed using the chi-squared test and food diaries using an independent t-test. Statistical significance was set at 0·05.

Findings revealed that 48 % of participants reported current use of dietary supplements. This was significantly higher in males than in females (65 % v. 35 %, respectively, P < 0·05). The main reasons for supplement use were the safety of supplements, followed by provision of more energy, improved training, endurance and health outcomes. Supplement users had significantly higher intakes of vitamin C, magnesium and zinc. Significantly more energy came from carbohydrates in non-users, whilst significantly more energy derived from protein in supplement users.

In conclusion, similarly to previous research(4), individuals who consume dietary supplements appear to achieve higher intakes of micronutrients. However, the proportions of energy derived from macronutrients were better balanced in non-users.
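The chi-squared comparison of supplement use by sex can be sketched on a 2x2 table. The counts below are reconstructed approximately from the abstract's percentages (n 238 with 104 males; 48 % users, 65 % of them male), so treat the table as illustrative rather than the study's exact data.

```python
def chi_squared_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table (1 df)."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

#                users  non-users
observed = [[74, 30],   # male   (approximate reconstruction)
            [40, 94]]   # female (approximate reconstruction)
stat = chi_squared_2x2(observed)
print(f"chi2 = {stat:.1f}")  # with 1 df, a statistic > 3.84 gives P < 0.05
```

The continuous outcomes (micronutrient intakes, % energy from macronutrients) were instead compared between the two groups with an independent t-test, as stated in the methods.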
Previous studies have indicated that university students are often exposed to stress, lack of time and financial constraints, adversely influencing nutrient intake and status(1). Excessive calorie intake, high dietary fat intake, and alcohol and fast food consumption are commonly seen in this population, compromising an optimal nutritional status(2). The purpose of this study was to investigate diet quality and the adequacy of energy and macronutrient intake in university students by comparing their current nutrient intake with Dietary Reference Values (DRVs).

After obtaining ethical approval, participants (n 234) aged 18-24 from North West universities were recruited into a cross-sectional study. Laboratory and anthropometric measures of nutritional status, such as glucose and lipid profiles, weight, height, percentage body fat and BMI, were assessed. Energy and macronutrient intake was measured using a validated 3-day diet diary and analysed using the dietary assessment software Microdiet. The percentage contribution of nutrients to total energy was compared with the DRVs(3-5). Statistical analysis, including the one-sample t-test, chi-squared test and independent-samples t-test, was conducted using SPSS 22, and statistical significance was set at 0·05.

The average daily macronutrient intake was often higher in males than in females; however, when macronutrient intakes were expressed in relation to energy intake and compared with the DRVs, females had significantly higher intakes of fat, saturated fatty acids (SFA), polyunsaturated fatty acids (PUFA), monounsaturated fatty acids (MUFA), free sugar and protein, while total carbohydrate intake was lower than the DRVs. For males, total fat intake was not statistically above the DRVs; however, consumption of SFA, PUFA, MUFA, free sugar and protein was higher than the DRVs, while carbohydrate intake was also lower than the DRVs. When energy requirements were calculated according to the latest DRVs for energy for all individuals(5), 76·8 % of females and 78·0 % of males met their daily energy requirement, and there was no variation between the sexes in meeting the requirements (P > 0·05).

Although the higher contribution of protein to the total energy intake of male participants can explain part of the gender differences, the excessive intake of free sugar, total fat and SFA, especially amongst females, warrants further investigation. In contrast to previous studies demonstrating that females often consume a better-quality diet(1), this study has highlighted that this is not always the case.
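The "percentage contribution of nutrients to total energy" used throughout this abstract is computed from gram intakes via energy conversion factors. The sketch below assumes the common Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat; UK food labelling uses the equivalent kJ factors), and the daily intakes are invented for illustration, not study data.

```python
ATWATER_KCAL_PER_G = {"protein": 4, "carbohydrate": 4, "fat": 9}

def percent_energy(intake_g):
    """Share of total energy contributed by each macronutrient."""
    kcal = {k: g * ATWATER_KCAL_PER_G[k] for k, g in intake_g.items()}
    total = sum(kcal.values())
    return {k: 100 * v / total for k, v in kcal.items()}

# One illustrative recorded day (grams)
day = {"protein": 90, "carbohydrate": 250, "fat": 80}
for nutrient, pct in percent_energy(day).items():
    print(f"{nutrient}: {pct:.1f}% of energy")
```

Each participant's percentages would then be compared against the DRV targets (e.g. the UK targets of roughly 50 % of energy from carbohydrate and no more than about 35 % from fat) with the one-sample t-test described in the methods.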
Bread has been a popular staple food for many years; in the UK the average individual consumes 677 g of bread per week(1). Recent investigations show downward trends in purchases of white bread, yet an increase in wholegrain and ethnic breads(1). This has driven the development of baked products incorporating less-utilised and more-nutritious grains. Teff (Eragrostis tef) is a small-grained cereal originating in Ethiopia and is favoured for its nutritional profile(2). When compared with other grains, such as wheat, barley and sorghum, teff is a rich source of fibre, protein, Fe, Ca and Zn(2-4). Teff is particularly abundant in Fe; up to 150 mg Fe/100 g is present in teff seeds(5). Furthermore, the fermentation process used during bread making promotes phytate breakdown, further increasing the bioavailability of Fe(6). In the UK the average adult female consumes 8.8 mg Fe/d, notably lower than the reference nutrient intake of 14.8 mg/d(7,8). Although anaemia is not a major public health problem in the UK, 9 % of women of reproductive age have Hb levels < 120 g/l, the diagnostic criterion for clinical anaemia(9). In contrast, the prevalence of Fe-deficiency anaemia in Ethiopia is lower, which is attributed to regular teff consumption(10). The aims of the present study were therefore to incorporate teff grain into bread; to determine the Fe content, texture qualities, sensory attributes and cost of teff breads; and to compare them with a control wheat bread.

Teff flour was incorporated into breads at levels of 10, 20 and 30 % (w/w). The Fe content of the teff and wheat flours and of all breads was determined using the dipyridyl method(11). Texture properties were evaluated using a texture analyser crumb compression test. Sensory analysis was conducted by a taste panel comprising fifty subjects.

The results clearly showed that teff flour contained significantly more Fe than wheat flour (7.64 mg/100 g v. 2.54 mg/100 g; P < 0.001). Consequently, the Fe level in teff breads was significantly higher (3.13-6.01 mg/100 g) than in the control bread (2.44 mg/100 g; P < 0.01). Texture properties (specific loaf volume, crumb firmness, shelf life and cellular structure) showed no significant differences between the control, 10 and 20 % (w/w) teff breads; only the 30 % (w/w) teff bread showed a significant decrease in these quality variables. Sensory evaluation showed that the 20 and 30 % (w/w) teff breads were less acceptable than the control and 10 % (w/w) teff breads, which was significantly correlated with bitter aftertaste and flavour (r = -0.62, P < 0.01). Cost analysis showed that, for a standard 400 g loaf, the 10, 20 and 30 % (w/w) teff breads cost £0.08, £0.17 and £0.25 more, respectively, than the equivalent white wheat bread.

In conclusion, this research has identified that teff flour and breads are a rich source of dietary Fe. Thus, incorporating teff bread into the daily diet may be one way to improve the Fe status of women living in the UK.
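A first-order sanity check on the reported Fe levels: the Fe content of a flour blend can be approximated as the mass-weighted average of the two flours, using the measured values from the abstract (teff 7.64, wheat 2.54 mg Fe/100 g). This is only a flour-basis estimate; the breads' measured values also reflect the other recipe ingredients and moisture loss during baking.

```python
TEFF_FE, WHEAT_FE = 7.64, 2.54  # measured mg Fe per 100 g flour

def blend_fe(teff_fraction):
    """Predicted mg Fe/100 g for a teff/wheat flour blend (flour basis)."""
    return teff_fraction * TEFF_FE + (1 - teff_fraction) * WHEAT_FE

for pct in (10, 20, 30):
    print(f"{pct}% (w/w) teff: ~{blend_fe(pct / 100):.2f} mg Fe/100 g flour")
```

The 10 % (w/w) estimate (about 3.05 mg/100 g) sits close to the lower end of the measured bread range (3.13 mg/100 g), as expected from simple substitution.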