The introduction of cover crops as a fallow replacement in the traditional cereal-based cropping system of the Northern Great Plains has the potential to decrease soil erosion, increase water infiltration, reduce weed pressure and improve soil health. However, there are concerns that this might come at the cost of reduced production in the subsequent wheat crop due to soil water use by the cover crops. To determine this risk, a phased 2-year rotation of 15 different cover crop mixtures and winter wheat/spring wheat was established at the Northern Agricultural Research Center near Havre, MT from 2012 to 2020, or four rotation cycles. Controls included fallow–wheat and barley–wheat sequences. Cover crops and barley were terminated in early July by haying, grazing or herbicide application. Yields were significantly decreased in wheat following cover crops in 3 out of 8 years, up to a maximum of 1.4 t ha⁻¹ (or 60%) for winter wheat following cool-season cover crop mixtures. However, cover crops also unexpectedly increased subsequent wheat yields in 2018, possibly due in part to residual fertilizer. Within cool-, mid- and warm-season cover crop groups, individual mixtures did not differ significantly in their impact on subsequent grain yields. Similarly, cover crop termination methods had no impact on spring or winter wheat grain yields in any of the 8 years considered. Wheat grain protein concentration was not affected by cover crop mixtures or termination treatments but was decreased in winter wheat following barley. Differences in soil water content across cover crop groups were only evident at the beginning of the third cycle in one field, but important reductions were observed below 15 cm in the last rotation cycle. In-season rainfall explained 43 and 13% of the variability in winter and spring wheat yields, respectively, compared to 2 and 1% for the previous year's cover crop biomass.
Further economic analyses are required to determine whether the integration of livestock is necessary to mitigate the risks associated with introducing cover crops as a replacement for fallow in the Northern Great Plains.
Dormant season livestock grazing reduces reliance on harvested feeds, but typically requires protein supplementation to maintain animal performance. Individual variation in supplement intake can impact animal performance; however, it is unknown whether this variation leads to individual- or herd-level effects on grazing behavior, resource utilization, and grazing impacts on native rangelands. To examine effects of protein supplementation on dormant season cattle resource use and, subsequently, post-grazing habitat conditions, we examined cattle grazing behavior, resource utilization and biomass removal of vegetation on a native rangeland in Montana. A commercial herd of 272 (yr 1) and 302 (yr 2) cows grazed a 329-ha rangeland pasture from November to January. Intake of a 30% crude protein supplement was measured for each individual. Five individuals within each of six age groups were equipped with GPS collars. Time spent grazing declined with supplement intake (b = −0.05 ± 0.02; P < 0.01). Distance traveled per day had a positive asymptotic association with supplement intake (b = 0.35 ± 0.09; P < 0.01). On average, resource utilization by cattle grazing dormant season forage decreased with terrain ruggedness (b = −0.09 ± 0.03), but was unrelated to aspect, temperature and wind speed. Notably, we observed high individual variability in resource utilization for elevation, distance from supplement and water. A post-hoc analysis suggested that individual attributes (age, body weight, supplement intake) influenced cattle resource use. At moderate stocking rates, dormant season livestock grazing did not affect residual vegetation conditions (P values > 0.22). However, residual cover of forbs and litter increased with relative grazing intensity (b = 1.04 ± 0.41; b = 3.06 ± 0.89; P ≤ 0.05). In summary, high individual variability in grazing resource utilization of cattle suggests individual-level factors could be the dominant drivers of grazing behavior and landscape use.
Understanding the relationship of foot angle and claw set to beef cattle structural soundness will be critical to the selection of animals that fit forage-based production systems. In an effort to address concerns about foot and leg structure, the American Angus Association's foot angle and foot claw set expected progeny differences (EPD) were developed in 2019. As a result, these relatively new EPD and their associated guidelines are based on limited phenotypic data submitted thus far. While ample research has evaluated lameness and foot issues in the dairy breeds, less is known about the factors that affect foot structure in beef cattle. This review focuses on beef cattle foot and leg structure, selection factors that may have led to increased problems with feet and legs, and the importance of foot and leg structure in forage-based grazing production systems. Specifically, the importance of locomotion and freedom of movement in extensive rangeland environments is discussed relative to the current literature. In addition, environmental factors that may influence foot and leg structure are addressed, as well as the heritability of various aspects of foot and leg traits. Where possible, information gaps and research needs are identified to enhance further investigation and the improvement of foot and leg selection tools.
Clinical and subclinical trace mineral deficiencies can limit productivity in western sheep production systems. The objective of this research was to determine the proportion of ranches that supplemented with trace minerals and to quantify serum trace mineral concentrations in ram lambs after weaning across Montana, with particular emphasis on Se and Zn. Serum samples (n = 214) were collected from ram lambs 8 to 10 mo of age (52.8 ± 16 kg) at 21 ranches throughout Montana and analyzed for Co, Cu, Fe, Mn, Mo, Se, and Zn. Ranches were classified as deficient, marginally deficient, adequate, or excessive by flock mean serum trace mineral concentrations. Additionally, water samples were analyzed for pertinent characteristics. The median and interquartile range of serum concentrations for each trace mineral across ranches were as follows: Co (0.41 ng/mL; 0.90 ng/mL), Cu (0.79 μg/mL; 0.24 μg/mL), Fe (153 μg/dL; 52 μg/dL), Mn (1.70 ng/mL; 0.80 ng/mL), Mo (15.3 ng/mL; 19.3 ng/mL), Se (115 ng/mL; 97.5 ng/mL), and Zn (0.70 μg/mL; 0.19 μg/mL). Of ranches surveyed, 67% provided a trace mineral supplement. Ranches that provided supplementary trace minerals had greater serum Se concentrations (P < 0.001). The 2 most commonly deficient and marginally deficient minerals across Montana were Se (19% of ranches deficient; 23.8% of ranches marginally deficient) and Zn (9.5% of ranches deficient; 57.1% of ranches marginally deficient). Regionally, Se serum samples classified as deficient were all located in western Montana. Across the ranches sampled, 40 and 35% of water samples exceeded upper desired concentrations for Na and sulfates, respectively.
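The ranch-level classification described above (deficient, marginally deficient, adequate, or excessive by flock mean serum concentration) can be sketched as a simple threshold rule. The cutoffs below are illustrative placeholders, not the reference values used in the study:

```python
# Minimal sketch of classifying a flock by its mean serum Se concentration.
# The threshold values are hypothetical placeholders for illustration only;
# they are NOT the published reference ranges used in the study.
from statistics import mean

SE_DEFICIENT_BELOW = 50.0   # ng/mL; hypothetical cutoff
SE_MARGINAL_BELOW = 80.0    # ng/mL; hypothetical cutoff
SE_EXCESSIVE_ABOVE = 500.0  # ng/mL; hypothetical cutoff

def classify_flock_se(serum_se_ng_ml):
    """Return the flock's Se status from individual serum samples."""
    flock_mean = mean(serum_se_ng_ml)
    if flock_mean < SE_DEFICIENT_BELOW:
        return "deficient"
    if flock_mean < SE_MARGINAL_BELOW:
        return "marginally deficient"
    if flock_mean > SE_EXCESSIVE_ABOVE:
        return "excessive"
    return "adequate"

print(classify_flock_se([40, 45, 55]))    # flock mean ~46.7 -> deficient
print(classify_flock_se([110, 120, 95]))  # flock mean ~108.3 -> adequate
```

The same rule, with mineral-specific reference ranges, would be applied per element (Co, Cu, Fe, Mn, Mo, Se, Zn) to produce the ranch classifications reported.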