Sustainability of forage production in the Northeast USA is affected by environmental and climatic variability. Complex forage mixtures may be better adapted than simple mixtures to variable environments and may produce greater dry matter (DM) yields, distributed more evenly throughout the growing season, thereby increasing the sustainability of forage production. A grazing trial was established to evaluate forage production, nutritive value, and botanical composition dynamics of well-adapted and commonly sown forage species. The forage treatments consisted of simple mixtures (two and three species) and complex mixtures (six and nine species). The experiment was mob-grazed with cow-calf (Bos taurus L.) pairs five times each year. Dry matter distribution during the growing season was independent of mixture complexity; it was instead influenced mainly by weather. When averaged across all 3 yr, mixtures containing six species produced greater (P < 0.001) forage yield (9900 kg DM ha−1) than two-species (8700 kg DM ha−1) or three-species mixtures (8400 kg DM ha−1). However, forage production varied within species-richness groups. In general, regardless of the initial botanical composition, the predominant species in most mixtures by the end of the experiment were orchardgrass (Dactylis glomerata L.), tall fescue (Festuca arundinacea Schreb.), and white clover (Trifolium repens L.). Variation in nutritive value among mixtures was explained mainly by variation in the proportions of grasses and legumes. We conclude that the identity of the individual species sown, rather than mixture complexity, is the most important determinant of forage yield and nutritive value.
A nondestructive method to determine total C and N concentrations in soil size fractions is desirable when only a limited sample is available. Near‐infrared reflectance spectroscopy (NIRS) was used to determine the total C and N concentrations in silt (50.0–2.0 µm) and coarse clay (2.0–0.2 µm), separated from 12 surface soils, by regressing the diffuse reflectance of near‐infrared radiation against constituent concentrations determined using combustion techniques. The coefficients of determination (R²) of the calibration equations were 0.93 for C and 0.89 for N, and the standard errors associated with NIRS predictions were 6.2 g kg−1 soil for total C and 0.6 g kg−1 for total N. Developing equations with only silt samples improved the accuracy of the NIRS calibration equations. Coefficients of variation [CV = (standard error of performance ÷ mean of the combustion procedure) × 100] for validation sample sets ranged from 14 to 19%, which is within the acceptable range for determining inorganic elements in plant tissues. We conclude that NIRS can be used to predict C and N concentrations in soil size fractions.
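The bracketed CV formula above is a simple ratio; a minimal sketch in Python, using the standard errors reported in the abstract but an invented combustion mean for illustration:

```python
# Coefficient of variation for a NIRS validation set, per the abstract:
#   CV = (standard error of performance / mean of combustion values) x 100
# The combustion mean used in the example call is invented, not study data.

def cv_percent(sep, combustion_mean):
    """Coefficient of variation (%) for a validation sample set."""
    return sep / combustion_mean * 100.0

# e.g. the reported N standard error (0.6 g/kg) against a hypothetical
# combustion mean of 3.5 g/kg N gives a CV of about 17%, inside the
# 14 to 19% range reported.
print(round(cv_percent(0.6, 3.5), 1))
```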
There has been considerable activity in breeding orchardgrass (Dactylis glomerata L.) cultivars in North America during the latter half of the 20th century, but little effort has been devoted to quantifying breeding progress. The objectives of this study were to quantify changes in mean cultivar performance over that period and to compare them with the progress achieved from one cycle of half‐sib progeny selection within the USDA population of orchardgrass accessions. Forty‐two cultivars (32 North American and 10 European) were tested at three locations (Arlington, WI; Rock Springs, PA; and Ottawa, ON, Canada) from 1995 through 1997. Cultivars were grouped into three experiments by maturity class: early, medium, and late. North American cultivars averaged 3, 9, and 12% higher in forage yield than European cultivars for the early, medium, and late maturity groups, respectively. Between 1955 and 1997, forage yield and ground cover of early‐maturity cultivars increased by 2.5 Mg ha−1 decade−1 and 4.0% decade−1, respectively. Forage nutritional value of medium‐maturity cultivars increased during that time, although this was probably not due to direct selection. Significant gains were made in forage yield and Drechslera spp. leafspot reaction of cultivars derived from two individual breeding programs, although the majority of orchardgrass cultivars show no improvement in forage traits.
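Per-decade gains like those reported above amount to fitting a linear trend of cultivar performance on year of release and scaling the slope. A minimal sketch of that calculation, using invented cultivar means rather than data from the study:

```python
# Sketch: quantify breeding progress as the least-squares slope of
# cultivar yield on year of release, expressed per decade.
# The cultivar means below are invented for illustration only.

def gain_per_decade(years, yields):
    """Least-squares slope of yield on release year, scaled to a decade."""
    n = len(years)
    my = sum(years) / n
    mz = sum(yields) / n
    num = sum((y - my) * (z - mz) for y, z in zip(years, yields))
    den = sum((y - my) ** 2 for y in years)
    return num / den * 10.0  # per-decade rate

# Hypothetical cultivar mean yields (Mg/ha) by year of release:
years = [1955, 1965, 1975, 1985, 1995]
yields = [9.0, 9.3, 9.5, 9.9, 10.0]
print(round(gain_per_decade(years, yields), 2))  # prints 0.26
```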
Predictive equations for alfalfa quality (PEAQ), based on the height of the tallest stem and the maturity stage of the most mature stem in a sample, were developed to estimate neutral‐detergent fiber (NDF) and acid‐detergent fiber (ADF) concentrations in alfalfa (Medicago sativa L.). Field testing of these equations has been limited outside Wisconsin, where they were developed. Our objectives were to test these equations for estimating alfalfa NDF and ADF across a wide geographic area and to evaluate the performance of PEAQ on a whole‐field basis using within‐field subsampling. Alfalfa samples varying in height and maturity were collected throughout the growing season from fields in New York (n = 28), Pennsylvania (n = 23), Ohio (n = 48), California (n = 45), and Wisconsin (n = 48) from 1994 to 1996. Additional samples were collected in Ohio and Wisconsin from producer‐managed fields in which 5 to 10 subsamples per field were taken on each sampling date (n = 296 subsamples from 51 fields). Observed NDF and ADF values were regressed on estimated values. The accuracy of PEAQ in other states was at least equal to that observed in Wisconsin. Across all states, regression equations for NDF and ADF were slightly biased (slope b ≠ 1.0 and/or y‐intercept ≠ 0 at P < 0.01); however, prediction errors were sufficiently low to allow use of PEAQ as a preharvest management tool. Root mean square error values ranged from 19.1 to 23.9 g kg−1 for NDF and 15.0 to 19.0 g kg−1 for ADF. Prediction errors were 16.2 g kg−1 for NDF and 13.2 g kg−1 for ADF across Ohio and Wisconsin when regressing observed means on estimated means of five subsamples per field × sampling‐date combination. We conclude that predictive equations for alfalfa quality based on a combination of stem height and maturity were robust across a wide range of environments.
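The validation approach described above, regressing observed fiber values on PEAQ estimates and summarizing error as root mean square error, can be sketched as follows; the paired NDF values are invented for illustration and are not data from the study.

```python
# Sketch of observed-vs-estimated validation: fit observed = a + b*estimated,
# then check how far slope b departs from 1.0 and intercept a from 0 (bias),
# and compute RMSE of the raw prediction errors.
import math

def validate(observed, estimated):
    """Return (slope, intercept, rmse) for observed regressed on estimated."""
    n = len(observed)
    mo = sum(observed) / n
    me = sum(estimated) / n
    num = sum((e - me) * (o - mo) for o, e in zip(observed, estimated))
    den = sum((e - me) ** 2 for e in estimated)
    slope = num / den
    intercept = mo - slope * me
    rmse = math.sqrt(sum((o - e) ** 2 for o, e in zip(observed, estimated)) / n)
    return slope, intercept, rmse

# Invented paired NDF values (g/kg): PEAQ estimates vs. lab observations.
estimated = [400, 420, 440, 460]
observed = [405, 418, 445, 458]
slope, intercept, rmse = validate(observed, estimated)
print(round(slope, 2), round(intercept, 1), round(rmse, 2))
```

An unbiased predictor would give a slope near 1.0 and an intercept near 0; the abstract's bias test asks whether the departures are statistically significant.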
Alfalfa (Medicago sativa L.) seeding rates greatly affect the number of surviving plants after 1 yr. The objective of this research was to determine the effect of seeding rate on alfalfa stand density two or more years after seeding. Different alfalfa cultivars were spring‐seeded at rates ranging from 3 to 27 kg ha−1 pure live seed into tilled seedbeds in Missouri and Pennsylvania to provide eight location‐years of data. After the seeding year, herbage was removed four or five times each year. Stand densities were determined one to three and five to eight months after planting (MAP) and annually in the spring thereafter for up to 7 yr after planting. Increasing seeding rates resulted in near-linear increases in plant densities from 100 to 800 plants m−2 within 3 MAP. Stands with higher plant densities experienced eight times more plant deaths in the first year after planting than stands with lower densities. At all eight location‐years of this research, the period 24 to 36 MAP had the fewest plant deaths regardless of seeding rate. Higher plant densities associated with seeding rates greater than 17 kg ha−1 did not persist beyond 6 MAP. Seeding rates of 10 and 17 kg ha−1 produced similar plant densities by 24 MAP in 75% of the location‐years, although further reductions in seeding rate reduced plant density for up to 4 yr after planting. Seeding rates greater than, or slightly less than, those recommended have little to no effect on the life expectancy of an alfalfa stand.