Undergraduate research involves experiential learning methods that help animal science students gain critical thinking skills. There is high demand for these opportunities. For example, 77.9% of incoming freshmen in the Department of Animal Sciences & Industry at Kansas State University in Fall 2017 and Fall 2018 planned to conduct research sometime during their undergraduate career (422 of 542 students). Conventional, one-on-one mentoring methods in the department were serving only 1.7% of the undergraduate population (21 of 1,212 students). This creates a unique challenge: increasing the number of undergraduate research opportunities while maintaining the impact of individualized experiential learning. One method to address this challenge is the incorporation of a course-based research program. In this model, research projects are conducted during a conventional semester during scheduled classroom hours, with project components divided into 3 sections: (1) research preparation, including compliance requirements, hypothesis testing, experimental design, and protocol development; (2) data collection; and (3) data interpretation and dissemination. Students collect data as a team but individually develop their own research abstract and poster to maintain a high level of experiential learning. Through multiple sections of this course per semester and incorporation of the concepts into existing laboratories, 13.5% of students in the department completed undergraduate research in the 2018–2019 academic year (162 of 1,197 students). To monitor the quality of these experiences, student critical thinking ability was assessed using the online Critical Thinking Basic Concepts & Understanding Test (Foundation for Critical Thinking, Tomales, CA). Undergraduate research experiences increased (P = 0.028) the growth in student critical thinking scores, but the type of research experience did not influence assessed skills (P > 0.281).
Thus, course-based undergraduate research experiences may be an option for growing both the quantity and quality of undergraduate research experiences in animal science.
The objective was to characterize ham and loin quality of carcasses ranging from 78 to 145 kg (average ∼119 kg). Hot carcass weight (HCW), back fat depth, and loin depth were measured on 666 carcasses. Loin pH, instrumental and visual color, and iodine value of clear plate fat (all 3 layers) were measured on approximately 90% of the population. Quality measurements of the ham, 14-d aged loin and chop, and loin chop slice shear force (SSF) were evaluated on approximately 30% of the population. Myosin heavy chain fiber type determination was completed on 49 carcasses. Slopes of regression lines and coefficients of determination between HCW and quality traits were calculated using the REG procedure in SAS and considered significantly different from 0 at P ≤ 0.05. As HCW increased, loin depth (b1 = 0.2496, P < 0.0001), back fat depth (b1 = 0.1374, P < 0.0001), loin weight (b1 = 0.0345, P < 0.0001), and ham weight (b1 = 0.1044, P < 0.0001) increased. Estimated lean (b1 = –0.0751, P < 0.0001) and iodine value (b1 = –0.0922, P < 0.0001) decreased as HCW increased, with HCW accounting for 24% (R2 = 0.24) of the variation in estimated lean and 7% (R2 = 0.07) of the variation in iodine value. However, HCW did not explain variation in ham quality traits (P > 0.15) and did not explain more than 1% (R2 ≤ 0.01) of the variation in 1-d loin color or pH. Loins from heavier carcasses were more tender (decreased SSF; b1 = –0.0674, P < 0.0001), although HCW explained only 9% of the variation in SSF. Hot carcass weight did not alter (P > 0.22) muscle fiber type percentage or area. These results suggest that increasing HCW to an average of 119 kg did not compromise pork quality.
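The slopes (b1), coefficients of determination (R2), and significance tests above come from simple linear regressions of each quality trait on HCW (the REG procedure in SAS). A minimal Python sketch of the same fit, using simulated stand-in data rather than the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 666 carcasses with HCW from 78 to 145 kg
hcw = rng.uniform(78, 145, size=666)
# Loin depth simulated with a slope near the reported b1 = 0.2496, plus noise
loin_depth = 40 + 0.25 * hcw + rng.normal(0, 8, size=666)

# Equivalent of a simple PROC REG fit: slope (b1), R2, and the slope's p-value
res = stats.linregress(hcw, loin_depth)
print(f"b1 = {res.slope:.4f}, R2 = {res.rvalue ** 2:.2f}, P = {res.pvalue:.3g}")

# Slope considered significantly different from 0 at P <= 0.05
significant = res.pvalue <= 0.05
```

Note that a highly significant slope can still leave most trait variation unexplained, which is why the abstract reports R2 alongside P (e.g., HCW explaining only 9% of SSF variation).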
Feed has been shown to be a vector for viral transmission. Four experiments were conducted to: 1) determine if medium chain fatty acids (MCFA) are effective mitigants when applied to feed both pre- and post-porcine epidemic diarrhea virus (PEDV) inoculation measured by quantitative reverse transcription polymerase chain reaction (qRT-PCR), 2) evaluate varying levels and combinations of MCFA measured by qRT-PCR, and 3) evaluate selected treatments in bioassay to determine infectivity. In exp. 1, treatments were arranged in a 2 × 2 + 1 factorial with main effects of treatment (0.3% commercial formaldehyde [CF] product, Sal CURB [Kemin Industries, Inc.; Des Moines, IA], or 1% MCFA blend [Blend] of 1:1:1 C6:C8:C10 [PMI, Arden Hills, MN]) and timing of application (pre- or post-inoculation with PEDV) plus a positive control (PC; feed inoculated with PEDV and no treatment). All combinations of treatment and timing decreased detectable PEDV compared with the PC (P < 0.05). Pre-inoculation treatment elicited decreased magnitude of PEDV detection (cycle threshold value) compared with post-inoculation (P = 0.009). Magnitude of PEDV detection was decreased for CF compared with Blend (P < 0.0001). In exp. 2, pre-inoculation treatments consisted of: 1) PC, 2) 0.3% CF, 3 to 5) 0.125% to 0.33% C6:0, 6 to 8) 0.125% to 0.33% C8:0, 9 to 11) 0.125% to 0.33% C10:0, and 12 to 15) 0.125% to 0.66% C5:0. Treating feed with 0.33% C8:0 resulted in decreased (P < 0.05) PEDV detection compared with all other treatments. Increasing concentration of each individual MCFA decreased PEDV detectability (P < 0.042). In exp. 3, pre-inoculation treatments consisted of: 1) PC, 2) 0.3% CF, 3 to 7) 0.25% to 1% Blend, 8 to 10) 0.125% to 0.33% C6:0 + C8:0, 11 to 13) 0.125% to 0.33% C6:0 + C10:0, and 14 to 16) 0.125% to 0.33% C8:0 + C10:0.
Treating feed with CF, 0.5% Blend, 0.75% Blend, 1% Blend, all levels of C6:0 + C8:0, 0.25% C6:0 + 0.25% C10:0, 0.33% C6:0 + 0.33% C10:0, 0.25% C8:0 + 0.25% C10:0, or 0.33% C8:0 + 0.33% C10:0 elicited decreased detection of PEDV compared with PC (P < 0.05). Increasing the concentration of each MCFA combination decreased PEDV detectability (linear, P < 0.012). In exp. 4, feed was treated pre-inoculation with: 1) no treatment (PC), 2) 0.3% CF, 3) 0.5% Blend, or 4) 0.3% C8:0 and analyzed via qRT-PCR and bioassay. Adding 0.5% Blend or 0.3% C8:0 resulted in decreased PEDV detection compared with PC, and only the PC resulted in a positive bioassay. Therefore, MCFA can decrease detection of PEDV in feed. Further, inclusion of MCFA at lower levels than previously evaluated is effective against PEDV.
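The "linear" p-values above test for a dose-response trend across equally spaced inclusion levels, typically via an orthogonal polynomial contrast on the treatment means. The contrast itself can be sketched as follows, with hypothetical cycle threshold (Ct) means rather than the study's data; testing the contrast for significance would additionally require the ANOVA error variance, which is not reproduced here:

```python
import numpy as np

# Hypothetical mean Ct values (higher Ct = less detectable PEDV) at four
# equally spaced Blend inclusion levels -- illustrative, not the study's data
levels = np.array([0.25, 0.50, 0.75, 1.00])    # % MCFA Blend
ct_means = np.array([30.1, 31.4, 32.8, 34.0])  # treatment mean Ct

# Orthogonal linear contrast coefficients for 4 equally spaced levels
coef = np.array([-3.0, -1.0, 1.0, 3.0])
linear_contrast = float(coef @ ct_means)

# A positive contrast means Ct rises (detectability falls) as inclusion
# increases, matching the reported direction of the dose response
print(f"linear contrast = {linear_contrast:.1f}")
```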
Feeding diets high in corn distillers dried grains with solubles (DDGS) before market can negatively impact carcass yield, hot carcass weight (HCW), and belly fat iodine value (IV). Two experiments were conducted to evaluate the effects of switching from DDGS-based to corn-soybean meal (CSBM)-based diets at increasing intervals (withdrawal periods) before harvest on finishing pig performance and carcass characteristics. Diets in both experiments contained either 0% or 30% DDGS and were balanced for net energy (NE). In Exp. 1, 985 pigs (initially 99.6 kg body weight [BW]) were used with 12 pens per treatment. The four treatments were increasing DDGS withdrawal periods: 28, 21, 14, or 0 d (no dietary switch) before marketing. All pens were marketed by removing the heaviest 17% of pigs 21 d before slaughter, with the remaining 83% slaughtered 21 d later. Overall, there was no evidence for treatment differences in final BW, average daily feed intake, or feed efficiency (G:F; P > 0.10); however, average daily gain (ADG) increased (linear, P = 0.022) and belly fat IV decreased (linear, P = 0.001) the longer pigs were fed CSBM diets. There was no evidence for differences in HCW (P > 0.10); however, carcass yield increased (linear, P = 0.001) with increasing time following the switch to CSBM. Backfat depth decreased and percentage lean increased as CSBM feeding time increased (quadratic; P < 0.05). In Exp. 2, 1,158 pigs (initially 105 kg BW) were used in a 35-d study. There were 15 pens per treatment and four treatments of increasing DDGS withdrawal periods: 35, 28, 14, or 0 d (no dietary switch). All pens were marketed by removing the heaviest 15% of pigs on day 28, the heaviest 28% of pigs on day 14, and a final marketing of approximately 57% of starting barn inventory. There was no evidence that final BW, ADG, G:F, or HCW differed among dietary treatments (P > 0.10).
Average daily feed intake and carcass yield increased and belly fat IV decreased (P < 0.050) the longer pigs were fed CSBM. In conclusion, growth performance was minimally impacted by the dietary switch from DDGS- to CSBM-based diets, possibly because dietary NE was similar. For carcass yield and belly fat IV, the response to withdrawal time appears linear, and the switch from high- to low-fiber diets should be made at least 28 d before marketing.
Three studies evaluated the effects of added dietary salt on growth performance of pigs weighing 7 to 10, 11 to 30, and 27 to 65 kg. In experiment 1, 325 pigs were used with 5 pigs per pen and 13 pens per treatment. Pigs were fed a diet (0.39% Na and 0.78% Cl) for 7 d after weaning, then randomly assigned to diets with 0, 0.20, 0.40, 0.60, or 0.80% added salt for 14 d. All diets were corn-soybean meal-based with 10% dried whey. Calculated Na concentrations were 0.11, 0.19, 0.27, 0.35, and 0.43% and calculated Cl concentrations were 0.23, 0.35, 0.47, 0.59, and 0.70%, respectively. Increasing salt increased (linear, P < 0.05) average daily gain (ADG) and gain-to-feed ratio (G:F). For ADG, the linear, quadratic polynomial (QP), and broken-line linear (BLL) models were competing, with the breakpoint for the BLL at 0.59% added salt. For G:F, the BLL model estimated a breakpoint at 0.33%, while the QP indicated maximum G:F at 0.67% added salt. In experiment 2, 300 pigs were used in a 34-d trial with 5 pigs per pen and 12 pens per treatment. Pigs were weaned at 21 d of age and fed a phase 1 diet (0.50% Na and 0.67% Cl) for 11 d and then a phase 2 diet (0.35% Na and 0.59% Cl) for 14 d. Pens of pigs were then randomly assigned to corn-soybean meal-based diets containing 0.20, 0.35, 0.50, 0.65, or 0.80% added salt. Calculated dietary Na concentrations were 0.10, 0.16, 0.22, 0.28, and 0.34% and calculated Cl concentrations were 0.23, 0.32, 0.41, 0.50, and 0.59%, respectively. Overall, ADG and G:F increased (quadratic, P < 0.07) with increasing added salt. For ADG, the QP and BLL models had similar fit, with the breakpoint for the BLL at 0.51% added salt. For G:F, the BLL model predicted a breakpoint at 0.35% added salt. In experiment 3, 1,188 pigs were used in a 44-d study with 27 pigs per pen and 11 pens per treatment. Pens of pigs were randomly assigned to corn-soybean meal-based diets containing 0.10, 0.33, 0.55, or 0.75% added salt.
Calculated dietary Na concentrations were 0.10, 0.19, 0.28, and 0.36% and calculated Cl concentrations were 0.23, 0.36, 0.49, and 0.61%, respectively. Overall, there was no evidence that added salt above 0.10% of the diet affected growth. In conclusion, the BLL models suggested that the added salt levels that maximize ADG for 7 to 10 and 11 to 30 kg pigs were 0.59% (0.34% Na and 0.58% Cl) and 0.51% (0.22% Na and 0.42% Cl), respectively. There was no evidence that growth of 27 to 65 kg pigs improved beyond 0.10% added salt (0.11% Na and 0.26% Cl).
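The broken-line linear (BLL) model behind these breakpoint estimates assumes the response rises linearly with added salt until a breakpoint and then plateaus. A minimal sketch of fitting such a model with scipy, using made-up dose-response values rather than the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_line(x, plateau, slope, brk):
    # BLL: response rises with `slope` until the breakpoint `brk`,
    # then holds at `plateau`
    return plateau - slope * np.maximum(0.0, brk - x)

# Hypothetical added salt (%) vs. an ADG-like response; values are
# made up to illustrate the fit, not taken from the study
salt = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
adg = np.array([0.30, 0.36, 0.42, 0.45, 0.45])

popt, _ = curve_fit(broken_line, salt, adg, p0=[0.45, 0.3, 0.5])
plateau, slope, brk = popt
print(f"estimated breakpoint at {brk:.2f}% added salt")
```

Competing models (linear, quadratic polynomial, BLL) are then compared on fit criteria; the QP predicts an optimum at its vertex, which is why it can suggest a higher "maximum" level than the BLL breakpoint, as seen for G:F in experiment 1.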