We estimate the impact of participating in the NZ Marsden Fund on research output trajectories by comparing the subsequent performance of funded researchers with that of researchers who submitted proposals but were not funded. We control for selection bias using the evaluations of the proposals generated by the grant selection process. We carry out the analysis in two data frames. First, we consider the researcher teams behind 1263 second-round proposals submitted 2003-2008, and examine the post-proposal publication and citation performance of each team as a whole, as a function of pre-proposal performance, the ranking of the proposal by the panel, and the funding received. This estimation does not deal with individual researchers' multiple proposals and funding over time. To disentangle these effects, we consider the 1500 New Zealand researchers who appeared on any of these proposals, and estimate a model predicting annual individual performance as a function of previous performance, recent proposal activity, ranking of any recent proposals, and funding received through recent proposals. Overall, we find that funding is associated with a 6-15% increase in publications and a 22-26% increase in citation-weighted papers for research teams. For individuals, funding is associated with a 3-5% increase in annual publications and a 5-8% increase in citation-weighted papers for five years after the grant; however, the lag structure and persistence of these effects post-grant are difficult to pin down. Surprisingly, we find no systematic evidence that the evaluation of proposals by the Marsden system is predictive of subsequent success. We conclude that the Marsden Fund is modestly successful in increasing scientific performance, but that the selection process does not appear to be effective in discriminating among second-round proposals in terms of their likely success.
In 2013, the Institute of Medicine (IOM), in collaboration with faculty and students from Georgetown University (GU), launched the first annual District of Columbia (DC) Regional Public Health Case Challenge. The idea for this case challenge was born when representatives from the IOM and GU met at Emory University's Global Health Case Competition in March 2013, and the DC Case Challenge is both inspired by and modeled on the Emory competition. The DC Case Challenge aims to promote interdisciplinary, problem-based learning in public health and to foster engagement with local universities and the local community. The case challenge brings together graduate and undergraduate students from multiple schools, disciplines, and universities to promote awareness of, and develop innovative solutions for, 21st-century public health issues grounded in a challenge faced by the local community. Each year the organizers and a student case-writing team develop a case based on a topic that is not only relevant in the DC area but also has broader domestic and global resonance. Universities located in the Washington, DC, area are invited to assemble teams of three to six students enrolled in undergraduate or graduate degree programs. To promote public health dialogue across disciplines, each team is required to represent at least three different schools, programs, or majors of study. Starting 2 weeks before the case challenge event, these teams are asked to employ critical analysis, thoughtful action, and interdisciplinary collaboration to devise a solution to the problem presented in the case. On the day of the case challenge, teams present their proposed solution to a panel of judges composed of representatives from local DC organizations as well as other subject-matter experts from disciplines relevant to the case.
In addition to the panel of judges, content experts are recruited to volunteer their service as reviewers to assist the student case-writing team.