The 21st century continues to be characterized by major changes to the environment and the ecosystem services upon which society depends. Anticipating and responding to these changes requires that scientists explicitly forecast future conditions in real time (Dietze et al. 2018). Ecological forecasting, like weather and epidemiological forecasting, involves integrating data and models to generate quantitative predictions of the future state of ecological systems before observations are collected. The iterative cycle of creating forecasts, evaluating them with new observations, updating the models, and then making new forecasts has the potential to accelerate learning across many ecological subdisciplines. This cycle builds on openly available data, often published soon after collection, as is increasingly common in ecological observatory networks, such as the National Ecological Observatory Network (NEON). To accelerate improvements in ecological forecasting, we designed and launched the NEON Ecological Forecasting Challenge (hereafter, "Challenge") (Figure 1), an open platform for the ecological and data science communities to forecast NEON data before they are collected.

The ecological forecasting community is interested in using forecasts to advance theory (Lewis et al. 2023) and in translating forecasts for natural resource management (Enquist et al. 2017). By analyzing a catalog of forecasts developed for a range of ecological systems, spatiotemporal scales, and environmental gradients, scientists can begin to address fundamental questions in ecology. The Ecological Forecasting Initiative Research Coordination Network (EFI-RCN), funded by the US National Science Foundation (NSF), invites the broad ecology community to help build this catalog by forecasting NEON data.
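The iterative cycle described above can be illustrated with a deliberately minimal sketch. This is not the Challenge's workflow or any participant's model; it is a toy persistence forecast with an exponential-smoothing update (the smoothing weight `alpha` is a hypothetical parameter chosen for illustration), showing only the loop structure: forecast, evaluate against a new observation, update, repeat.

```python
def iterative_forecast_cycle(observations, alpha=0.5):
    """Toy illustration of the iterative forecasting cycle.

    At each step: issue a forecast from the current model state,
    score it against the newly arrived observation, then update
    the state with that observation before forecasting again.
    """
    state = observations[0]  # initialize model state from the first observation
    errors = []
    for obs in observations[1:]:
        forecast = state                      # 1. forecast before data arrive
        errors.append(abs(forecast - obs))    # 2. evaluate with the new observation
        state = alpha * obs + (1 - alpha) * state  # 3. update the model state
    return errors

# Example: forecast errors shrink (and grow) as the toy model tracks the data.
errors = iterative_forecast_cycle([10.0, 12.0, 11.0, 13.0])
```

In a real forecasting workflow the persistence forecast would be replaced by a process or statistical model with quantified uncertainty, and the update step by formal data assimilation; the skeleton of the cycle, however, is the same.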
NEON is a powerful platform to support such a challenge because it provides standardized data with reported uncertainties that span a range of environmental conditions and levels of biological organization across terrestrial and freshwater systems in the US.

The Challenge was designed with input from the academic, government, and private sectors through workshops and working groups. We call it a "Challenge" because, despite its similarities to data science competitions (Makridakis et al. 2021), we are empowering the community to do more than just submit forecasts: we are also collaboratively developing software, training materials, and best practices. In May 2020, we launched the Challenge's design at a virtual conference with over 200 attendees (Peters and Thomas 2021). Attendees prioritized five forecasting "themes" that draw on NEON data, address open science