The Copernicus Marine Environment Monitoring Service (CMEMS) provides regular and systematic reference information on the physical and biogeochemical state of the ocean and sea ice for the global ocean and the European regional seas. CMEMS serves a wide range of users (more than 15,000 are now registered with the service) and applications. Observations are a fundamental pillar of the CMEMS value-added chain that runs from observation to information and users. Observations are used by the CMEMS Thematic Assembly Centres (TACs) to derive high-level data products and by the CMEMS Monitoring and Forecasting Centres (MFCs) to validate and constrain their global and regional ocean analysis and forecasting systems. This paper presents an overview of CMEMS, its evolution, and how the value of in situ and satellite observations is increased through the generation of high-level products ready to be used by downstream applications and services. The complementary nature of satellite and in situ observations is highlighted. Long-term perspectives for the development of CMEMS are described, and implications for the evolution of the in situ and satellite observing systems are outlined. Results from Observing System Evaluations (OSEs) and Observing System Simulation Experiments (OSSEs) illustrate the strong dependence of CMEMS systems on observations. Finally, future CMEMS requirements for both satellite and in situ observations are detailed.
Abstract. A tool for multidimensional variational analysis (divand) is presented. It allows the interpolation and analysis of observations on curvilinear orthogonal grids in an arbitrarily high dimensional space by minimizing a cost function. This cost function penalizes the deviation from the observations, the deviation from a first guess, and abruptly varying fields, based on a given correlation length (potentially varying in space and time). Additional constraints can be added to this cost function, such as an advection constraint which forces the analysed field to align with the ocean current. The method naturally decouples disconnected areas based on topography and topology. This is useful in oceanography, where disconnected water masses often have different physical properties. Individual elements of the a priori and a posteriori error covariance matrices can also be computed, in particular the expected error variance of the analysis. A multidimensional approach (as opposed to stacking two-dimensional analyses) has the benefit of providing a smooth analysis in all dimensions, although the computational cost is increased.

Primal (problem solved in the grid space) and dual (problem solved in the observational space) formulations are implemented using either direct solvers (based on Cholesky factorization) or iterative solvers (conjugate gradient method). In most applications the primal formulation with the direct solver is the fastest, especially if an a posteriori error estimate is needed. However, for correlated observation errors the dual formulation with an iterative solver is more efficient.

The method is tested using pseudo-observations from a global model. The distribution of the observations is based on the positions of the Argo floats. The benefit of a three-dimensional analysis (longitude, latitude and time) compared to a two-dimensional analysis (longitude and latitude) and the role of the advection constraint are highlighted.
The tool divand is free software and is distributed under the terms of the GNU General Public Licence (GPL).
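As a rough illustration of the variational idea described above (and not divand's actual implementation or API), the analysis can be obtained by minimizing a quadratic cost function combining a data-misfit term and a smoothness penalty controlled by the correlation length. The sketch below is reduced to one dimension, with a zero first guess and invented parameter values:

```python
import numpy as np

def analyse_1d(grid, obs_pos, obs_val, corr_len, obs_weight=1.0):
    """Minimal 1-D variational analysis sketch (illustrative only).

    Minimizes a quadratic cost function
      J(phi) = obs_weight * sum_j (phi(x_j) - d_j)**2
             + corr_len**2 * sum_i ((phi[i+1] - phi[i]) / dx)**2 * dx,
    i.e. a data-misfit term plus a smoothness penalty whose strength is
    set by the correlation length. The first guess is zero (anomalies).
    """
    n = len(grid)
    dx = grid[1] - grid[0]
    # Observation operator: nearest-grid-point mapping (sketch only)
    H = np.zeros((len(obs_pos), n))
    for j, xj in enumerate(obs_pos):
        H[j, np.argmin(np.abs(grid - xj))] = 1.0
    # First-difference (gradient) operator
    D = (np.eye(n, k=1)[:-1] - np.eye(n)[:-1]) / dx
    # Normal equations of the quadratic cost function (direct solver,
    # analogous in spirit to the primal formulation)
    A = obs_weight * (H.T @ H) + corr_len**2 * (D.T @ D) * dx
    b = obs_weight * (H.T @ np.asarray(obs_val, float))
    return np.linalg.solve(A, b)

# Usage: three observations of a smooth field on [0, 1]
grid = np.linspace(0.0, 1.0, 101)
phi = analyse_1d(grid, obs_pos=[0.2, 0.5, 0.8], obs_val=[1.0, 2.0, 1.0],
                 corr_len=0.1, obs_weight=100.0)
```

A shorter correlation length lets the analysis follow the observations more closely; a longer one produces a smoother field that averages over them.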
An innovative multi-model fusion technique is proposed to improve short-term ocean temperature forecasts: the three-dimensional super-ensemble. In this method, a Kalman filter is used to adjust three-dimensional model weights over a past learning period, giving more importance to recent observations and taking spatially varying model skill into account. The predictive performance is evaluated against SST analyses, CTD casts and glider tracks collected during the Ligurian Sea Cal/Val 2008 experiment. Statistical results show not only a very significant bias reduction of this multi-model forecast in comparison with the individual models, their ensemble mean and a single-weight-per-model version of the super-ensemble, but also an improvement in other pattern-related skills. In a 48-h forecast experiment, and with respect to the ensemble mean, surface and subsurface root-mean-square differences with observations are reduced by 57% and 35%, respectively, making this new technique a suitable non-intrusive post-processing method for multi-model operational forecasting systems.
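The weight-learning step can be sketched as follows, under strong simplifying assumptions: a single spatial point (the paper adjusts fully three-dimensional weight fields), a random-walk model for the weights, and made-up noise levels. The process noise is what lets the filter forget old data and give more importance to recent observations:

```python
import numpy as np

def learn_weights(model_preds, obs, q=1e-3, r=0.1):
    """Kalman-filter learning of per-model weights (illustrative sketch).

    State: weight vector w (one weight per model) following a random walk
    w_t = w_{t-1} + noise(q).  Observation: obs_t = preds_t @ w_t + noise(r).
    q and r are assumed noise variances chosen for this toy example.
    """
    n_t, n_m = model_preds.shape
    w = np.full(n_m, 1.0 / n_m)      # start from the ensemble mean
    P = np.eye(n_m)                  # weight error covariance
    for t in range(n_t):
        h = model_preds[t]           # observation operator (row vector)
        P = P + q * np.eye(n_m)      # forecast step (random walk)
        s = h @ P @ h + r            # innovation variance
        k = P @ h / s                # Kalman gain
        w = w + k * (obs[t] - h @ w) # update weights with new observation
        P = P - np.outer(k, h) @ P   # update covariance
    return w

# Usage: two models with opposite, unequal biases; the filter learns a
# combination that removes the bias of the plain ensemble mean
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, 6.0, 200))
preds = np.column_stack([truth + 0.5, truth - 0.25])
w = learn_weights(preds, truth + 0.01 * rng.standard_normal(200))
comb = preds @ w                     # super-ensemble "forecast"
```

Here the learned weights approach [1/3, 2/3], the exact combination that cancels both biases, whereas the ensemble mean retains a constant bias of 0.125.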
All-sky, multicolour, medium-deep (V ≃ 20) surveys have the potential to detect several hundred thousand quasi-stellar objects (QSOs). Spectroscopic confirmation is not possible for such a large number of objects, so secure photometric identification and precise photometric determination of redshifts (and other spectral features) become mandatory. This is especially the case for the Gaia mission, in which QSOs play the crucial role of fixing the celestial reference frame, and in which more than 900 gravitationally lensed QSOs should be identified.
We first built two independent libraries of synthetic QSO spectra reflecting the most important variations in the spectra of these objects. These libraries are publicly available for simulations with any instrument and photometric system.
Traditional template fitting and artificial neural networks (ANNs) are compared to identify QSOs among the population of stars using broad- and medium-band photometry (BBP and MBP, respectively). In addition to these two methods, a new one based on spectral principal components (SPCs) is introduced to estimate the photometric redshifts. Generic trends as well as results specifically related to Gaia observations are given.
We found that (i) ANNs can provide clean, uncontaminated QSO samples suitable for the determination of the reference frame, but with a level of completeness decreasing from ≃50 per cent at the Galactic pole at V = 18 to ≃16 per cent at V = 20; (ii) the χ2 approach identifies about 90 per cent (60 per cent) of the observed QSOs at V = 18 (V = 20), at the expense of a higher stellar contamination rate, reaching ≃95 per cent in the Galactic plane at V = 20; extinction is a source of confusion and makes the identification of QSOs in the Galactic plane difficult; and (iii) the χ2 method is better than ANNs at estimating the photometric redshifts. Owing to colour degeneracies, the largest median absolute error (|Δz|median ≃ 0.2) is predicted in the range 0.5 < zspec < 2. The method based on the SPCs is promisingly good at recovering the redshift, in particular for V < 19 and z < 2.5 QSOs. For bright (V ≲ 18) QSOs, SPCs are also able to recover the spectral shape from the BBP and MBP data.
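The χ2 template-fitting approach to photometric redshifts can be sketched as below. The toy power-law spectrum, band wavelengths and noise level are invented for illustration and do not correspond to the Gaia photometric systems or to the synthetic QSO libraries of the paper:

```python
import numpy as np

def photoz_chi2(obs_flux, obs_err, template_grid, z_grid):
    """Minimal chi-squared photometric-redshift estimator (illustrative).

    template_grid[i] holds the template's model fluxes in each band at
    trial redshift z_grid[i].  For each trial redshift, the template is
    first scaled to the observations analytically (linear least squares),
    then the redshift with the smallest chi-squared is returned.
    """
    chi2 = np.empty(len(z_grid))
    for i, f_mod in enumerate(template_grid):
        # Best-fit amplitude with per-band errors
        a = np.sum(obs_flux * f_mod / obs_err**2) / np.sum(f_mod**2 / obs_err**2)
        chi2[i] = np.sum(((obs_flux - a * f_mod) / obs_err) ** 2)
    return z_grid[np.argmin(chi2)], chi2

def spectrum(lam):
    # Toy QSO: power-law continuum plus one "emission line" at 250 nm rest
    # frame; the line is what breaks the colour degeneracy of a pure
    # power law under free amplitude scaling.
    return lam**-1.5 * (1.0 + 10.0 * np.exp(-0.5 * ((lam - 250.0) / 10.0) ** 2))

# Usage: five hypothetical broad bands, redshift grid 0..4
bands = np.array([450.0, 550.0, 650.0, 750.0, 850.0])   # effective wavelengths, nm
z_grid = np.linspace(0.0, 4.0, 401)
template_grid = np.array([spectrum(bands / (1.0 + z)) for z in z_grid])

z_true = 1.8
obs = spectrum(bands / (1.0 + z_true))                   # noiseless "observation"
z_hat, chi2 = photoz_chi2(obs, 0.01 * np.abs(obs), template_grid, z_grid)
```

With a noiseless observation drawn from the same template, the minimum χ2 falls at the true redshift; in practice, degeneracies such as those noted above for 0.5 < zspec < 2 appear as multiple comparably deep χ2 minima.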