This paper addresses the problem of motion estimation in image sequences. The standard motion equation used to compute the apparent motion of image irradiance patterns is a brightness-invariance hypothesis called the optical flow constraint. Other equations can be used, in particular the extended optical flow constraint, a variant of the optical flow constraint inspired by the mass conservation principle of fluid mechanics. In this paper, we propose a physical interpretation of this extended optical flow equation and a new model unifying the optical flow and extended optical flow constraints. We present results obtained on synthetic and meteorological images.
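In standard notation (an illustration in common notation, not necessarily the paper's), with image brightness $I(x,y,t)$ and motion field $\mathbf{w}=(u,v)^{T}$, the two constraints read:

```latex
% Optical flow constraint (brightness invariance along trajectories):
\frac{\partial I}{\partial t} + \nabla I \cdot \mathbf{w} = 0
% Extended optical flow constraint (continuity / mass-conservation form):
\frac{\partial I}{\partial t} + \nabla \cdot (I\,\mathbf{w})
  = \frac{\partial I}{\partial t} + \nabla I \cdot \mathbf{w}
  + I\,\nabla \cdot \mathbf{w} = 0
```

The two constraints coincide exactly when the motion field is divergence-free, which is what makes a unifying model natural.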
Data Assimilation is a mathematical framework used in the environmental sciences to improve forecasts produced by meteorological, oceanographic or air-quality simulation models. It aims to solve an evolution equation, describing the temporal dynamics, together with an observation equation, linking the state vector to the observations. In this article we use this framework to study a class of ill-posed Image Processing problems, usually solved by spatial and temporal regularization techniques. An approach is proposed to convert an ill-posed Image Processing problem into a Data Assimilation system, solved by a 4D-Var method. This is illustrated by the estimation of optical flow from a noisy image sequence, with the dynamic model ensuring the temporal regularity of the result. The contributions of the paper are threefold: first, an extensive description of the tasks required to go from an image processing problem to a data assimilation formulation; second, a theoretical analysis of the covariance matrices involved in the algorithm; and third, a specific discretisation scheme ensuring the stability of the computation for the optical flow application.
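The 4D-Var principle referred to above can be illustrated with a toy sketch: a scalar linear model in place of the image dynamics, with all operators, values and variances below being illustrative assumptions rather than the paper's actual system.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 4D-Var sketch: estimate the initial state x0 of a scalar linear
# model x_{k+1} = M * x_k from noisy observations y_k of the trajectory.
M, n_steps = 0.9, 10
rng = np.random.default_rng(0)
truth = 2.0 * M ** np.arange(n_steps)          # true trajectory (x0 = 2)
obs = truth + rng.normal(0.0, 0.05, n_steps)   # observation equation: y = x + noise

x_b, B, R = 1.5, 1.0, 0.05 ** 2                # background state and (co)variances

def cost(x):
    """4D-Var cost: background misfit + observation misfit over the window."""
    x0 = x[0]
    traj = x0 * M ** np.arange(n_steps)        # integrate the dynamic model
    jb = (x0 - x_b) ** 2 / B                   # distance to the background
    jo = np.sum((traj - obs) ** 2) / R         # distance to the observations
    return jb + jo

x0_est = minimize(cost, np.array([x_b]), method="BFGS").x[0]
```

In the paper's setting the state vector is a motion field rather than a scalar, the covariances B and R are full matrices, and the gradient of the cost function is obtained with an adjoint model rather than by finite differences.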
Abstract. This paper describes an innovative approach to estimating motion from image observations of divergence-free flows. Unlike most state-of-the-art methods, which only penalise the divergence of the motion field, our approach uses the vorticity-velocity formalism to construct a motion field in the subspace of divergence-free functions. A 4D-Var-like image assimilation method is used to estimate the vorticity field from image observations. Given that vorticity estimate, the motion is obtained by solving a Poisson equation. Results are illustrated on synthetic image observations and compared to those obtained with state-of-the-art methods, in order to quantify the improvements brought by the presented approach. The method is then applied to ocean satellite data to demonstrate its performance on real images.
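The vorticity-to-velocity step can be sketched as a Poisson solve for the stream function. Below is a minimal spectral version on a periodic square grid (the periodic boundary conditions, grid size and test field are assumptions for illustration; the paper's solver may differ):

```python
import numpy as np

def velocity_from_vorticity(omega, L=2 * np.pi):
    """Recover a divergence-free 2D velocity field from its vorticity
    on a periodic square domain via the stream function psi:
        lap(psi) = -omega,   u = d(psi)/dy,   v = -d(psi)/dx.
    Solved spectrally with FFTs."""
    n = omega.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx ** 2 + ky ** 2
    k2[0, 0] = 1.0                       # avoid division by zero (mean mode)
    psi_hat = np.fft.fft2(omega) / k2    # -k2 * psi_hat = -omega_hat
    psi_hat[0, 0] = 0.0                  # stream function defined up to a constant
    u = np.real(np.fft.ifft2(1j * ky * psi_hat))    # u = psi_y
    v = np.real(np.fft.ifft2(-1j * kx * psi_hat))   # v = -psi_x
    return u, v

# Example: recover the velocity of a known analytic vorticity field.
n, L = 64, 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
omega = np.sin(X) * np.cos(Y)
u, v = velocity_from_vorticity(omega, L)
```

By construction the recovered field is exactly divergence-free and its curl reproduces the input vorticity, which is the point of working in the vorticity-velocity formalism.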
Abstract. Satellite image sequences visualise the ocean surface and allow its dynamics to be assessed. Processing these data is therefore of major interest for better understanding the observed processes. As demonstrated in the state-of-the-art literature, image assimilation makes it possible to retrieve surface motion based on assumptions about the dynamics. In this paper, we demonstrate that a simple heuristic, the Lagrangian constancy of velocity, can successfully replace the complex physical properties described by the Navier-Stokes equations for assessing surface circulation from satellite images. A data assimilation method is proposed that adds an acceleration term a(t) to this Lagrangian constancy equation, summarising all physical processes other than advection. A cost function is designed that quantifies the discrepancy between satellite data and model values. This cost function is minimised by a BFGS solver with a dual method of data assimilation. The result is the initial motion field and the acceleration terms a(t) over the whole temporal interval. These values a(t) model the forces, other than advection, that contribute to surface circulation. Our approach was tested on synthetic data and on Sea Surface Temperature images acquired over the Black Sea. Results are quantified and compared to those of state-of-the-art methods.
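In standard notation (our sketch, not necessarily the paper's exact formulae), with motion field $\mathbf{w}$, the Lagrangian constancy hypothesis and its extension with the acceleration term read:

```latex
% Lagrangian constancy of velocity (material derivative vanishes):
\frac{d\mathbf{w}}{dt}
  = \frac{\partial \mathbf{w}}{\partial t}
  + (\mathbf{w} \cdot \nabla)\,\mathbf{w} = 0
% Extension: a(t) gathers all physical processes other than advection:
\frac{d\mathbf{w}}{dt} = \mathbf{a}(t)
```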
Short- or mid-term rainfall forecasting is a major task with several environmental applications, such as agricultural management or flood risk monitoring. Existing data-driven approaches, especially deep learning models, have shown significant skill at this task using only rainfall radar images as inputs. In order to determine whether other meteorological parameters such as wind would improve forecasts, we trained a deep learning model on a fusion of rainfall radar images and wind velocities produced by a weather forecast model. The network was compared to a similar architecture trained only on radar data, to a basic persistence model and to an approach based on optical flow. Our network outperforms the optical flow approach by 8% in F1-score on moderate and heavier rain events for forecasts at a 30-min horizon. Furthermore, it outperforms by 7% the same architecture trained using only rainfall radar images. Merging rain and wind data also proved to stabilise the training process and enabled significant improvements, especially on difficult-to-predict high-precipitation rainfalls.
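The F1-scores quoted above are binary skill scores on "rain above threshold" events. A minimal sketch of how such a score is computed pixel-wise from a predicted and an observed rainfall field (the threshold value and units are assumptions, not taken from the paper):

```python
import numpy as np

def rain_f1(pred, obs, threshold=1.0):
    """F1-score for 'rain >= threshold' events (fields in mm/h, assumed),
    computed pixel-wise over a forecast/observation pair."""
    p = pred >= threshold            # predicted rain events
    o = obs >= threshold             # observed rain events
    tp = np.sum(p & o)               # true positives
    fp = np.sum(p & ~o)              # false positives
    fn = np.sum(~p & o)              # false negatives
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

Raising the threshold restricts the score to heavier rain events, which is how skill "on moderate and higher rain events" is isolated.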