The present work is concerned with the problem of extracting a low-frequency trend from a given time series. To solve this problem, the authors develop a nonparametric technique called empirical mode decomposition (EMD) trend filtering. A key assumption is that the trend is representable as a sum of the intrinsic mode functions produced by the EMD. Based on an empirical analysis of the EMD, the authors propose an automatic procedure for selecting the requisite intrinsic mode functions. To illustrate the effectiveness of the technique, the authors apply it to simulated time series containing different types of trend, as well as to real-world data collected from an environmental study (atmospheric carbon dioxide levels at the Mauna Loa Observatory) and from a large-scale bicycle rental service (rental numbers of Grand Lyon Vélo'v).
Considering the problem of extracting a trend from a time series, we propose a novel approach based on empirical mode decomposition (EMD), called EMD trend filtering. The rationale is that EMD is a completely data-driven technique, which makes it possible to estimate a trend of arbitrary shape as a sum of the low-frequency intrinsic mode functions it produces. Based on an empirical analysis of EMD, an automatic procedure is proposed to select the requisite intrinsic mode functions. The performance of EMD trend filtering is evaluated on simulated time series containing different forms of trend. We further compare it to two existing techniques, ℓ1 trend filtering and Hodrick–Prescott filtering, and observe that EMD trend filtering performs very similarly, while requiring no assumptions about the form of the trend and no tuning of estimation parameters. We also illustrate the performance of the technique on the S&P 500 index, as an example of a real-world time series.
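The idea of recovering the trend as the sum of low-frequency intrinsic mode functions can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' algorithm: the sifting loop is simplified (fixed iteration count, cubic-spline envelopes, no formal stopping criterion), and the paper's automatic IMF-selection procedure is replaced here by a hand-set cutoff `keep_from`.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_iters=10):
    """Extract one IMF by sifting: repeatedly subtract the mean of the
    upper and lower cubic-spline envelopes of the local extrema."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_iters):
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 4 or len(minima) < 4:
            break  # too few extrema to fit envelopes
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

def emd_trend(x, max_imfs=8, keep_from=2):
    """EMD trend filtering sketch: decompose x into IMFs (fast to slow),
    then keep only the low-frequency IMFs (index >= keep_from) plus the
    final residual as the trend estimate."""
    imfs, residual = [], x.astype(float).copy()
    for _ in range(max_imfs):
        # stop when the residual has too few oscillations to sift
        n_max = np.sum((residual[1:-1] > residual[:-2]) & (residual[1:-1] > residual[2:]))
        n_min = np.sum((residual[1:-1] < residual[:-2]) & (residual[1:-1] < residual[2:]))
        if n_max < 4 or n_min < 4:
            break
        imf = sift(residual)
        imfs.append(imf)
        residual = residual - imf
    return sum(imfs[keep_from:], np.zeros_like(residual)) + residual

# toy example: slow polynomial trend + fast oscillation + noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)
signal = t**2 + 0.1 * np.sin(2 * np.pi * 40 * t) + 0.01 * rng.standard_normal(512)
trend = emd_trend(signal)
```

Because the decomposition orders IMFs from fastest to slowest, discarding the first few IMFs removes the oscillatory components while the remaining IMFs and the residual carry the trend; the cutoff `keep_from` is exactly what the paper's automatic selection procedure determines from the data.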
Deep Convolutional Neural Networks (CNNs) have repeatedly been shown to perform well on image classification tasks. Object detection methods, however, still leave significant room for improvement. In this paper, we propose a new framework called Ventral-Dorsal Networks (VDNets), inspired by the structure of the human visual system. Roughly, the visual input signal is analyzed along two separate neural streams, one in the temporal lobe and the other in the parietal lobe. The coarse functional distinction between these streams is between object recognition (the "what" of the signal) and extracting location-related information (the "where" of the signal). The ventral pathway from primary visual cortex, entering the temporal lobe, is dominated by "what" information, while the dorsal pathway, into the parietal lobe, is dominated by "where" information. Inspired by this structure, we propose the integration of a complementary "Ventral Network" and "Dorsal Network": information about object identity can guide localization, and location information can guide attention to relevant image regions, improving object recognition. This dual-network framework sharpens the focus of object detection. Our experimental results reveal that the proposed method outperforms state-of-the-art object detection approaches by 8% mAP on PASCAL VOC 2007 and by 3% mAP on PASCAL VOC 2012. Moreover, a comparison of techniques on Yearbook images displays substantial qualitative and quantitative benefits of VDNet.
Figure 1. Primary visual cortex and two processing streams. The "ventral stream" projects into the temporal lobe, and the "dorsal stream" extends into the parietal lobe. These interacting pathways inspire the use of a "Ventral Net" and a "Dorsal Net".
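The division of labor between the two streams can be caricatured in a few lines of code. This is a toy sketch, not the paper's architecture: `dorsal_mask` and `ventral_classify` below are hypothetical stand-ins (simple intensity thresholding and a fixed linear scorer) for the learned localization and recognition CNNs, and serve only to show the "where"-gates-"what" dataflow.

```python
import numpy as np

def dorsal_mask(image, threshold=0.5):
    """Hypothetical 'where' stream: produce a binary saliency mask over
    regions likely to contain an object (thresholding stands in for a
    learned localization network)."""
    saliency = image / (image.max() + 1e-8)
    return (saliency > threshold).astype(image.dtype)

def ventral_classify(features):
    """Hypothetical 'what' stream: score object classes from attended
    features (a fixed 2-class linear scorer stands in for a CNN)."""
    weights = np.array([[1.0, -1.0],
                        [-1.0, 1.0]])
    return weights @ features

def vdnet_forward(image):
    """Dorsal net selects *where* to look; ventral net decides *what*
    is there, operating only on the attended regions."""
    mask = dorsal_mask(image)
    attended = image * mask            # suppress irrelevant regions
    features = np.array([attended.mean(), attended.std()])
    return ventral_classify(features)

scores = vdnet_forward(np.random.default_rng(1).random((32, 32)))
```

The key design point illustrated here is that the ventral classifier never sees the unmasked image: localization output gates recognition input, which is the attention mechanism the abstract describes.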