Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as canopy conductance and transpiration. To address this need, we developed a hierarchical Bayesian State-Space Canopy Conductance (StaCC) model linking canopy conductance and transpiration to tree sap flux density from a 4-year experiment in the North Carolina Piedmont, USA. Our model builds on existing ecophysiological knowledge, but explicitly incorporates uncertainty in canopy conductance, internal tree hydraulics and observation error to improve estimation of canopy conductance responses to atmospheric drought (i.e., vapor pressure deficit), soil drought (i.e., soil moisture) and above canopy light. Our statistical framework not only predicted sap flux observations well, but it also allowed us to simultaneously gap-fill missing data as we made inference on canopy processes, marking a substantial advance over traditional methods. The predicted and observed sap flux data were highly correlated (mean sensor-level Pearson correlation coefficient = 0.88). Variations in canopy conductance and transpiration associated with environmental variation across days to years were many times greater than the variation associated with model uncertainties. Because some variables, such as vapor pressure deficit and soil moisture, were correlated at the scale of days to weeks, canopy conductance responses to individual environmental variables were difficult to interpret in isolation. Still, our results highlight the importance of accounting for uncertainty in models of ecophysiological and ecosystem function where the process of interest, canopy conductance in this case, is not observed directly. 
The StaCC modeling framework provides a statistically coherent approach to estimating canopy conductance and transpiration and propagating estimation uncertainty into ecosystem models, paving the way for improved prediction of water and carbon uptake responses to environmental change.
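The state-space idea at the core of this approach can be illustrated with a deliberately simplified sketch: a latent state (standing in for canopy conductance) evolves with process noise, a noisy observation (standing in for sap flux) is linear in that state, and a scalar Kalman filter recovers the state together with its uncertainty. All parameter values and the linear-Gaussian form below are invented for illustration; the actual StaCC model is hierarchical and Bayesian, not this toy filter.

```python
import numpy as np

# Hypothetical illustration only: names and values are invented, not StaCC.
rng = np.random.default_rng(0)
T = 200
phi, q, a, r = 0.9, 0.05, 1.0, 0.1   # AR(1) coef, process var, gain, obs var

# Simulate a latent "conductance" state and noisy "sap flux" observations.
g = np.zeros(T)
for t in range(1, T):
    g[t] = phi * g[t - 1] + rng.normal(0, np.sqrt(q))
y = a * g + rng.normal(0, np.sqrt(r), T)   # observation model

# Scalar Kalman filter: recover the latent state with quantified uncertainty.
m, P = 0.0, 1.0
means, variances = [], []
for t in range(T):
    # predict step
    m, P = phi * m, phi**2 * P + q
    # update step
    K = P * a / (a**2 * P + r)
    m = m + K * (y[t] - a * m)
    P = (1 - K * a) * P
    means.append(m)
    variances.append(P)

# Filtering should beat naively inverting the observation equation.
rmse_filtered = np.sqrt(np.mean((np.array(means) - g) ** 2))
rmse_raw = np.sqrt(np.mean((y / a - g) ** 2))
```

The point of the sketch is the separation of process noise, observation error, and state uncertainty, which is the same separation the abstract credits for improved inference.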
Recent developments suggest that predictive modeling could begin to play a larger role not only for data analysis, but also for data collection. We address the example of efficient wireless sensor networks, where inferential ecosystem models can be used to weigh the value of an observation against the cost of data collection. Transmission costs make observations "expensive"; networks will typically be deployed in remote locations without access to infrastructure (e.g., power). The capacity to sample intensively makes sensor networks valuable, but high-frequency data are informative only at specific times and locations. Sampling scales will range from meters and seconds to landscapes and years, depending on the process, the current states of the system, the uncertainty about those states, and the perceived potential for rapid change. Given that intensive sampling is sometimes critical, but more often wasteful, how do we develop tools to control the measurement and transmission processes? We address the potential of data collection controlled and/or supplemented by inferential ecosystem models. In a given model, the value of an observation can be evaluated in terms of its contribution to estimates of state variables and important parameters. More than one model will be applied to network data, with state variables including water, carbon, energy balance, biogeochemistry, tree ecophysiology, and forest demographic processes. The value of an observation will depend on the application. Inference is needed to weigh the contributions against transmission cost. Network control must be dynamic and driven by models capable of learning about both the environment and the network. We discuss application of Bayesian inference to model data from a developing sensor network as a basis for controlling the measurement and transmission processes.
Our examples involve soil moisture and sap flux, but we discuss broader application of the approach, including its implications for network design.
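The cost-benefit logic described above can be made concrete with a toy decision rule: under a scalar Gaussian state estimate, the variance reduction from one more observation is the Kalman gain times the predictive variance, and a node transmits only when that information gain outweighs the transmission cost. The function names and numerical values below are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of model-driven sampling control; all numbers invented.
def variance_reduction(P_pred, r):
    """Posterior-variance reduction from one observation with noise variance r."""
    K = P_pred / (P_pred + r)   # scalar Kalman gain
    return K * P_pred           # P_pred - P_post

def should_transmit(P_pred, r, cost, value_per_var=1.0):
    """Transmit only if the information gain outweighs the transmission cost."""
    return value_per_var * variance_reduction(P_pred, r) > cost

# The same observation becomes worth sending as state uncertainty grows.
print(should_transmit(P_pred=0.05, r=0.1, cost=0.04))  # False: little to learn
print(should_transmit(P_pred=0.50, r=0.1, cost=0.04))  # True: uncertainty is high
```

This captures the abstract's central claim in miniature: the value of a measurement is model-dependent and dynamic, so the decision to measure and transmit must be recomputed as uncertainty evolves.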
This paper explores integrated source-channel decoding, driven by wireless sensor network applications where correlated information acquired by the network is gathered at a destination node. The collection of coded measurements sent to the destination, called a source-channel product codeword, has redundancy due to both correlation of the measurements and the channel code used for each measurement. At the destination, source-channel (SC) decoding of this code combines decoding using (i) the deterministic structure of the channel-coded individual measurements and (ii) the probabilistic structure of a prior model, called the global model, that describes the correlation structure of the SC product codewords. We demonstrate the utility of SC decoding via MAP SC decoding experiments using a (7,4,3) Hamming code and a Gaussian global model. We also show that SC decoding can exploit even the simplest possible code, a single-parity check code, using a MAP SC decoder that integrates the parity check constraint and global model. We describe the design of a low-complexity message-passing decoder and show it can improve performance in the poor-quality channels often found in multi-hop wireless data-gathering sensor networks.
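A minimal sketch of the MAP decoding step, assuming a systematic (7,4) Hamming code over a binary symmetric channel and a toy Gaussian prior on the 4-bit message value standing in for the paper's global model. The generator matrix and all parameters below are illustrative, not the paper's experimental setup.

```python
import itertools
import math

import numpy as np

# Systematic generator matrix of a (7,4,3) Hamming code (illustrative choice).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def encode(msg):
    return (np.array(msg) @ G) % 2

def map_decode(received, p, mu, sigma):
    """argmax over all 16 codewords of BSC likelihood x Gaussian source prior."""
    best, best_score = None, -math.inf
    for msg in itertools.product([0, 1], repeat=4):
        d = int(np.sum(encode(msg) != received))          # Hamming distance
        log_lik = d * math.log(p) + (7 - d) * math.log(1 - p)
        value = int("".join(map(str, msg)), 2)            # message as integer
        log_prior = -0.5 * ((value - mu) / sigma) ** 2    # toy "global model"
        if log_lik + log_prior > best_score:
            best, best_score = msg, log_lik + log_prior
    return best

msg = (1, 0, 1, 1)
cw = encode(msg)
noisy = cw.copy()
noisy[2] ^= 1                                             # one channel bit flip
decoded = map_decode(noisy, p=0.05, mu=11.0, sigma=4.0)   # recovers (1, 0, 1, 1)
```

Exhaustive enumeration over the 16 codewords is only feasible for this tiny code; the low-complexity message-passing decoder the abstract describes is precisely what replaces this brute-force search at scale.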
The majority of bird and bat species are incapable of carrying tags that transmit their position to satellites. Given fundamental power requirements for such communication, burdened mass guidelines and battery technology, this constraint necessitates the continued use of very high frequency (VHF) radio beacons. As such, efforts should be made to mitigate their primary deficiencies: detection range, localization time and localization accuracy. The integration of a radiotelemetry system with an unmanned aerial vehicle (UAV) could significantly improve the capacity for data collection from VHF tags. We present a UAV‐integrated radiotelemetry system that relies on open source hardware and software. Localization methods, including signal processing, bearing estimation based on principal component analysis, localization techniques and test results, are discussed. Using a low‐power beacon applicable for bats and small birds, testing showed that the improved vantage of the UAV‐radiotelemetry system (UAV‐RT) provided significantly higher received signal power compared to the low‐level flights (maximum range beyond 1.4 km). Flight testing of localization methods showed median bearing errors between 2.3° and 6.8°, with localization errors of between 5% and 14% of the distance to the tag. In a direct comparison to an experienced radiotelemetry user, the UAV‐RT system provided bearing and localization estimates with 53% less error. This paper introduces the core functionality and use methods of the UAV‐RT system, while presenting baseline localization performance metrics. An associated website hosts plans for assembly and software installation. The methods of UAV‐RT use for tag detection will be further developed in future works. For both the detection and localization problems, the mobility of a flying asset drastically reduces tracker time requirements. A 7‐min flight would be sufficient to collect five equally spaced bearing estimates over a 1‐km transect. 
The use of a software‐defined radio on the UAV‐RT system will allow for the simultaneous detection and localization of multiple tags.
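A rough sketch of the PCA-based bearing idea, using a simulated cos²-shaped antenna pattern rather than UAV-RT data: the power-weighted covariance of heading unit vectors has its principal axis along the tag bearing, up to the 180° ambiguity inherent in any axis estimate. All values and the antenna model are assumptions for illustration.

```python
import numpy as np

# Hypothetical illustration: simulated headings and powers, not UAV-RT data.
rng = np.random.default_rng(1)
true_bearing = 40.0                                   # degrees
headings = np.arange(0, 360, 5.0)

# Toy antenna pattern: received power peaks toward the tag, plus noise.
power = np.cos(np.radians(headings - true_bearing)) ** 2
power = np.clip(power + 0.05 * rng.normal(size=headings.size), 0, None)

# Power-weighted covariance of heading unit vectors; its top eigenvector
# gives the bearing axis (principal component analysis).
u = np.stack([np.cos(np.radians(headings)), np.sin(np.radians(headings))])
C = (u * power) @ u.T
w, V = np.linalg.eigh(C)
axis = V[:, np.argmax(w)]                             # sign-ambiguous axis
est = np.degrees(np.arctan2(axis[1], axis[0])) % 180  # bearing modulo 180 deg
err = min(abs(est - true_bearing % 180),
          180 - abs(est - true_bearing % 180))        # wrapped bearing error
```

The 180° ambiguity of the principal axis is resolved in practice by additional information (e.g., comparing power fore and aft), which is outside the scope of this sketch.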
Modern computer microarchitectures build on well-established foundations that have encouraged a pattern of computational homogeneity that many cyberattacks depend on. We suggest that balanced ternary logic can be valuable for Internet of Things (IoT) security, authentication of connected vehicles, and hardware and software assurance. To demonstrate the value of balanced ternary systems, we developed a ternary encryption scheme between a computer and a smartcard based on public key exchange through non-secure communication channels. The concurrent generation of private keys by the computer and the smartcard uses ternary schemes and cryptographic primitives such as ternary physical unclonable functions. While general-purpose ternary computers have never achieved widespread use, heterogeneous computing systems with small ternary computing units dedicated to cryptographic functions have the potential to improve information assurance, and may also be designed to execute binary legacy code.
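Balanced ternary itself is easy to demonstrate. The sketch below encodes integers with digits {-1, 0, +1} and back; it is purely illustrative of the number system and is not the paper's key-exchange scheme.

```python
# Illustration of balanced ternary (digits -1, 0, +1); not the paper's scheme.
def to_balanced_ternary(n):
    """Encode an integer as balanced-ternary digits, least significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:          # a digit of 2 becomes -1 with a carry into n
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits

def from_balanced_ternary(digits):
    return sum(d * 3**i for i, d in enumerate(digits))

print(to_balanced_ternary(8))   # [-1, 0, 1]  since -1 + 0*3 + 1*9 = 8
```

A notable property visible here is that negation is just flipping every digit's sign, with no separate sign bit, one of the symmetries that makes balanced ternary attractive for the cryptographic primitives the abstract mentions.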