Background: Microscopes form projected images of illuminated objects, such as cellular tissue, which are recorded at a distance through the optical system's field of view. A telescope on a satellite or airplane likewise forms images via a similar optical projection of objects on the ground. Typical visible illumination yields a displayed set of three color channels (red, green, blue [RGB]) combined from three image sensor arrays (e.g., focal plane arrays) into a single pixel coding for each color present in the image. Analysis of these RGB color images develops a qualitative image representation of the objects. Methods: Independent component analysis (ICA) is used for the analysis and enhancement of multispectral images and is compared with the widely used principal component analysis (PCA).
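As a rough illustration of the decomposition step, the sketch below runs PCA (via SVD) on synthetic multispectral pixel data; ICA would further rotate the whitened components toward statistical independence. All data and names here are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated multispectral image: 1000 pixels x 5 spectral bands,
# built from 2 latent "material" signatures plus weak noise (hypothetical data).
signatures = rng.normal(size=(2, 5))
abundances = rng.uniform(size=(1000, 2))
pixels = abundances @ signatures + 0.01 * rng.normal(size=(1000, 5))

# PCA via SVD of the mean-centered band matrix.
centered = pixels - pixels.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained_variance = S**2 / (len(pixels) - 1)

# With two latent sources, the first two components capture nearly all variance.
ratio = explained_variance[:2].sum() / explained_variance.sum()
print(round(ratio, 3))
```

PCA orders components by variance; ICA instead seeks statistically independent components, which is why the two methods can enhance different image features.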
Introduction: A vast amount of hyperspectral data is being collected for remote sensing applications. One of the many uses for these data is locating, identifying, and quantifying chemical plumes for evaluating environmental contributions at manufacturing sites. The analysis goal is finding pixels containing a gas with a given spectral signature in a hyperspectral image (HSI) data cube. A review and comparison of methods is provided by Young (2002). The 3-Dimensional Fast Fourier Transform Matched Filter (3DFFTMF) is a variant of the whitened matched filter (WMF), which, in general, finds a known pattern in a dataset with correlated noise. This correlated noise, uninteresting variation in the data that occurs in multiple spectral bands, is evident in Fig. 1. As is typical, these spectral bands (layers) are highly correlated and contain correlated clutter.
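The whitened matched filter underlying 3DFFTMF can be sketched in a few lines: the detection score correlates the known signature with each pixel after the clutter has been whitened by the inverse clutter covariance. The signature, covariance, and pixel values below are illustrative placeholders, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 5 spectral bands, a known gas signature s,
# and background clutter with a correlated covariance Sigma.
s = np.array([1.0, 0.8, 0.2, -0.3, 0.5])
A = rng.normal(size=(5, 5))
Sigma = A @ A.T + np.eye(5)          # correlated clutter covariance

# Whitened matched filter: score(x) = s^T Sigma^{-1} x / sqrt(s^T Sigma^{-1} s)
Sigma_inv = np.linalg.inv(Sigma)
norm = np.sqrt(s @ Sigma_inv @ s)

def wmf_score(x):
    return (s @ Sigma_inv @ x) / norm

# One pure-clutter pixel vs. the same pixel with an added plume signature.
L = np.linalg.cholesky(Sigma)
clutter = L @ rng.normal(size=5)
plume = clutter + 3.0 * s

print(wmf_score(plume) > wmf_score(clutter))  # True: the plume pixel ranks higher
```

Because the score is linear, adding the signature raises the score by a fixed positive amount regardless of the clutter realization, which is what makes the filter robust to correlated background variation.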
In 2005, the Government Accountability Office (GAO) performed a review of past U.S. responses to anthrax terrorist cases. Part of the motivation for this PNNL report was a major GAO finding that there was a lack of validated sampling strategies in the U.S. response to anthrax cases. The report (GAO 2005) recommended that probability-based methods be used for sampling design in order to address confidence in the results, particularly when all sample results showed no remaining contamination. The GAO also expressed a desire that the methods be validated, which is the main purpose of this PNNL report. The objective of this study was to validate probability-based statistical sampling designs, and the algorithms pertinent to within-building sampling, that allow an investigator to prescribe or evaluate confidence levels for conclusions based on data collected as guided by the statistical sampling designs. Specifically, the designs found in the Visual Sample Plan (VSP) software were evaluated. VSP was used to calculate the number of samples and the sample locations for a variety of sampling plans applied to an actual release site. Most of the sampling designs validated are probability based, meaning samples are located randomly (or on a randomly placed grid) and the number of samples is calculated such that, if the amount and spatial extent of contamination exceed levels of concern, at least one of the samples would be taken from a contaminated area at least X% of the time.
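The "at least one sample hits contamination, at least X% of the time" criterion has a simple closed form under simplifying assumptions (independent, uniformly placed samples); the sketch below illustrates the idea and is not the exact VSP algorithm.

```python
import math

def samples_for_detection(confidence, contaminated_fraction):
    """Smallest n such that, with independent uniform random sampling,
    at least one sample lands in the contaminated fraction with
    probability >= confidence. A simplified sketch, not the VSP algorithm."""
    # P(every sample misses) = (1 - p)^n, so require (1 - p)^n <= 1 - confidence.
    return math.ceil(math.log(1 - confidence) / math.log(1 - contaminated_fraction))

# 95% confidence of hitting contamination covering 5% of the surface:
print(samples_for_detection(0.95, 0.05))  # -> 59
```

Raising either the required confidence or shrinking the contaminated fraction of concern increases the sample count, which is the basic trade-off these designs quantify.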
Hence, "validation" of the statistical sampling algorithms is defined herein to mean ensuring that the "X%" (confidence) is actually met. The validation effort focused on four VSP sampling designs based on the following sampling objectives, which were deemed pertinent for sampling within a building after a chemical or biological attack.

• Upper Tolerance Limit Based Sampling - Statement that X% confident that at least Y% of surface area is below some quantitative contaminant limit where only random samples are obtained.
• Compliance Sampling - Statement that X% confident that at least Y% of surface area contains no detectable contamination where only random samples are obtained.
• Combined Judgment and Random Sampling - Statement that X% confident that at least Y% of surface area contains no detectable contamination where both random and judgmental samples are obtained.
• Hotspot Sampling - Statement that at least X% confident that any contaminated area greater than a given size and shape is sampled.

Validation was accomplished by first creating a "ground truth" building and data set. The ground-truth building and contaminant distribution were based on data from an actual building where a simulant had been released (Coronado Club in Albuquerque, NM). Contaminant estimates were derived for each 0.3 m x 0.3 m grid cell within the building using geostatistical modeling methods. Contaminant action levels were then varied to produce different ground-truth scenarios, making parts of the building more or less "contaminated," thereby changing th...
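The hotspot design above can be illustrated with a small Monte Carlo sketch: for a square sampling grid, the detection probability is the chance that a randomly placed circular hotspot contains at least one grid point. This is a simplified illustration under stated assumptions (square grid, circular hotspot), not the VSP implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

def hotspot_hit_probability(grid_spacing, hotspot_radius, trials=100_000):
    """Monte Carlo estimate of the chance that a square sampling grid with
    the given spacing intersects a circular hotspot of the given radius."""
    # By symmetry, drop the hotspot center uniformly in one grid cell and
    # measure the distance to the nearest of the four corner sample points.
    centers = rng.uniform(0, grid_spacing, size=(trials, 2))
    corners = np.array([[0.0, 0.0], [0.0, grid_spacing],
                        [grid_spacing, 0.0], [grid_spacing, grid_spacing]])
    d = np.linalg.norm(centers[:, None, :] - corners[None, :, :], axis=2).min(axis=1)
    return (d <= hotspot_radius).mean()

# A 1 m grid detects a hotspot of radius 0.6 m most of the time (~0.95):
p = hotspot_hit_probability(1.0, 0.6)
print(round(p, 2))
```

Tightening the grid spacing relative to the smallest hotspot of concern drives the detection probability toward the "at least X% confident" target.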
In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools: CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios, as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to develop effective sampling plans supporting the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to one), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations that model contaminant dispersion within a building. These predictions could then be used to identify priority zones within the building, and sampling designs and strategies could then be developed based on those zones.