Dielectric measurements and interpretation, introduced with great promise in the 1980s, have not found widespread use owing to measurement limitations, moderate accuracy, and insufficient quality control. This paper presents a new-generation dielectric tool that overcomes these limitations and brings additional information for more accurate petrophysical formation evaluation. One of the revolutionary advances offered by this tool is the continuous measurement of dielectric dispersion (the variation of formation dielectric properties as a function of frequency) at 1-in. vertical resolution. The tool uses multi-spacing antenna arrays operating at multiple frequencies in the MHz-to-GHz range. Moreover, the transmitter and receiver antennas have collocated longitudinal and transverse polarizations. The wealth of recorded data allows interpreters both to adapt the answer products to the reservoir fluids and geology, and to provide acquisition quality control and error estimates on the results. Carbonate and heavy-oil reservoirs are of growing importance, and in these two environments the new tool provides unique answers for better reservoir characterization. The tool also addresses traditional dielectric formation evaluation in freshwater and thinly laminated sands.
Answer products offered by this new tool can be classified in three categories:

- Pore-fluid analysis from the multi-spacing, high-frequency measurements:
  - Hydrocarbon residual saturation and invaded-zone water salinity
  - Invasion profile: hydrocarbon saturation profile in heavy-oil reservoirs
- Matrix analysis from dielectric dispersion:
  - Carbonates: textural information (Archie cementation factor m)
  - Shaly sands: high-resolution clay volume and anisotropy
- Geological structure analysis from the multi-polarization, high-resolution measurements:
  - Thin-bed analysis
  - Structural anisotropy measurement in very thin beds
  - Geological feature extraction
  - Carbonate classification

This paper reviews the physics of dielectric dispersion, then describes the tool's architecture, measurements, and data-processing chain. Field-test examples illustrate the enhanced interpretation potential of the tool's dielectric dispersion measurements.

Introduction to dielectric dispersion physics

The permittivity quantifies the response of a medium to an electric-field excitation. Three main physical phenomena contribute to the permittivity: the displacement of the electronic cloud of atoms, the coherent orientation of pre-existing microscopic electric dipoles, and the polarization effect at interfaces. These phenomena are illustrated in Figure 1.
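The frequency dependence these mechanisms produce can be sketched with a standard Cole-Cole relaxation model. This is an illustrative choice on our part: the paper does not state which dispersion model the tool's inversion uses, and all parameter values below are assumptions picked only to show the MHz-to-GHz trend.

```python
import math

def cole_cole_permittivity(freq_hz, eps_inf=5.0, eps_static=30.0,
                           tau=1e-9, alpha=0.1):
    """Complex relative permittivity of a dispersive medium using the
    Cole-Cole relaxation model (all parameter values are illustrative)."""
    omega = 2 * math.pi * freq_hz
    return eps_inf + (eps_static - eps_inf) / (1 + (1j * omega * tau) ** (1 - alpha))

# The real part (stored energy) falls with frequency across the
# MHz-to-GHz band the tool covers; the imaginary part peaks near 1/tau.
for f in (1e6, 1e8, 1e9):
    eps = cole_cole_permittivity(f)
    print(f"{f:8.0e} Hz: eps' = {eps.real:6.2f}, eps'' = {-eps.imag:5.2f}")
```

Measuring permittivity at several frequencies and fitting such a dispersion curve is what lets the interpretation separate fluid effects from matrix texture effects.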
The goal of the See ColOr project is to achieve a noninvasive mobility aid for blind users that uses the auditory pathway to represent frontal image scenes in real time. We present and discuss two image-processing methods investigated in this work: image simplification by means of segmentation, and guiding the focus of attention through the computation of visual saliency. A mean-shift segmentation technique gave the best results, but to meet real-time constraints we implemented a simpler image quantisation method based on the HSL colour system. More particularly, we developed two prototypes which transform HSL-coloured pixels into spatialised classical-instrument sounds lasting 300 ms. Hue is sonified by the timbre of a musical instrument, saturation selects one of four possible notes, and luminosity is represented by a bass voice when the pixel is rather dark and by a singing voice when it is relatively bright. The first prototype is devoted to static images on the computer screen, while the second is built on a stereoscopic camera which estimates depth by triangulation. In the audio encoding, the distance to objects was quantised into four duration levels. Six blindfolded participants were trained to associate colours with musical instruments and then asked to locate, in several pictures, objects with specific shapes and colours. To simplify the experimental protocol, we used a tactile tablet in place of the camera. Overall, colour was helpful for the interpretation of image scenes. Moreover, preliminary results with the second prototype, consisting of the recognition of coloured balloons, were very encouraging. Image-processing techniques such as saliency could in the future accelerate the interpretation of sonified image scenes.
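The colour-to-sound quantisation described above can be sketched as follows. The instrument list, the note choices, and the brightness threshold are our own illustrative assumptions, not the project's actual mapping tables:

```python
def sonify_hsl(hue, saturation, luminosity):
    """Map one HSL pixel (components in [0, 1]) to sound parameters.

    Returns (timbre, note, voice): the instrument timbre encodes hue,
    one of four notes encodes saturation, and a bass or singing-voice
    layer encodes dark vs. bright luminosity.
    """
    # Hypothetical instrument table; the prototype's actual list may differ.
    instruments = ["oboe", "viola", "trumpet", "piano",
                   "saxophone", "violin", "flute"]
    timbre = instruments[min(int(hue * len(instruments)), len(instruments) - 1)]

    notes = ["C", "E", "G", "B"]  # four notes spanning the saturation range
    note = notes[min(int(saturation * len(notes)), len(notes) - 1)]

    voice = "bass" if luminosity < 0.5 else "singing voice"
    return timbre, note, voice

# A saturated, dark pixel at the red end of the hue circle:
print(sonify_hsl(0.0, 0.9, 0.2))
```

In the prototypes each such pixel sound lasts 300 ms and is spatialised, so a row of pixels is heard as a row of instruments placed left to right.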
The See ColOr interface transforms a small portion of a coloured video image into sound sources represented by spatialised musical instruments. Basically, the conversion of colours into sounds is achieved by quantisation of the HSL colour system. Our purpose is to provide visually impaired individuals with the capability of perceiving their environment in real time. In this work we present the system's design principles and several experiments carried out by blindfolded persons with See ColOr prototypes on static pictures displayed on a tablet and on simple video images. The goal of the first experiment was to identify the colours of the main features of static pictures and then to interpret the image scenes. Although learning all the instrument sounds in a single training session proved too difficult, participants found that colours helped narrow down the possible image interpretations. The experiments on the analysis of static pictures suggested that the slow-down factor incurred by using the auditory channel instead of the visual channel is of the same order of magnitude as the ratio of visual-channel capacity to auditory-channel capacity. Afterwards, two experiments based on a head-mounted camera were performed. The first, pertaining to object manipulation, is based on the pairing of coloured socks, while the second is related to outdoor navigation, with the goal of following a coloured serpentine painted on the ground. The "socks" experiment demonstrated that blindfolded individuals were able to accurately match pairs of coloured socks. The same participants, joined by a blind individual, successfully followed a red serpentine painted on the ground for more than 80 meters. Judging from task durations, the slow-down factor for the "socks" and "serpentine" experiments is of order one.
From a cognitive perspective, this is consistent with the fact that these two tasks are simpler than the interpretation of image scenes.
Although retinal neural implants have progressed considerably, they raise a number of questions concerning user acceptance, risk of rejection, and cost. For the time being we advocate a low-cost approach based on the transmission of limited visual information through the auditory channel. The See ColOr mobility aid for visually impaired individuals transforms a small portion of a coloured video image into sound sources represented by spatialised musical instruments. Basically, the conversion of colours into sounds is achieved by quantisation of the HSL colour system. Our purpose is to provide blind people with the capability of perceiving their environment in real time. The novelty of this work is the simultaneous sonification of colour and depth, the latter being coded by sound rhythm. The main drawback of our approach is that sonifying only a limited portion of a captured image yields limited perception. As a consequence, we propose to extend the local perception module with a new global perception module that provides the user with a clear picture of the characteristics of the entire scene. Finally, we present several experiments illustrating the local perception module: (1) detecting an open door in order to leave the office; (2) walking in a hallway and looking for a blue cabinet; (3) walking in a hallway and looking for a red tee-shirt; (4) avoiding two red obstacles; (5) moving outside and avoiding a parked car. Videos of the experiments are available at http://www.youtube.com/guidobologna.
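The depth-to-rhythm coding can be sketched as a simple quantiser. The four-level split follows the four duration levels mentioned for the stereoscopic prototype, but the depth range and the specific pulse intervals below are our illustrative assumptions:

```python
def depth_to_rhythm(depth_m, max_depth_m=4.0):
    """Quantise a triangulation depth estimate into one of four rhythm
    levels (illustrative thresholds): nearer objects pulse faster.

    Returns (level, interval_ms) where level is 0 (nearest) to 3
    (farthest) and interval_ms is the inter-onset interval of the
    repeated instrument sound.
    """
    levels = 4
    # Clamp the estimate to the assumed sensing range, then bin it.
    d = max(0.0, min(depth_m, max_depth_m))
    level = min(int(d / max_depth_m * levels), levels - 1)
    # Shorter interval (faster rhythm) for nearer objects.
    interval_ms = 150 * (level + 1)  # 150, 300, 450 or 600 ms
    return level, interval_ms

print(depth_to_rhythm(0.5))   # nearby obstacle: fastest pulse
print(depth_to_rhythm(10.0))  # beyond range: clamped to slowest pulse
```

Coding depth as rhythm leaves timbre, pitch, and voice free to carry the colour information, so both attributes of a pixel can be heard at once.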