We describe a new class of radiation sensor that uses optical interferometry to measure radiation-induced changes in the optical refractive index of a semiconductor sensing medium. Radiation absorbed in the sensor material produces a transient, non-equilibrium electron-hole pair distribution that locally modifies the complex optical refractive index. Changes in the real (imaginary) part of the local refractive index produce a differential phase shift (absorption) of an optical probe used to interrogate the sensor material. In contrast to conventional radiation detectors, whose signal levels are proportional to the incident energy, signal levels in these optical sensors are proportional to the incident radiation energy flux. This allows the sensor form factor to be reduced with no degradation in detection sensitivity. Furthermore, because the radiation-induced, non-equilibrium electron-hole pair distribution is effectively measured "in place," there is no need to spatially separate and collect the generated charges. Consequently, the sensor rise time is of the order of the hot-electron thermalization time (≤ 10 fs), and the duration of the index perturbation is set by the carrier recombination time, which is of order ~600 fs in direct-bandgap semiconductors with a high density of recombination defects; the optical sensors can therefore be engineered with sub-ps temporal response. A series of detectors was designed and incorporated into Mach-Zehnder and Fabry-Perot interferometer-based detection systems: proof-of-concept, lower-sensitivity Mach-Zehnder detectors were characterized at beamline 6.3 at SSRL, and three generations of high-sensitivity single-element and imaging Fabry-Perot detectors were measured at the LLNL Europa facility. Our results indicate that this technology can provide x-ray detectors and x-ray imaging systems with single-x-ray sensitivity and S/N ~ 30 at x-ray energies of ~10 keV.
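The free-carrier index perturbation underlying this sensing scheme can be estimated with the standard Drude model. The sketch below is illustrative only: the carrier density, effective-mass ratio, probe wavelength, and interaction length are assumed values chosen for scale, not parameters of the detectors described above.

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19   # electron charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
M_E = 9.1093837015e-31       # electron rest mass, kg
C_LIGHT = 2.99792458e8       # speed of light, m/s


def drude_delta_n(carrier_density, wavelength, n0, m_eff_ratio):
    """Drude-model change in the real refractive index from free carriers.

    carrier_density -- electron-hole pair density, m^-3
    wavelength      -- optical probe wavelength, m
    n0              -- unperturbed refractive index
    m_eff_ratio     -- carrier effective mass in units of the electron mass
    """
    omega = 2.0 * math.pi * C_LIGHT / wavelength          # probe angular frequency
    m_eff = m_eff_ratio * M_E
    return -E_CHARGE**2 * carrier_density / (2.0 * n0 * EPS0 * m_eff * omega**2)


def mz_phase_shift(delta_n, interaction_length, wavelength):
    """Differential phase (rad) accumulated by the probe over the perturbed region."""
    return 2.0 * math.pi * delta_n * interaction_length / wavelength


# Assumed illustrative values: 1e18 cm^-3 pairs, 1550 nm probe,
# n0 = 3.48, m* = 0.26 m_e, 100 um interaction length.
dn = drude_delta_n(1e24, 1.55e-6, 3.48, 0.26)
dphi = mz_phase_shift(dn, 100e-6, 1.55e-6)
print(f"delta_n ~ {dn:.2e}, phase shift ~ {dphi:.2f} rad")
```

With these assumed numbers the index change is of order 10^-3 and the accumulated phase a fraction of a radian, i.e. comfortably within the resolution of a Mach-Zehnder or Fabry-Perot readout; because the probe samples the carrier density directly, no charge transport enters the response time.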