Previously developed MR-based three-dimensional (3D) Fricke-xylenol orange (FXG) dosimeters can provide end-to-end quality assurance and validation protocols for pre-clinical radiation platforms. FXG dosimeters quantify the radiation-induced oxidation of ferrous (Fe2+) to ferric (Fe3+) ions using pre- and post-irradiation MR imaging methods that detect the resulting changes in the spin-lattice relaxation rate (R1 = 1/T1). Chemical changes in MR-based FXG dosimeters that occur over time and with changes in temperature can decrease dosimetric accuracy if they are not properly characterized and corrected. This paper describes the characterization, development and utilization of an empirical model-based correction algorithm for time and temperature effects in the context of a pre-clinical irradiator and a 7 T pre-clinical MR imaging system. Time- and temperature-dependent changes in R1 were characterized using variable-TR spin-echo imaging. R1-time and R1-temperature dependencies were fit using non-linear least-squares methods, and the fitted models were validated using leave-one-out cross-validation and resampling. Subsequently, a correction algorithm was developed that employed the fitted empirical models to predict and reduce baseline R1 shifts that occurred in the presence of time and temperature changes. The correction algorithm was tested on R1-dose response curves and 3D dose distributions delivered using a small animal irradiator at 225 kVp. The correction algorithm reduced baseline R1 shifts from -2.8 × 10 s-1 to 1.5 × 10 s-1. In terms of absolute dosimetric performance as assessed with traceable standards, the correction algorithm reduced dose discrepancies from approximately 3% to approximately 0.5% (2.90 ± 2.08% to 0.20 ± 0.07%, and 2.68 ± 1.84% to 0.46 ± 0.37% for the 10 × 10 and 8 × 12 mm fields, respectively).
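The workflow above (fit empirical baseline models, validate them by leave-one-out cross-validation, then subtract the predicted baseline R1 shift from post-irradiation measurements) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the linear time-drift and linear temperature models, the reference temperature, and all function names are assumptions, since the abstract does not give the actual empirical model forms.

```python
import numpy as np
from scipy.optimize import curve_fit

TEMP_REF_C = 20.0  # assumed reference temperature (hypothetical)

# Hypothetical empirical baseline models: linear R1 drift with elapsed
# time, and linear R1 dependence on temperature offset.
def r1_time_model(t_hours, r1_0, drift_rate):
    return r1_0 + drift_rate * t_hours

def r1_temp_model(temp_c, r1_ref, temp_coeff):
    return r1_ref + temp_coeff * (temp_c - TEMP_REF_C)

def fit_baseline_models(t_hours, r1_vs_time, temp_c, r1_vs_temp):
    """Fit both empirical baseline models by non-linear least squares."""
    p_time, _ = curve_fit(r1_time_model, t_hours, r1_vs_time)
    p_temp, _ = curve_fit(r1_temp_model, temp_c, r1_vs_temp)
    return p_time, p_temp

def loo_rmse(model, x, y):
    """Leave-one-out cross-validation RMSE for one fitted model."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        p, _ = curve_fit(model, x[mask], y[mask])
        errs.append(model(x[i], *p) - y[i])
    return float(np.sqrt(np.mean(np.square(errs))))

def correct_r1(r1_post, dt_hours, dtemp_c, p_time, p_temp):
    """Subtract the predicted baseline R1 shift accumulated between the
    pre- and post-irradiation scans (elapsed time and temperature change),
    leaving the dose-induced R1 change."""
    baseline_shift = p_time[1] * dt_hours + p_temp[1] * dtemp_c
    return r1_post - baseline_shift
```

In this sketch the dose-response calibration itself is untouched; only the dose-independent baseline drift is removed before the R1-to-dose conversion.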
Chemical changes in MR-based FXG dosimeters produce time- and temperature-dependent R1 values over the time intervals and temperature changes found in a typical small animal imaging and irradiation laboratory setting. These changes cause baseline R1 shifts that negatively affect dosimetric accuracy. Characterization, modeling and correction of these effects improved in-field reported dose accuracy to better than 1% when compared to standardized ion chamber measurements.