Background and Objectives: A quantitative understanding of how red blood cell (RBC) lifespan, initial RBC removal and transfusion interval affect patient haemoglobin (Hb) levels and total iron exposure has not been available for chronic transfusion scenarios. This article introduces the first model to help clinicians optimize chronic transfusion intervals to minimize transfusion frequency.

Materials and Methods: Hb levels and iron exposure from multiple transfusions were calculated from Weibull residual lifespan distributions, the fraction of effete RBCs removed within 24 h (X_e) and the nominal Hb increment. Two-unit RBC transfusions initiated at patient [Hb] = 7 g/dl were modelled for different RBC lifespans, transfusion intervals from 18 to 90 days and X_e from 0.1 to 0.5.
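As a rough illustration of the kind of calculation involved, the sketch below composes a Weibull residual-lifespan survival curve with an early removal fraction X_e and a nominal two-unit Hb increment. The Weibull shape and scale, the constant baseline [Hb] and the increment size are placeholder assumptions, not the fitted values used in the article.

```python
# Minimal sketch of a chronic-transfusion [Hb] model of the kind described
# above. Weibull shape/scale, baseline [Hb] and the 2-unit increment are
# illustrative assumptions, not the article's fitted parameters.
import numpy as np

def weibull_survival(t, shape=8.0, scale=120.0):
    """Survival function S(t) of a Weibull RBC lifespan (days)."""
    return np.exp(-(np.asarray(t, dtype=float) / scale) ** shape)

def residual_survival(t, shape=8.0, scale=120.0, grid_max=300.0, n=3001):
    """Residual-lifespan survival of transfused RBCs:
    S_R(t) = (integral of S(u) from t to inf) / (integral of S(u) from 0 to inf)."""
    u = np.linspace(0.0, grid_max, n)
    s = weibull_survival(u, shape, scale)
    du = u[1] - u[0]
    cum = np.concatenate(([0.0], np.cumsum(0.5 * (s[1:] + s[:-1]) * du)))
    return (cum[-1] - np.interp(t, u, cum)) / cum[-1]

def simulate_hb(interval, x_e, n_transfusions=12, hb_base=7.0,
                increment=2.0, horizon=None):
    """Patient [Hb] (g/dl) over time: an assumed constant baseline plus the
    surviving part of each two-unit increment. The fraction x_e of each
    increment is treated as removed immediately rather than over 24 h."""
    if horizon is None:
        horizon = interval * n_transfusions + 150
    t = np.arange(0.0, horizon, 1.0)
    hb = np.full_like(t, hb_base)
    for i in range(n_transfusions):
        age = t - i * interval
        alive = age >= 0.0
        hb[alive] += increment * (1.0 - x_e) * residual_survival(age[alive])
    return t, hb
```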
Results: Increasing X_e requires shorter transfusion intervals to achieve a steady-state [Hb] of 9 g/dl: 30 days between transfusions at X_e = 0.5, 36 days at X_e = 0.4, 42 days at X_e = 0.3, 48 days at X_e = 0.2 and 54 days at X_e = 0.1. The same transfusion-interval/X_e pairs result in a steady-state [Hb] of 8 g/dl when the RBC lifespan is halved. Reducing the loss of the transfused RBC increment from 30% to 10% decreased annual transfusions by 22% and iron addition by 24%. At higher values of X_e, an acute dose of iron occurs on the day after each transfusion event.

Conclusion: Systematic trends in the fractional Hb increment loss X_e have been modelled and have a significant and calculable impact on transfusion intervals and the associated introduction of iron.
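For instance, a scan over candidate intervals with the sketch above shows the same qualitative trend, namely that larger X_e forces shorter intervals to hold a given steady-state trough. The 9 g/dl target mirrors the quoted results, but the intervals this toy model prints depend entirely on the assumed parameters and will not reproduce the article's figures.

```python
# Illustrative scan (not the article's computation): for each X_e, find the
# longest interval on a 6-day grid whose steady-state trough stays >= 9 g/dl.
for x_e in (0.1, 0.2, 0.3, 0.4, 0.5):
    for interval in range(60, 17, -6):  # 60, 54, ..., 18 days
        t, hb = simulate_hb(interval=interval, x_e=x_e)
        # trough in a late inter-transfusion window (near steady state)
        window = (t >= 9 * interval) & (t < 10 * interval)
        if hb[window].min() >= 9.0:
            print(f"X_e = {x_e}: longest interval ~ {interval} days")
            break
```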