Synopsis
Over 9 million individuals donate blood annually in the US. Each whole blood donation removes 200 to 250 mg of iron, reflecting the loss of hemoglobin in red blood cells. This amount represents approximately 25% of the average iron stores in men and almost 75% of those in women. Because replenishment of iron stores takes many months, iron depletion is common, especially among frequent donors (e.g., more than 2 donations per year). In large epidemiologic studies, donation frequency, female gender, and younger age (reflecting menstrual status) are the factors most strongly associated with iron depletion. Currently, a minimum capillary hemoglobin of 12.5 g/dL is the sole iron-related requirement for donor qualification in the US, yet hemoglobin level is known to be a poor surrogate for low iron stores. In an effort to better identify and prevent iron deficiency, blood collection centers are now considering various strategies to manage donor iron loss, including changes in the acceptable hemoglobin level, donation interval, donation frequency, testing of iron status, and iron supplementation. This chapter highlights laboratory and genetic tests that assess the iron status of blood donors and their applicability as screening tests for blood donation.