See article by Kawabata et al., pages 736–743 of this issue.

For 6 decades, health care professionals and patients felt "in control" of oral anticoagulation. Warfarin dosing was titrated to values of prothrombin time (PT) within a range of international normalized ratios (INRs). Lamentably, this control was, in part, an illusion. Dose adjustments were made according to art and experience to retrospectively correct abnormal INRs that had been out of range for an unknown period between the last 2 INR measurements. Too many patients found frequent INR measurements objectionable and often avoided testing. Dose adjustments were not governed by knowledge of the patient's personal pharmacology, and the pharmacodynamic responses to dose adjustments were delayed because the site of warfarin action is well upstream of the coagulation event in a complex cascade of clotting factors.

To achieve appropriate anticoagulation, the initial warfarin dose had to be found by trial and error to compensate for each patient's genetically bestowed profile of warfarin metabolism and vitamin K epoxide sensitivity. Once anticoagulation was established, patients' dose requirements could change abruptly under the influence of disease, drug interactions, or altered vitamin K exposure. This required skilled attention to changes in the patient's disease-therapy environment, or luck in timing INR measurements to guide dose adjustments before outside factors could shift the INR and possibly cause bleeding or thrombosis. Nevertheless, with no alternative oral anticoagulant available, we had no choice but to become moderately adept at INR monitoring. Given these challenges, an average time in the therapeutic INR range of 50% was the norm to be expected.