Gout is a chronic illness that has long been underappreciated and poorly understood. Despite being more prevalent than rheumatoid arthritis and lupus, 1,2 gout has traditionally lagged behind these diseases in research and quality of care. Moreover, poor clinical management of gout is associated with increased cardiovascular morbidity and mortality, chronic kidney disease, and metabolic syndrome; these associations have been described as forming part of a comorbidity cluster in need of treatment optimization. 3

The intensely painful attacks of inflammatory arthritis known as gout flares are the main clinical manifestation of the disease, and reducing their frequency and intensity is what matters most to patients. 4 Serum urate levels above the saturation point (6.8 mg/dL at 37 °C, the definition of hyperuricemia) and the resulting deposition of monosodium urate crystals in joints and periarticular tissues activate the innate immune system and produce the acute arthritis characteristic of gout. 5

Ever since uric acid was recognized as central to the pathogenesis of gout, an association between circulating serum urate levels and the clinical manifestations of gout has been assumed. The evidence supporting this assumption, however, has been scarce. The issue became more controversial when updated clinical guidelines for gout management contradicted the major rheumatology recommendations, which remained based on treating to a serum urate target with urate-lowering therapies. 6-8 Since then, concerns about the value of serum urate as a biomarker of gout severity and treatment response have grown, even though serum urate has served this purpose in rheumatology practice for decades.
Consequently, generating evidence about the best strategy for secondary prevention of the clinical manifestations of gout is a priority.

In this issue of JAMA, McCormick et al used data from the UK Biobank to study the association between baseline serum urate level and the subsequent risk of flares among 3613 patients with gout followed up for a mean of 8.3 years. 9 The primary analysis used a reference serum urate category of less than 6.0 mg/dL (the treat-to-target threshold recommended by most treatment guidelines) and estimated rate ratios of gout flares for 1.0-mg/dL increments above the reference value, up to 10.0 mg/dL or higher. All categories above 6.0 mg/dL were associated with significantly increased rate ratios of gout flares, from 3.37 for the 6.0 to 6.9 mg/dL category to 11.42 for the 10.0 mg/dL or higher category. These increased rate ratio associations did not vary meaningfully