Gamified psychological interventions designed to improve the detection of online misinformation are becoming increasingly prevalent. Two of the most notable interventions of this kind are Bad News and Go Viral!. To assess their efficacy, prior research has typically used pre-post designs in which participants rated the reliability or manipulativeness of true and fake news items before and after playing these games, usually alongside a control group who played an unrelated game (e.g., Tetris) or did nothing at all. Mean ratings were then compared between pre-test and post-test and/or between the control and experimental conditions. Critically, these prior studies did not separate response bias (the overall tendency to respond “true” or “fake”) from discrimination (the ability to distinguish between true and fake news, commonly dubbed discernment). We reanalyzed the results of five prior studies using receiver operating characteristic (ROC) curves, a method unique to signal detection theory (SDT) that allows discrimination to be measured free from response bias. Across the studies, when comparable true and fake news items were used, we found that Bad News and Go Viral! did not improve discrimination but instead elicited more conservative responding (i.e., responding “true” less often). These novel findings suggest that current gamified psychological interventions designed to improve fake news detection are not as effective as previously thought. They also demonstrate the usefulness of ROC analysis, a method largely unexploited in this setting, for assessing the effectiveness of any intervention designed to improve fake news detection.
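To make the bias/discrimination distinction concrete, the sketch below (hypothetical ratings, not data from the reanalyzed studies) builds an empirical ROC from confidence ratings by sweeping the response criterion and computes its area with the trapezoidal rule. Two simulated responders give identically ordered ratings, but one shifts every rating down one scale point (more conservative responding); their ROC areas are nonetheless equal, which is exactly the bias-invariance that mean-rating comparisons lack.

```python
# Hypothetical illustration: ratings on a 1-6 scale, 6 = "definitely true".
# True items are "signal" trials, fake items are "noise" trials.

def roc_points(true_ratings, fake_ratings, n_levels=6):
    """Cumulative (false-alarm rate, hit rate) pairs, one per rating criterion,
    from the strictest criterion (only 6 counts as "true") to the laxest."""
    points = [(0.0, 0.0)]
    for c in range(n_levels, 0, -1):
        hits = sum(r >= c for r in true_ratings) / len(true_ratings)
        fas = sum(r >= c for r in fake_ratings) / len(fake_ratings)
        points.append((fas, hits))
    return points

def auc(points):
    """Trapezoidal area under the empirical ROC (0.5 = chance discrimination)."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Liberal responder: uses the upper end of the scale freely.
liberal_true = [6, 6, 5, 5, 4, 4]
liberal_fake = [5, 4, 4, 3, 2, 2]
# Conservative responder: same ordering, every rating shifted down one point.
conservative_true = [5, 5, 4, 4, 3, 3]
conservative_fake = [4, 3, 3, 2, 1, 1]

print(round(auc(roc_points(liberal_true, liberal_fake)), 3))            # → 0.861
print(round(auc(roc_points(conservative_true, conservative_fake)), 3))  # → 0.861
```

Because the criterion shift only relabels which ROC point each criterion produces, the area (discrimination) is unchanged even though the conservative responder says "true" less often; a simple comparison of mean ratings would instead register the shift as an apparent change in performance.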