While research has explored the extent of gender bias and the barriers to women's inclusion on English-language Wikipedia, very little research has focused on racial bias within the encyclopedia. Despite advocacy groups' efforts to incrementally improve representation on Wikipedia, much remains unknown about how biographies are assessed after creation. Applying a combination of web-scraping, deep learning, natural language processing, and qualitative analysis to the pages of academics nominated for deletion on Wikipedia, we demonstrate that Wikipedia's notability guidelines are unequally applied across race and gender. We find that online presence predicts whether a Wikipedia page is kept or deleted for white male academics, but that this metric is applied idiosyncratically to female and BIPOC academics. Further, women's pages, regardless of race, were more likely to be deemed “too soon” for Wikipedia. A deeper analysis of the deletion archives reveals that when the “too soon” tag is applied to a woman's biography, it is often used outside the community guidelines, referring to the subject's career stage rather than her media and online coverage. We argue that awareness of these hidden biases on Wikipedia is critical to the objective and equitable application of the notability criteria across race and gender, both on the encyclopedia and beyond.
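The abstract's web-scraping step can be illustrated with a minimal sketch: the paper itself does not publish its collection code, so the following is only a hypothetical example of how deletion discussions for academics might be retrieved via the public MediaWiki API. The category name `Category:AfD debates (Academics and educators)` is an assumption for illustration and should be verified against Wikipedia's deletion-sorting categories before use.

```python
import itertools
import requests

API = "https://en.wikipedia.org/w/api.php"


def afd_pages(category):
    """Yield titles of AfD discussion pages listed in a deletion-sorting category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,  # assumed category name, for illustration only
        "cmlimit": 50,
        "format": "json",
    }
    while True:
        resp = requests.get(API, params=params, timeout=30).json()
        for member in resp["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in resp:
            break
        params.update(resp["continue"])  # standard MediaWiki continuation


def discussion_wikitext(title):
    """Fetch the raw wikitext of one AfD discussion for downstream text analysis."""
    params = {"action": "parse", "page": title, "prop": "wikitext", "format": "json"}
    resp = requests.get(API, params=params, timeout=30).json()
    return resp["parse"]["wikitext"]["*"]


if __name__ == "__main__":
    # Preview a handful of discussions; real collection would archive all of them.
    for title in itertools.islice(afd_pages("Category:AfD debates (Academics and educators)"), 5):
        print(title, len(discussion_wikitext(title)))
```

The retrieved wikitext could then feed the natural-language-processing and qualitative coding steps the abstract describes, for example by extracting the keep/delete rationales cited in each discussion.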