The release of ChatGPT, a natural-language platform by OpenAI, has taken the industry by storm. It can understand and generate human-like responses on a wide range of topics with remarkable accuracy, including answering questions, writing essays, solving mathematics problems, writing code, and even assisting with everyday tasks. However, like any other AI-powered platform, it is prone to various biases. This work reviews some of the biases ChatGPT has exhibited since its release. While biases come in many forms, we focus on those related to race, gender, religious affiliation, political ideology, and fairness, and examine how ChatGPT responds in scenarios corresponding to these biases as they appear in the real world.