In recent years, virtual reality (VR) technology has grown by leaps and bounds, transforming a wide range of industries. At the same time, American Sign Language (ASL), a visual language, plays a crucial role in communication for people who are deaf, people with autism spectrum disorders, and people with speech and language disorders. Building on advances in both fields, this thesis presents a new approach to ASL gesture recognition using the Oculus Quest 2 VR headset. The application recognizes 15 one-handed static ASL letter gestures and provides a user-friendly interface with instructional support. In a study involving 15 participants, most gestures were recognized within 1-10 seconds and completed within 1-5 attempts. The aim is to adapt the approach outlined in this thesis to a range of applications, encouraging wider use of ASL and improving quality of life for the hearing-impaired community.