This paper presents a novel 'outside-in' hand tracking system for Virtual Reality (VR) and Augmented Reality (AR) interaction that uses an external camera, specifically a webcam. While current VR Head-Mounted Displays (HMDs) primarily employ 'inside-out' tracking that constrains hand positioning and can lead to user discomfort, our proposed system offers greater freedom in hand placement and supports a natural hand posture. With the potential to engage multiple users simultaneously, the system also enhances collaborative experiences in immersive 3D VR/AR spaces. The system captures the user's hand movements through a webcam, processes the frames with the MediaPipe 3D hand pose model, predicts gestures, and computes the hand's position and orientation. The proposed model achieved a gesture prediction accuracy of 99% in testing. Furthermore, a Unity3D demonstration showcased the system's ability to replicate precise hand articulations and perform tasks such as button pressing and cube stacking. Our approach addresses both usability and inclusivity challenges, offering a more ergonomic and economical alternative for VR/AR interaction.
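
To make the capture-and-landmarking stage of the pipeline concrete, the following is a minimal sketch, assuming OpenCV for webcam capture and the MediaPipe Hands solution for 3D landmarks; the gesture classifier and the hand position/orientation computation described in the abstract are the system's own downstream components and are only indicated by placeholder comments here.

```python
# Minimal sketch: webcam frames -> MediaPipe Hands -> 3D landmarks.
# The gesture classifier and pose estimator are assumed downstream
# components (not part of MediaPipe) and are not shown.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    cap = cv2.VideoCapture(0)  # external webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_world_landmarks:
            for hand in results.multi_hand_world_landmarks:
                # 21 landmarks with metric x, y, z coordinates relative to
                # the hand's approximate geometric centre.
                wrist = hand.landmark[mp_hands.HandLandmark.WRIST]
                # ...feed landmarks to the gesture classifier and the
                # position/orientation calculation (placeholders).
    cap.release()
```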