This manuscript introduces a framework and algorithm that use joint-worn inertial sensors to capture and refine human joint-motion posture, improving the accuracy and stability of current motion-capture technology. Nine-axis motion data from the inertial sensor nodes are transmitted wirelessly to a host computer, where software visualizes the motion attitude in real time. The acquired data are denoised with a two-stage extended Kalman filter algorithm. Finally, single-node experiments validate both the feasibility of capturing motion attitude with the inertial nodes and the effectiveness of the proposed algorithm, substantially improving overall system stability.
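To make the filtering idea concrete, the following is a minimal, hypothetical sketch of extended Kalman filtering applied to inertial attitude data. It is not the paper's actual two-stage design (which is not specified here): it estimates a single tilt angle and gyroscope bias, predicting with the gyro rate and updating with a nonlinear gravity-direction measurement from the accelerometer. All parameter values (noise covariances, sampling period) are illustrative assumptions.

```python
import numpy as np

# Hypothetical single-axis EKF sketch, not the paper's exact algorithm.
# State x = [tilt angle theta (rad), gyro bias (rad/s)].
# Predict with the bias-corrected gyro rate; update with the accelerometer's
# gravity-direction measurement z = g * [sin(theta), cos(theta)].

G = 9.81   # gravity (m/s^2)
DT = 0.01  # assumed sampling period (s)

def ekf_step(x, P, gyro, z, Q, R):
    theta, bias = x
    # --- predict: integrate the bias-corrected gyro rate ---
    theta = theta + (gyro - bias) * DT
    x = np.array([theta, bias])
    F = np.array([[1.0, -DT],
                  [0.0, 1.0]])          # state-transition Jacobian
    P = F @ P @ F.T + Q
    # --- update: nonlinear gravity measurement, linearized Jacobian ---
    h = G * np.array([np.sin(theta), np.cos(theta)])
    H = np.array([[ G * np.cos(theta), 0.0],
                  [-G * np.sin(theta), 0.0]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - h)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated check: slowly oscillating motion, biased noisy gyro,
# noisy accelerometer; the filter should track the angle and recover the bias.
rng = np.random.default_rng(0)
true_bias = 0.05
x, P = np.array([0.0, 0.0]), np.eye(2)
Q = np.diag([1e-5, 1e-7])
R = np.eye(2) * 0.5

theta_true = 0.0
for k in range(2000):
    rate_true = 0.5 * np.sin(0.01 * k)
    theta_true += rate_true * DT
    gyro = rate_true + true_bias + rng.normal(0.0, 0.01)
    z = G * np.array([np.sin(theta_true), np.cos(theta_true)]) \
        + rng.normal(0.0, 0.3, size=2)
    x, P = ekf_step(x, P, gyro, z, Q, R)

angle_err = abs(x[0] - theta_true)   # residual angle error (rad)
bias_err = abs(x[1] - true_bias)     # residual bias error (rad/s)
print(angle_err, bias_err)
```

The same predict/update structure extends to full nine-axis orientation estimation with a quaternion state, at the cost of larger Jacobians.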