Accurate personnel positioning in underground spaces requires positioning devices to be mounted on wearable equipment. However, wearable inertial positioning equipment moves with the wearer, so the measurement reference inevitably wobbles (the moving-base phenomenon), which introduces inertial measurement errors and degrades positioning accuracy. A neural-network-assisted binocular visual-inertial personnel positioning method is proposed to address this problem. Visual-inertial Simultaneous Localization and Mapping is first used to generate ground-truth information (position, velocity, acceleration, and gyroscope data). A trained neural network then regresses six-dimensional inertial measurement data from IMU data fragments captured under the moving base, and a position loss function is constructed from the regressed inertial data to reduce the inertial measurement error. Finally, with vision as the observation, point features and inertial measurements are tightly coupled in an optimization framework to improve personnel positioning accuracy. Real-scene experiments verify that the proposed method improves positioning accuracy: the positioning error of the proposed algorithm is 0.50%D, a 92.20% reduction under the moving base.
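To make the regression-plus-position-loss step concrete, the following is a minimal NumPy sketch, not the paper's implementation: all names, network shapes, and the zero-initial-velocity assumption are illustrative. A small MLP maps a window of raw IMU samples to one corrected six-dimensional measurement (3-axis accelerometer + 3-axis gyroscope), and a position loss penalizes the gap between the double-integrated corrected acceleration and a ground-truth displacement of the kind SLAM would provide.

```python
# Hypothetical sketch of the neural-network regression and position loss.
# Shapes, layer sizes, and the integration scheme are assumptions.
import numpy as np

rng = np.random.default_rng(0)
WIN, DT = 20, 0.01  # assumed window length (samples) and sample period (s)

# One-hidden-layer MLP: flattened IMU window (WIN*6) -> hidden -> 6-D output
W1 = rng.normal(0.0, 0.1, (WIN * 6, 32))
b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 6))
b2 = np.zeros(6)

def regress_imu(window):
    """Map a (WIN, 6) raw IMU fragment to one corrected 6-D measurement."""
    h = np.tanh(window.reshape(-1) @ W1 + b1)
    return h @ W2 + b2

def position_loss(acc_seq, gt_disp):
    """Double-integrate a (T, 3) acceleration sequence (zero initial
    velocity assumed) and return the squared gap to the ground-truth
    end displacement supplied by visual-inertial SLAM."""
    vel = np.cumsum(acc_seq, axis=0) * DT   # velocity by first integration
    pos = np.cumsum(vel, axis=0) * DT       # position by second integration
    return float(np.mean((pos[-1] - gt_disp) ** 2))

raw = rng.normal(0.0, 1.0, (WIN, 6))        # one raw IMU fragment
corrected = regress_imu(raw)                # regressed 6-D measurement
# Hold the regressed acceleration over the window to form a sequence.
loss = position_loss(np.tile(corrected[:3], (WIN, 1)), np.zeros(3))
print(corrected.shape, loss)
```

In training, the loss would be backpropagated through the network weights; the double integration makes small per-sample acceleration errors visible as accumulated position drift, which is exactly the error the moving base induces.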