This paper introduces a factory-oriented visual SLAM algorithm that combines deep learning with feature point filtering to address inaccurate and biased positioning in industrial environments. Tailored to the characteristics of factory settings in which AGV robots operate, the approach distinguishes between the dynamic and static entities commonly encountered there. Deep learning is first applied to detect potential dynamic objects in the scene, providing prior information; a feature point filtering algorithm then removes feature points likely to introduce interference. This more principled removal of dynamic feature points improves the positioning accuracy and robustness of the visual SLAM system in dynamic factory environments. Experimental results show that the proposed algorithm achieves higher positioning and mapping accuracy than ORB-SLAM2 and DS-SLAM in factory settings, a step toward more autonomous and reliable AGV robots in industrial applications.
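The two-stage pipeline described above (a detector supplies dynamic-object priors, and feature points falling inside those regions are discarded before pose estimation) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the use of axis-aligned bounding boxes as the dynamic-region prior, and the simple point-in-box test are all assumptions, since the abstract does not specify the exact filtering criterion.

```python
import numpy as np

def filter_dynamic_points(keypoints, dynamic_boxes):
    """Remove feature points that fall inside detected dynamic-object regions.

    keypoints:     (N, 2) array of (x, y) pixel coordinates from a feature
                   extractor (e.g. ORB keypoints).
    dynamic_boxes: iterable of (x1, y1, x2, y2) boxes output by a deep
                   learning detector for potentially dynamic objects.
    Returns the subset of keypoints treated as static.
    """
    keep = np.ones(len(keypoints), dtype=bool)
    for x1, y1, x2, y2 in dynamic_boxes:
        inside = ((keypoints[:, 0] >= x1) & (keypoints[:, 0] <= x2) &
                  (keypoints[:, 1] >= y1) & (keypoints[:, 1] <= y2))
        keep &= ~inside  # drop points inside any dynamic-object box
    return keypoints[keep]

# Example: one keypoint lies inside a detected dynamic region and is removed.
kps = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 90.0]])
static_kps = filter_dynamic_points(kps, [(40, 40, 60, 60)])
```

In a full system, the surviving static points would feed the tracking and mapping threads, while a more refined criterion (e.g. per-pixel masks or geometric consistency checks) could replace the coarse box test shown here.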