“…due to their lack of multi-functional and intelligent capabilities and of sufficient external sensing information. In addition, with the widespread application and development of collaborative robots [16,17], robots will gradually move from traditional closed manufacturing environments to shared spaces in which they coexist and interact with humans [18], and from semi-automatic operation to autonomous task execution. This shift inevitably introduces various unpredictable anomalies, such as objects slipping, collisions between the end-effector and the environment, collisions with humans, and system faults. Therefore, to give robots longer-term autonomy and a safer human-machine collaborative environment, real-time multi-modal fusion modeling for accurate introspection of the robot's own movement behavior (movement-behavior identification, anomaly monitoring, and anomaly diagnosis), together with anomaly recovery, is essential for the next generation of intelligent collaborative robots.…”