This letter proposes a novel human-robot coadaptation framework for robust and accurate user intent recognition, specifically in the context of automatic control of assistive robots, such as neural prosthetics and rehabilitation devices, driven by electrophysiological signals. Our goal is to incorporate user adaptability early in the training phase to facilitate both machine recognition and user adaptation, rather than relying solely on brute-force machine learning methods. The proposed framework incorporates biofeedback-driven user adaptive behavior into model training, while the machine adapts to those changes through online learning. Specifically, this study focuses on the recognition of two-degree-of-freedom simultaneous and continuous wrist movement intentions based on surface electromyogram (sEMG) array signals, and the performance is evaluated on twelve able-bodied subjects. The coadaptive evaluation experiment demonstrates the robustness of this method's control when sEMG electrode displacement is introduced as a perturbation. Experimental results show that this method reduces the completion time of centre-out tasks by 13% compared to conventional methods (Cohen's d=0.637) and removes 86% of the effect of electrode-shift perturbations. This study provides insights into the potential of incorporating human adaptability into machine intelligence to improve user intent recognition and automatic robot control.