Brain machine interfaces (BMIs) have demonstrated successful decoding of arm reach movements over the past decades, providing new hope for restoring lost motor function to the disabled. In contrast, the more sophisticated hand grasp movements, which are fundamental to daily life, have received less attention. The current state of the art has identified grasp-related brain areas and reported offline decoding results; however, online decoding of grasp movements and real-time neuroprosthetic control have not been systematically investigated. In this study, we recorded neural data from the dorsal premotor cortex (PMd) while a monkey reached for and grasped one of four differently shaped objects following visual cues. The four grasp gesture types, together with an additional resting state, were classified asynchronously using a fuzzy k-nearest neighbor model, and an artificial hand was controlled online using a shared control strategy. The results showed that most PMd neurons were tuned to reach-and-grasp movements, yielding a high average offline decoding accuracy of 97.1%. In the online demonstration, the instantaneous status of the monkey's grasp was successfully extracted to control the artificial hand, with an event-wise accuracy of 85.1%. Overall, our results characterize neural firing along the time course of grasping and, for the first time, enable asynchronous neural control of a prosthetic hand, underlining the feasibility of a hand neural prosthesis in BMIs.

The loss of a hand seriously reduces a person's functional autonomy in daily living. A prosthesis is the most common way to restore the lost function; however, several barriers stand in the way of successful prosthesis use, the most important of which appears to be the development of a reliable interface capable of decoding the user's intention and conveying it to the prosthetic device.
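The abstract names a fuzzy k-nearest neighbor classifier for distinguishing the four grasp types plus rest; the paper's exact implementation details are not given here, but the standard fuzzy kNN rule (Keller-style inverse-distance class memberships) applied to hypothetical firing-rate vectors can be sketched as follows. The neuron count, cluster parameters, and class layout below are illustrative assumptions, not values from the study.

```python
import numpy as np

def fuzzy_knn_predict(X_train, y_train, x, k=5, m=2.0, n_classes=5):
    """Fuzzy kNN: class memberships among the k nearest neighbors are
    weighted by inverse distance raised to 2/(m-1), with fuzzifier m."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    # Guard against zero distance, then apply inverse-distance weighting.
    w = 1.0 / np.maximum(d[nearest], 1e-12) ** (2.0 / (m - 1.0))
    memberships = np.zeros(n_classes)
    for weight, i in zip(w, nearest):
        memberships[y_train[i]] += weight
    memberships /= memberships.sum()
    return int(np.argmax(memberships)), memberships

# Toy data: 5 classes (4 grasp types + rest) as Gaussian clusters of
# hypothetical firing-rate vectors from 8 assumed neurons.
rng = np.random.default_rng(0)
centers = rng.uniform(0, 20, size=(5, 8))
X = np.vstack([c + rng.normal(0, 1.0, size=(20, 8)) for c in centers])
y = np.repeat(np.arange(5), 20)

label, u = fuzzy_knn_predict(X, y, centers[2])
```

The membership vector `u` (rather than a hard label alone) is what makes the fuzzy variant attractive for asynchronous decoding: a low maximum membership can be treated as "no confident grasp detected", which maps naturally onto the resting state.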
Brain machine interfaces (BMIs) provide new hope for restoring motor function to the severely disabled by controlling prostheses with intentional commands extracted from brain signals. Decoding motor cortex activity for robotic arm and screen cursor control in two or three dimensions has been demonstrated successfully in human and non-human primates over the past decades [1][2][3]. However, there have been few investigations of movement decoding for restoring hand function, which poses a great challenge due to its higher number of degrees of freedom (DoFs). The hand is a marvelous example of how a complex biomechanism can be implemented, with effective combinations of mechanisms, sensing, actuation, and cortical control coordinated across 38 muscles and 22 DoFs [4]. Given this complex architecture, recent studies have shown how dexterous grasps are represented and transformed into motor commands in distinct brain regions. Several cortical