Background: In recent years, relation extraction from unstructured text has become an important task in medical research. However, relation extraction requires a large labeled corpus, and manually annotating sequences is time-consuming and expensive. Therefore, efficient and economical annotation methods are needed to ensure the performance of relation extraction. Methods: This paper proposes an active learning method based on subsequences and distant supervision. Instead of the full sentences used in traditional active learning, the method selects information-rich subsequences as the sampling unit for annotation. In addition, it stores the labeled subsequence texts and their corresponding labels in a dictionary that is continuously updated and maintained, and pre-labels the unlabeled set through text matching, following the idea of distant supervision. Finally, the method combines a BERT-CRF model for relation extraction from Chinese medical texts. Results: Experiments on the CMeIE dataset show that the proposed method achieves the best results compared with existing methods, obtaining the best F1 scores under three different sampling strategies: 52.65%, 52.55%, and 51.37%, respectively.
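The dictionary-based pre-labeling step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`update_dictionary`, `pre_label`), the exact-substring matching rule, and the first-match policy are all assumptions made for the sketch.

```python
def update_dictionary(dictionary, subsequence, label):
    """Store a newly annotated subsequence and its relation label.

    The dictionary is continuously updated as active learning proceeds
    (hypothetical interface; the paper does not specify the API)."""
    dictionary[subsequence] = label
    return dictionary

def pre_label(dictionary, unlabeled_sentences):
    """Pre-label unlabeled sentences by text matching, in the spirit of
    distant supervision: any sentence containing a known labeled
    subsequence inherits that subsequence's label."""
    pre_labeled = []
    for sentence in unlabeled_sentences:
        for subseq, label in dictionary.items():
            if subseq in sentence:   # simple substring matching (assumption)
                pre_labeled.append((sentence, label))
                break                # first match wins (assumption)
    return pre_labeled

# Toy usage with an invented subsequence/label pair:
d = {}
update_dictionary(d, "导致", "causes")
matches = pre_label(d, ["糖尿病可导致肾病", "患者主诉头痛"])
```

In this toy run, only the first sentence contains the stored subsequence, so only it is pre-labeled; the second sentence remains in the unlabeled pool for active-learning selection.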