“…In summary, our contributions are: (i) to the best of our knowledge, ours is the first work to study prompt learning for few-shot cross-domain NER; (ii) we develop a mutual information-based approach to identify important entity type-related features from the source domain; (iii) we design a two-stage scheme that generates and incorporates a prompt that is highly relevant to the source domain for each new example, effectively mitigating the gap between source and unseen domains; and (iv) experimental results show that our proposed PLTR achieves state-of-the-art performance on both in-domain and cross-domain datasets. … (… et al., 2013; Lee et al., 2018; Yang et al., 2017; Jia et al., 2019; Jia and Zhang, 2020; Zheng et al., 2022; Hu et al., 2023; Wang et al., 2021) … (Ma et al., 2022; Lee et al., 2022; Das et al., 2022; Chen et al., 2022b; Dong et al., 2023; Fang et al., 2023). In particular, Das et al. (2022) incorporate contrastive learning techniques with prompts to better capture label dependencies.…”
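Contribution (ii) mentions scoring source-domain features by mutual information with entity types. The excerpt does not give PLTR's actual formulation, so the following is only a minimal generic sketch of that idea: computing the mutual information I(F; Y) between a discrete feature (here, a hypothetical binary "token appears in the sentence" indicator) and entity-type labels, so that high-MI features can be ranked as type-related. All variable names and the toy data are illustrative assumptions.

```python
import math
from collections import Counter

def mutual_information(feature_vals, labels):
    """I(F; Y) in nats for two aligned discrete sequences.

    Estimated from empirical counts: sum over (f, y) pairs of
    p(f, y) * log( p(f, y) / (p(f) * p(y)) ).
    """
    n = len(labels)
    pf = Counter(feature_vals)          # marginal counts of feature values
    py = Counter(labels)                # marginal counts of labels
    pfy = Counter(zip(feature_vals, labels))  # joint counts
    mi = 0.0
    for (f, y), c in pfy.items():
        p_joint = c / n
        # p_joint / (p(f) * p(y)) written with counts to avoid extra divisions
        mi += p_joint * math.log(p_joint * n * n / (pf[f] * py[y]))
    return mi

# Hypothetical toy data: whether the token "Inc." occurs in a sentence (1/0)
# versus the entity type present in that sentence. The feature perfectly
# predicts the label here, so I(F; Y) = log 2 (both variables are uniform
# and binary).
feature = [1, 1, 0, 0, 1, 0]
labels = ["ORG", "ORG", "PER", "PER", "ORG", "PER"]
print(mutual_information(feature, labels))  # → 0.693... (log 2 nats)
```

In a feature-selection setting one would compute this score for every candidate feature and keep the top-scoring ones; a perfectly uninformative feature scores 0, and stronger associations score higher.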