In recent years, Transformers have demonstrated significant potential in segmentation tasks, and self-attention is generally believed to be crucial to their performance. However, these strong results rely on large-scale imaging datasets, making such models unsuitable for low-data regimes such as lesion segmentation. In this paper, we propose PCFormer, which adopts a few-shot, data-friendly pre-trained convolution kernel to replace self-attention as the feature mixer. Specifically, we design a new information transmission layer that extracts features through convolution with pre-initialized kernel parameters and performs feature mapping through an MLP. To verify its effectiveness, we conducted experiments on two small datasets, CVC-ClinicDB and the 2018 Data Science Bowl challenge dataset. Surprisingly, our proposed scheme achieves a Dice Similarity Coefficient (DSC) of 93.17%, which is 19.18% and 12.64% higher than the two self-attention baselines, TransU-Net and SegFormer, respectively. These results indicate that PCFormer is a reliable choice for few-shot lesion segmentation.
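The abstract does not include code, but the core idea (replacing self-attention with a convolutional feature mixer followed by an MLP) can be illustrated with a minimal NumPy sketch. All names below are hypothetical, and the block structure (depthwise convolution for spatial mixing, pointwise MLP for channel mixing, each with a residual connection) is an assumption in the spirit of conv-mixer designs, not the paper's actual layer:

```python
import numpy as np

def depthwise_conv2d(x, kernels):
    """Per-channel 'same'-padded convolution.
    x: (C, H, W) feature map; kernels: (C, k, k) one filter per channel.
    In PCFormer's setting these kernels would be pre-trained/initialized."""
    C, H, W = x.shape
    k = kernels.shape[1]
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i + k, j:j + k] * kernels[c])
    return out

def channel_mlp(x, w1, b1, w2, b2):
    """Pointwise (per-pixel) MLP across channels: (C, H, W) -> (C, H, W)."""
    C, H, W = x.shape
    t = x.reshape(C, -1).T            # (H*W, C): each pixel is one token
    h = np.maximum(t @ w1 + b1, 0.0)  # hidden layer with ReLU (placeholder)
    y = h @ w2 + b2                   # project back to C channels
    return y.T.reshape(C, H, W)

def mixer_block(x, kernels, w1, b1, w2, b2):
    """One feature-mixing block: conv replaces self-attention for spatial
    mixing, then an MLP performs the feature mapping; residual connections
    around each sub-layer."""
    x = x + depthwise_conv2d(x, kernels)
    x = x + channel_mlp(x, w1, b1, w2, b2)
    return x

# Toy usage with random parameters: shapes are preserved end to end.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8, 8))            # 4 channels, 8x8 feature map
kernels = rng.normal(size=(4, 3, 3))      # one 3x3 filter per channel
w1, b1 = rng.normal(size=(4, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 4)), np.zeros(4)
y = mixer_block(x, kernels, w1, b1, w2, b2)
print(y.shape)  # (4, 8, 8)
```

The key design point, as the abstract argues, is that the spatial mixing step carries no quadratic attention cost and its convolution parameters can be initialized from pre-training, which is what makes the scheme viable in few-shot regimes.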