Assigning cell identities in dense image stacks is critical for many applications, including comparing data across animals and experimental conditions and investigating properties of specific cells. Conventional methods are laborious, require experience, and can introduce bias. We present a generalizable framework based on Conditional Random Fields (CRF) models for automatic cell identification. This approach searches for the optimal arrangement of labels that maximally preserves prior knowledge such as geometrical relationships. The algorithm shows better accuracy and more robust handling of perturbations, e.g. missing cells and position variability, on both synthetic and experimental ground-truth data. The framework generalizes across strains and imaging conditions, and easily builds and utilizes active data-driven atlases, which further improve accuracy. We demonstrate its utility in gene-expression pattern analysis, multi-cellular calcium imaging, and whole-brain imaging experiments. Thus, our framework is highly valuable for a wide variety of annotation scenarios, including in zebrafish, Drosophila, Hydra, and mouse brains.

[…] define cell-specific features (unary potentials) and co-dependent features (pairwise potentials) in the model. The basic model uses several pairwise relationship features for all pairs of cells, including binary positional relationships, angular relationships, and the Gromov-Wasserstein discrepancy between cells in the image and an atlas. By encoding these features among all pairs of cells, our fully connected CRF model accounts for label dependencies between each cell pair to maximize accuracy. Third, identities are automatically predicted for all neurons iteratively, taking into account neurons missing in the image stack (Supplementary Note 1.4). Duplicate assignments are handled by calculating a label-consistency score for each neuron, removing
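As a minimal sketch of how unary and pairwise potentials compose into a fully connected CRF score, consider the following illustration. The function names and the specific potentials are hypothetical simplifications: the unary term here is just a negative squared distance to the candidate atlas position, and the pairwise term rewards preserving binary positional relationships (coordinate-wise ordering) between image and atlas; the paper's actual model also includes angular relationships and the Gromov-Wasserstein discrepancy.

```python
import numpy as np

def unary_potential(cell_pos, atlas_pos):
    # Illustrative stand-in for a cell-specific feature: negative squared
    # distance between an imaged cell and its candidate atlas label.
    return -np.sum((cell_pos - atlas_pos) ** 2)

def pairwise_potential(pos_i, pos_j, atlas_i, atlas_j):
    # Illustrative binary positional relationship: reward label pairs whose
    # coordinate-wise ordering in the image matches the ordering in the atlas.
    img_rel = np.sign(pos_i - pos_j)
    atlas_rel = np.sign(atlas_i - atlas_j)
    return np.sum(img_rel == atlas_rel)

def crf_score(assignment, cell_pos, atlas_pos):
    # Total score of one labeling: unary terms for every cell, plus pairwise
    # terms over all cell pairs (the model is fully connected).
    n = len(assignment)
    score = sum(unary_potential(cell_pos[i], atlas_pos[assignment[i]])
                for i in range(n))
    for i in range(n):
        for j in range(i + 1, n):
            score += pairwise_potential(cell_pos[i], cell_pos[j],
                                        atlas_pos[assignment[i]],
                                        atlas_pos[assignment[j]])
    return score
```

Under this toy scoring, a labeling that preserves the atlas geometry scores higher than one that swaps two cells, which is the intuition behind searching over arrangements of labels rather than labeling each cell independently.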