“…boguslaw.obara@durham.ac.uk

arXiv:1904.05217v1 [q-bio.QM] 10 Apr 2019

variations upon thresholding-based segmentation, which are limited in performance by intensity inhomogeneity and nuclei/cell clustering [5]; H-minima transforms [8] and voting-based techniques [18], which both show good results but are sensitive to parameters; gradient vector flow tracking and thresholding [9,10], where such PDE-based methods require strong stopping and reinitialisation criteria, set in advance, to achieve smooth curves for tracking; Laplacian of Gaussian filters [17], which have low computational complexity but struggle with variation in the size, shape and rotation of objects within an image; and graph-cut optimisation approaches [12], which require finding initial seed points for each nucleus. Convolutional neural networks generally require copious amounts of manually labelled training data and struggle to separate overlapping objects [7].

In our experience, nuclei size (scale) is an important contributor to (1) whether or not an algorithm is successful, (2) how robust an algorithm is to image variation, and (3) the running time of an algorithm. We have also found that many excellent algorithms exist for detecting small blobs, on the scale of a few pixels in diameter, that fail or are less reliable for medium or large blobs, i.e.…”
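The scale sensitivity described above can be illustrated with a scale-normalised multi-scale Laplacian of Gaussian response. The sketch below, using SciPy, is our minimal illustration and not the method of any cited work; the function name and the synthetic blob are assumptions for demonstration. Multiplying the LoG response by sigma squared makes responses comparable across scales, so the filter scale that best matches the blob size wins.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def log_blob_response(image, sigmas):
    """Scale-normalised LoG responses, one 2D map per sigma.

    Bright blobs give strong negative LoG responses; negating makes
    blob centres positive peaks, and the sigma**2 factor normalises
    responses so different scales can be compared directly.
    """
    image = np.asarray(image, dtype=float)
    return np.stack([-(s ** 2) * gaussian_laplace(image, s) for s in sigmas])

# Hypothetical synthetic example: one Gaussian blob of width sigma_b = 4.
y, x = np.mgrid[0:64, 0:64]
blob = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 4.0 ** 2))

sigmas = [2.0, 4.0, 8.0]
responses = log_blob_response(blob, sigmas)
# The filter scale matching the blob width yields the strongest peak.
best = int(np.argmax(responses.max(axis=(1, 2))))
```

For a Gaussian blob the normalised response peaks when the filter scale matches the blob scale, which is why a single-scale LoG filter degrades as object size varies: detecting medium or large nuclei requires either larger (and costlier) kernels or an explicit search over scales as above.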