DOI: 10.1109/ISBI53787.2023.10230664
IEEE 20th International Symposium on Biomedical Imaging (ISBI), Cartagena, Colombia, 2023, pp. 1-5
In time-lapse fluorescence imaging, single-particle tracking is a powerful tool to monitor the dynamics of objects of interest and extract information about biological processes. However, tracked particles can be subject to occlusion and intermittent detectability. When these phenomena persist over a few frames, tracking algorithms tend to produce multiple tracklets for the same particle. In this work, we introduce self-supervised learning of visual features to compare tracked particles, and we exploit both visual and positional distances to robustly stitch tracklets belonging to the same particle. We demonstrate the performance of our stitching framework on time-lapse fluorescence sequences of Hydra vulgaris neurons. Results show high stitching precision and a twofold reduction in errors compared with previous algorithms on the same data.
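The stitching idea described in the abstract, combining a positional distance with a visual (embedding) distance and then matching tracklet ends to tracklet starts, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the linear weighting `w_pos`, the cost threshold `max_cost`, and the use of an optimal one-to-one assignment are all assumptions made for the example.

```python
# Hedged sketch of tracklet stitching via combined positional and visual
# distances. All names, weights, and thresholds here are illustrative
# assumptions, not taken from the paper.
import numpy as np
from scipy.optimize import linear_sum_assignment

def stitch_tracklets(end_pos, start_pos, end_emb, start_emb,
                     w_pos=0.5, max_cost=1.0):
    """Match tracklets that terminate early to tracklets that appear later.

    end_pos   : (N, 2) last positions of terminating tracklets
    start_pos : (M, 2) first positions of later-appearing tracklets
    end_emb   : (N, D) L2-normalized visual features of the terminating ends
    start_emb : (M, D) L2-normalized visual features of the new starts
    Returns a list of (end_index, start_index) pairs to stitch.
    """
    # Euclidean positional distance, rescaled to [0, 1] for comparability.
    d_pos = np.linalg.norm(end_pos[:, None, :] - start_pos[None, :, :], axis=-1)
    d_pos = d_pos / (d_pos.max() + 1e-9)
    # Cosine distance between visual embeddings (assumed unit-norm).
    d_vis = 1.0 - end_emb @ start_emb.T
    # Linear combination of the two distances.
    cost = w_pos * d_pos + (1.0 - w_pos) * d_vis
    # Optimal one-to-one assignment; keep only sufficiently cheap matches.
    rows, cols = linear_sum_assignment(cost)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_cost]
```

On toy data, a tracklet ending near where another starts, with a similar visual feature, is paired with it, while pairs that are far apart both spatially and visually exceed the cost threshold and remain unstitched.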