Lens-free on-chip digital holographic microscopy (LFOCDHM) is crucial for biomedical applications such as cell cycle assays, drug development, digital pathology, and high-throughput biological screening. Its unit-magnification configuration yields a field of view (FOV) containing over a hundred times more cells than a conventional 10× microscope objective, which makes manual segmentation labor-intensive and time-consuming given the complex and variable cell morphology. Although many deep learning-based cell segmentation methods exist, convolutional neural networks (CNNs) have limited, localized receptive fields and are ill-suited to the large-FOV images produced by LFOCDHM. We propose Swin Transformer U-Net (STU-Net), a high-throughput live cell analysis method. It computes self-attention within shifted windows and extracts features at five scales, achieving accurate cell segmentation (accuracy > 0.9743). We validated STU-Net's robustness and generalizability on HeLa cell slides across the full FOV in vitro. By quantifying cell growth and proliferation from the segmentation results, this approach offers a strong foundation for drug development and biological screening.
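The shifted-window self-attention mentioned above partitions the feature map into non-overlapping windows, computes attention within each window, and cyclically shifts the map between layers so information crosses window borders. The following is a minimal illustrative sketch of that partitioning step in NumPy (the function name `window_partition`, the toy feature-map sizes, and the window size are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def window_partition(x, ws):
    """Split an (H, W, C) feature map into non-overlapping ws x ws windows.

    Returns an array of shape (num_windows, ws*ws, C); self-attention is
    then computed independently among the ws*ws tokens of each window.
    """
    H, W, C = x.shape
    x = x.reshape(H // ws, ws, W // ws, ws, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, ws * ws, C)

# Toy feature map: 8x8 spatial grid with 4 channels, window size 4
# (sizes chosen for illustration only).
feat = np.arange(8 * 8 * 4, dtype=np.float32).reshape(8, 8, 4)
wins = window_partition(feat, 4)
print(wins.shape)  # (4, 16, 4): 4 windows of 16 tokens each

# "Shifted" windows: cyclically roll the map by ws//2 before
# partitioning, so tokens near window borders can attend across
# the boundaries of the previous layer's partition.
shifted = np.roll(feat, shift=(-2, -2), axis=(0, 1))
wins_shifted = window_partition(shifted, 4)
print(wins_shifted.shape)  # (4, 16, 4)
```

Alternating plain and shifted partitions like this gives window attention a growing effective receptive field at linear cost, which is what makes it tractable over the very large FOV images produced by LFOCDHM.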