Unsupervised segmentation and classification of cervical cell images

Gençtav A., Aksoy S., Onder S.

PATTERN RECOGNITION, vol.45, no.12, pp.4151-4168, 2012 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 45 Issue: 12
  • Publication Date: 2012
  • Doi Number: 10.1016/j.patcog.2012.05.006
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.4151-4168
  • Keywords: Pap smear test, Cell grading, Automatic thresholding, Hierarchical segmentation, Multi-scale segmentation, Hierarchical clustering, Ranking, Optimal leaf ordering, CYTOPLAST CONTOUR DETECTOR, NUCLEUS SEGMENTATION, ALGORITHM
  • TED University Affiliated: No


The Pap smear test is a manual screening procedure used to detect precancerous changes in cervical cells based on the color and shape properties of their nuclei and cytoplasms. Automating this procedure remains an open problem due to the complexity of cell structures. In this paper, we propose an unsupervised approach for the segmentation and classification of cervical cells. The segmentation process involves automatic thresholding to separate the cell regions from the background, a multi-scale hierarchical segmentation algorithm to partition these regions based on homogeneity and circularity, and a binary classifier to finalize the separation of nuclei from cytoplasm within the cell regions. Classification is posed as a grouping problem: the cells are ranked based on feature characteristics that model their degree of abnormality. The proposed procedure constructs a tree using hierarchical clustering and then arranges the cells in a linear order using an optimal leaf ordering algorithm that maximizes the similarity of adjacent leaves, without requiring training examples or parameter adjustment. Performance evaluation on two data sets shows the effectiveness of the proposed approach on images with inconsistent staining, poor contrast, and overlapping cells. (C) 2012 Elsevier Ltd. All rights reserved.
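As an illustration of the first stage described in the abstract, the sketch below implements Otsu-style automatic thresholding, a standard way to separate cell regions from background by maximizing between-class intensity variance. This is a minimal, hypothetical example; the abstract does not specify which automatic thresholding method the paper actually uses.

```python
def otsu_threshold(pixels, levels=256):
    """Return the intensity threshold that maximizes between-class variance
    (Otsu's method). `pixels` is a flat iterable of integer intensities in
    [0, levels). Pixels with intensity > threshold are treated as foreground.

    Note: a generic sketch for illustration, not the paper's exact algorithm.
    """
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = sum(hist)
    sum_all = sum(i * hist[i] for i in range(levels))

    sum_b = 0.0        # weighted intensity sum of the background class
    w_b = 0            # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b               # background mean
        m_f = (sum_all - sum_b) / w_f   # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t


# Toy usage: a bimodal image with dark background and bright cell regions.
toy = [10] * 60 + [200] * 40
t = otsu_threshold(toy)
foreground = [p for p in toy if p > t]
```

For a clearly bimodal histogram like the toy input, the chosen threshold falls between the two modes, so the bright pixels are isolated as cell regions.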