Active self-semi-supervised learning for few labeled samples

Published in Neurocomputing, 2024

Key observations:

  1. Weight initialization alone struggles to transfer the valuable information learned during self-supervised training to the semi-supervised model.
  2. Label propagation on the pre-trained features constructs prior pseudo-labels, which serve as an effective intermediary for transferring information from self-supervised learning to semi-supervised models (see the sketch after this list).
  3. Active learning is adapted to improve the accuracy of these prior pseudo-labels.
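
The snippet below is a minimal sketch of this idea, not the paper's exact pipeline: it propagates the few known labels over self-supervised features to obtain prior pseudo-labels, then selects the least confident samples as active-learning queries. The variable names (`features`, `labels`) and the use of scikit-learn's `LabelSpreading` with entropy-based uncertainty sampling are illustrative assumptions.

```python
# Sketch: prior pseudo-labels via label propagation on self-supervised features,
# followed by an uncertainty-based active-learning query (illustrative only).
import numpy as np
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(0)

# Assume `features` are embeddings from a self-supervised encoder and `labels`
# hold the few known classes, with -1 marking unlabeled samples.
features = rng.normal(size=(500, 128))       # placeholder embeddings
labels = np.full(500, -1)                    # mostly unlabeled
labels[:10] = rng.integers(0, 5, size=10)    # a handful of labeled samples

# Graph-based label propagation on the pre-trained features yields prior
# pseudo-labels for every unlabeled sample.
propagator = LabelSpreading(kernel="knn", n_neighbors=10)
propagator.fit(features, labels)
pseudo_labels = propagator.transduction_     # prior pseudo-labels
probs = propagator.label_distributions_     # per-class confidence

# Active-learning step (plain uncertainty sampling as a stand-in): query the
# unlabeled samples whose propagated label distribution has the highest
# entropy, so annotation corrects the least reliable pseudo-labels.
entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
unlabeled = np.where(labels == -1)[0]
query_idx = unlabeled[np.argsort(entropy[unlabeled])[-20:]]
print("samples to annotate:", query_idx)
```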


Recommended citation: Ziting Wen, Oscar Pizarro, and Stefan Williams. "Active self-semi-supervised learning for few labeled samples." Neurocomputing (2024): 128772.
Download Paper