💻 Project Experience

SPENCER: Self-Adaptive Model Distillation for Efficient Code Retrieval

2023.2 - 2023.8

Supervisor: Prof. Cuiyun Gao (高翠芸)

  • Proposed a unified framework that combines the dual-encoder and cross-encoder of pre-trained models for code retrieval (a conceptual sketch of this two-stage pipeline follows this list).
  • Proposed a model distillation approach for the dual-encoder that greatly reduces its inference time while retaining most of its retrieval performance.
  • Proposed a novel assistant model selection method that adaptively chooses a suitable assistant model for different pre-trained models during distillation.
  • Reduced model parameters by at least 70% while preserving more than 98% of the retrieval accuracy.
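
Below is a minimal, illustrative Python sketch of the two-stage pipeline described in the first bullet: a (distillable) dual-encoder pre-ranks the whole corpus cheaply, and a cross-encoder re-ranks only the top-k shortlist. All function names and the stubbed encoders are hypothetical placeholders, not the actual SPENCER implementation.

```python
# Illustrative two-stage code-retrieval sketch (not the SPENCER code release):
# a dual-encoder pre-ranks candidates cheaply, then a cross-encoder
# re-ranks the shortlist. Encoder internals are stubbed with random vectors.
import numpy as np

DIM = 768  # hypothetical embedding size

def dual_encode(text: str) -> np.ndarray:
    """Stand-in for a (possibly distilled) dual-encoder embedding."""
    local = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = local.standard_normal(DIM)
    return vec / np.linalg.norm(vec)

def cross_score(query: str, code: str) -> float:
    """Stand-in for a cross-encoder relevance score over the joint pair."""
    local = np.random.default_rng(abs(hash(query + code)) % (2**32))
    return float(local.random())

def retrieve(query: str, corpus: list[str], k: int = 10) -> list[str]:
    q = dual_encode(query)
    # Stage 1: cheap dual-encoder similarity over the whole corpus.
    sims = [float(q @ dual_encode(c)) for c in corpus]
    top_k = sorted(range(len(corpus)), key=lambda i: sims[i], reverse=True)[:k]
    # Stage 2: expensive cross-encoder re-ranking of the shortlist only.
    reranked = sorted(top_k, key=lambda i: cross_score(query, corpus[i]), reverse=True)
    return [corpus[i] for i in reranked]

if __name__ == "__main__":
    corpus = [f"def snippet_{i}(): ..." for i in range(100)]
    print(retrieve("sort a list of integers", corpus, k=5))
```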