Minfeng Zhu
Knowledge Distillation
Self-Distillation Bridges Distribution Gap in Language Model Fine-Tuning
Feb 21, 2024
SHOT-VAE: Semi-supervised Deep Generative Models With Label-aware ELBO Approximations
Jul 1, 2021
KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation
Jul 1, 2021