
Publication

Randout-KD: Finetuning Foundation Models for Text Classification via Random Noise and Knowledge Distillation

Pervaiz Iqbal Khan; Andreas Dengel; Sheraz Ahmed
In: International Conference on Agents and Artificial Intelligence (ICAART-2023), February 22–24, Lisbon, Portugal. SCITEPRESS, 2023.

Abstract

Finetuning foundation models effectively on downstream tasks remains an active area of research. In this paper, we present a finetuning method, “Randout-KD”, that enhances the performance of a student model for text classification. Specifically, we propose injecting random noise into the representations of the transformer model during finetuning, which acts as a regularizer. Moreover, we combine this noise injection with knowledge distillation and show that the combination boosts performance over the baseline model. We evaluate the proposed method on two datasets, CODA-19 and RHMD, using PubMedBERT and RoBERTa-Large as teacher models and data2vec as the student model. Results show that the proposed approach improves accuracy by up to 1.2% compared to the baseline methods.
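
The abstract describes two ingredients: noise injected into the student's representations during finetuning, and a knowledge-distillation loss from a teacher. The sketch below illustrates one plausible reading of that setup in PyTorch; the noise scale, temperature, mixing weight, and the exact injection point are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch, assuming Gaussian noise on student hidden states and a
# standard soft-label distillation objective. Hyperparameters (noise_std,
# temperature, alpha) are hypothetical, not taken from the paper.
import torch
import torch.nn.functional as F


def inject_noise(hidden, noise_std=0.1, training=True):
    """Add zero-mean Gaussian noise to hidden representations (regularization)."""
    if training:
        return hidden + torch.randn_like(hidden) * noise_std
    return hidden


def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with temperature-softened KL distillation."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale gradients to the original magnitude
    return alpha * ce + (1.0 - alpha) * kd


# Toy usage with random tensors standing in for model outputs.
batch, num_classes, hidden_dim = 4, 3, 16
hidden = torch.randn(batch, hidden_dim)
noisy_hidden = inject_noise(hidden)               # perturbed student features
student_logits = torch.randn(batch, num_classes)  # stand-in for student head output
teacher_logits = torch.randn(batch, num_classes)  # stand-in for frozen teacher output
labels = torch.randint(0, num_classes, (batch,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

In this reading, the noise term perturbs only the training-time forward pass (like dropout), while the distillation term pulls the student's softened predictions toward the teacher's; the paper should be consulted for the actual loss weighting and injection site.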