%A YANG Xiuyuan, PENG Tao, YANG Liang, LIN Hongfei %T Adaptive multi-domain sentiment analysis based on knowledge distillation %0 Journal Article %D 2021 %J Journal of Shandong University (Engineering Science) %R 10.6040/j.issn.1672-3961.0.2020.249 %P 15-21 %V 51 %N 3 %U http://gxbwk.njournal.sdu.edu.cn/CN/abstract/article_2033.shtml %8 2021-06-20 %X An adaptive multi-domain knowledge distillation framework was proposed, which effectively accelerated inference and reduced model parameters while preserving model performance. Knowledge distillation was applied to the sentiment analysis problem. When performing knowledge distillation for each specific domain, the distillation covered the word embedding layer, the encoding layer (attention distillation and hidden-state distillation), and the output prediction layer, so that the student model learned comprehensive knowledge from the domain-specific teacher model. A mechanism was further proposed to selectively learn the importance of each domain's teacher model to the input data, which improved the accuracy of the prediction results. Experimental results on multiple public datasets showed that single-domain knowledge distillation increased model accuracy by an average of 2.39%, and multi-domain knowledge distillation increased it by a further 0.5% on average. Compared with single-domain knowledge distillation, this framework enhanced the generalization ability of the student model and improved its performance.
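The abstract names four distillation targets (word embedding layer, attention, hidden states, output prediction layer) plus an adaptive weighting over several domain-specific teachers. Below is a minimal PyTorch sketch of what such a combined loss could look like; it is not the authors' code, and the dict layout, tensor shapes, and the relevance-scoring input are all illustrative assumptions.

import torch
import torch.nn.functional as F

def layer_distillation_loss(student, teacher, temperature=2.0):
    """Combine the four distillation terms for one (student, teacher) pair.

    `student` and `teacher` are assumed dicts of intermediate outputs:
      'emb'    : word-embedding layer output         (batch, seq, dim)
      'attn'   : attention matrices                  (batch, heads, seq, seq)
      'hidden' : encoding-layer hidden states        (batch, seq, dim)
      'logits' : output prediction layer logits      (batch, num_classes)
    Matching dimensions are assumed here; a real compressed student would
    typically need a learned linear projection up to the teacher's size.
    """
    loss = F.mse_loss(student['emb'], teacher['emb'])         # embedding layer
    loss = loss + F.mse_loss(student['attn'], teacher['attn'])      # attention
    loss = loss + F.mse_loss(student['hidden'], teacher['hidden'])  # hidden states
    # Prediction layer: KL divergence between temperature-softened distributions.
    loss = loss + F.kl_div(
        F.log_softmax(student['logits'] / temperature, dim=-1),
        F.softmax(teacher['logits'] / temperature, dim=-1),
        reduction='batchmean') * temperature ** 2
    return loss

def multi_domain_loss(student, teachers, relevance_logits, temperature=2.0):
    """Weight each domain teacher's loss by its relevance to the batch.

    `relevance_logits` (num_teachers,) stands in for the paper's
    selective-importance mechanism: it is an assumed score of how relevant
    each domain teacher is to the current input, normalized by softmax.
    """
    weights = F.softmax(relevance_logits, dim=0)
    total = torch.zeros(())
    for w, t in zip(weights, teachers):
        total = total + w * layer_distillation_loss(student, t, temperature)
    return total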