Knowledge Distillation And Random Erasing Data Augmentation For Text-Dependent Speaker Verification

This paper explores the Knowledge Distillation (KD) approach and a data augmentation technique to improve the generalization ability and robustness of text-dependent speaker verification (SV) systems. The KD method consists of two neural networks, known as the teacher and the student networks.
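
The sketch below illustrates, in generic PyTorch, the two ideas named in the title: a temperature-scaled knowledge-distillation loss that trains a student network against a teacher's soft speaker-classification outputs, and a simple random-erasing step that zeroes a rectangular time-frequency patch of an input spectrogram. It is a minimal illustration under assumed shapes and hyperparameters (temperature, alpha, patch sizes), not the authors' exact recipe.

```python
# Minimal sketch of knowledge distillation and random erasing for
# speaker-verification training. All names and hyperparameters here
# (temperature, alpha, max_time, max_freq) are illustrative assumptions.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend soft-target KL loss (teacher -> student) with hard-label CE."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale the KL term by T^2 so its gradients stay comparable to the CE term.
    kd = F.kl_div(log_student, soft_targets,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce


def random_erase(spectrogram, max_freq=10, max_time=20):
    """Zero out a random rectangle in a (freq_bins, time_frames) spectrogram."""
    freq_bins, time_frames = spectrogram.shape
    f = torch.randint(1, max_freq + 1, (1,)).item()
    t = torch.randint(1, max_time + 1, (1,)).item()
    f0 = torch.randint(0, max(1, freq_bins - f), (1,)).item()
    t0 = torch.randint(0, max(1, time_frames - t), (1,)).item()
    erased = spectrogram.clone()
    erased[f0:f0 + f, t0:t0 + t] = 0.0
    return erased
```

In a training loop one would typically apply random_erase to each training spectrogram before the forward pass and use distillation_loss in place of plain cross-entropy, with the teacher network frozen.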