Knowledge Distillation and Random Erasing Data Augmentation for Text-Dependent Speaker Verification
This paper explores the Knowledge Distillation (KD) approach and a data augmentation technique to improve the generalization ability and robustness of text-dependent speaker verification (SV) systems. The KD method consists of two neural networks, known as the teacher and the student.
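For intuition, a minimal sketch of the two ideas the abstract names is given below: a teacher-student distillation loss and random erasing applied to spectrogram-like inputs. The temperature, loss weighting, feature shapes, and helper names are illustrative assumptions, not the paper's actual configuration.

    # Hypothetical sketch: KD loss between teacher and student networks, plus
    # random erasing on spectrogram-like features. All hyperparameters and
    # shapes below are assumptions for illustration only.
    import torch
    import torch.nn.functional as F
    from torchvision.transforms import RandomErasing

    def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft-target distillation term (teacher guides the student).
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Usual hard-label classification term.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Random erasing on a batch of (1, mel_bins, frames) feature tensors,
    # masking a random rectangular region of each example.
    erase = RandomErasing(p=0.5, scale=(0.02, 0.2))
    features = torch.randn(8, 1, 80, 200)
    augmented = torch.stack([erase(x) for x in features])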