A Hybrid Text Normalization System Using Multi-Head Self-Attention For Mandarin

In this paper, we propose a hybrid text normalization system using multi-head self-attention. The system combines the advantages of a rule-based model and a neural model for text preprocessing tasks. Previous studies in Mandarin text normalization usually
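The abstract names multi-head self-attention as the neural component of the hybrid system. As an illustration only, here is a minimal NumPy sketch of the standard multi-head self-attention mechanism (scaled dot-product attention); the dimensions, function name, and weight matrices are hypothetical and not taken from the paper:

```python
import numpy as np

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Standard multi-head self-attention (illustrative sketch).

    x: (seq_len, d_model) input token representations
    w_q, w_k, w_v, w_o: (d_model, d_model) projection matrices
    num_heads: number of attention heads; must divide d_model
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project input and split into heads: (num_heads, seq_len, d_head)
    def project(w):
        return (x @ w).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(w_q), project(w_k), project(w_v)

    # Scaled dot-product attention, computed per head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ v  # (num_heads, seq_len, d_head)

    # Concatenate heads and apply the output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o
```

In a text normalization setting, such a module would typically sit inside an encoder that reads the raw token sequence before a classifier or decoder emits the normalized form; the hybrid design described in the abstract additionally routes inputs through rule-based preprocessing.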