Learning Recurrent Neural Network Language Models With Context-Sensitive Label Smoothing For Automatic Speech Recognition


Recurrent neural network language models (RNNLMs) have become very successful in many natural language processing tasks. However, RNNLMs trained with a cross-entropy loss function and hard output targets are prone to overfitting, which weakens the language …
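The abstract contrasts hard output targets with label smoothing. As background for the paper's context-sensitive variant, the standard (uniform) label-smoothing loss can be sketched as follows; this is a minimal NumPy illustration of conventional label smoothing, not the paper's method, and the function name and `epsilon` value are illustrative assumptions.

```python
import numpy as np

def smoothed_cross_entropy(logits, target, epsilon=0.1):
    """Cross-entropy against a label-smoothed target distribution.

    Instead of a hard one-hot target, the true word gets probability
    (1 - epsilon) and the remaining epsilon mass is spread uniformly
    over the vocabulary. epsilon=0 recovers the usual hard-target loss.
    """
    vocab = logits.shape[-1]
    # Numerically stable softmax over the vocabulary.
    z = logits - logits.max()
    probs = np.exp(z) / np.exp(z).sum()
    # Smoothed target distribution.
    q = np.full(vocab, epsilon / vocab)
    q[target] += 1.0 - epsilon
    return float(-(q * np.log(probs)).sum())
```

A context-sensitive scheme, as the title suggests, would replace the uniform distribution `q` with one that depends on the preceding words, rather than spreading `epsilon` equally over all vocabulary items.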