Balancing Rates and Variance via Adaptive Batch-Sizes in First-Order Stochastic Optimization

Stochastic gradient descent is a canonical tool for addressing stochastic optimization problems, and forms the bedrock of modern machine learning and statistics. In this work, we seek to balance the fact that attenuating step-sizes are required for exact asymptotic convergence against the fact that constant step-sizes learn faster in finite time, albeit only up to an error neighborhood of the optimum.
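
To make the trade-off concrete, here is a minimal sketch of the general idea behind adaptive batch-sizes: keep the step-size constant (fast finite-time progress) while growing the mini-batch over iterations so that gradient variance attenuates instead. The problem setup, step-size, initial batch size, and geometric growth factor below are illustrative assumptions, not the schedule derived in the paper.

```python
# Illustrative sketch only: constant step-size SGD whose mini-batch size
# grows geometrically, so variance is attenuated by batching rather than
# by shrinking the step-size. All constants are assumed for demonstration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: minimize E[(a^T x - b)^2] / 2.
d, n = 10, 5000
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

x = np.zeros(d)
step = 0.05     # constant step-size (assumed value)
batch = 8.0     # initial mini-batch size (assumed value)
growth = 1.2    # geometric batch-size growth factor (assumed value)

for t in range(200):
    idx = rng.integers(0, n, size=int(batch))
    # Mini-batch stochastic gradient of the least-squares loss.
    grad = A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
    x -= step * grad
    batch = min(batch * growth, float(n))  # larger batches shrink variance

print("final parameter error:", np.linalg.norm(x - x_true))
```

With a fixed small batch, constant step-size SGD would stall at a noise floor proportional to the gradient variance; letting the batch grow drives that floor down over time without sacrificing the constant step-size's fast early progress.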