On Distributed Stochastic Gradient Descent For Nonconvex Functions In The Presence Of Byzantines

We consider the distributed stochastic optimization problem of minimizing a nonconvex function $f$ in an adversarial setting. All the $w$ worker nodes in the network are expected to send their stochastic gradient vectors to the fusion center (or server).
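The abstract describes the setting but not the aggregation rule used at the fusion center. As an illustration only, the following is a minimal sketch of one SGD step in this setting, assuming the server applies a coordinate-wise median, a standard Byzantine-robust aggregator; the function names (`robust_aggregate`, `distributed_sgd_step`), the toy nonconvex objective, and the Byzantine model (arbitrary large random vectors) are all assumptions of this sketch, not the paper's method.

```python
import numpy as np

def robust_aggregate(gradients):
    """Coordinate-wise median of the workers' gradient vectors.

    Unlike a plain mean, the median is not swayed by a minority of
    arbitrarily corrupted (Byzantine) vectors.
    """
    return np.median(np.stack(gradients), axis=0)

def distributed_sgd_step(x, grad_fn, w, num_byzantine, lr=0.05, rng=None):
    """One server step: collect w stochastic gradients, of which
    num_byzantine are adversarial, aggregate robustly, then descend."""
    rng = np.random.default_rng() if rng is None else rng
    grads = []
    for i in range(w):
        if i < num_byzantine:
            # Byzantine worker: sends an arbitrary large vector.
            grads.append(rng.normal(scale=100.0, size=x.shape))
        else:
            # Honest worker: true gradient plus zero-mean noise
            # (models a stochastic gradient).
            grads.append(grad_fn(x) + rng.normal(scale=0.1, size=x.shape))
    return x - lr * robust_aggregate(grads)

# Toy nonconvex objective f(x) = sum_i (x_i^2 + 0.1 sin(5 x_i)),
# with gradient 2x + 0.5 cos(5x) (an illustrative stand-in for f).
grad_f = lambda x: 2.0 * x + 0.5 * np.cos(5.0 * x)

x = np.full(3, 2.0)
rng = np.random.default_rng(0)
for _ in range(200):
    x = distributed_sgd_step(x, grad_f, w=10, num_byzantine=3, rng=rng)
# x settles near a stationary point of f even though 3 of the 10
# workers send corrupted gradients at every step.
```

With a plain mean in place of the median, the three Byzantine vectors (scale 100 versus honest scale of order 1) would dominate every update; the coordinate-wise median tolerates any minority of corrupted workers at each coordinate.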