On Distributed Stochastic Gradient Descent For Nonconvex Functions In The Presence Of Byzantines
We consider the distributed stochastic optimization problem of minimizing a nonconvex function $f$ in an adversarial setting. All the $w$ worker nodes in the network are expected to send their stochastic gradient vectors to the fusion center (or server).
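The excerpt does not state which Byzantine-resilient aggregation rule the work uses, so the following is only an illustrative sketch of the setting it describes: $w$ workers send stochastic gradients to a fusion center, a few of them are Byzantine and report arbitrary values, and the server aggregates with a coordinate-wise median (a common robust choice, assumed here rather than taken from the paper) before taking a gradient step on a simple nonconvex $f$.

```python
import random

random.seed(0)

def stochastic_grad(x):
    # Noisy gradient of the nonconvex function f(x) = (x^2 - 1)^2,
    # standing in for a worker's stochastic gradient oracle.
    return 4 * x * (x * x - 1) + random.gauss(0, 0.1)

W, B = 10, 3          # total workers, of which B are Byzantine (assumed values)
x, lr = 0.5, 0.05     # initial iterate and step size

for _ in range(300):
    # Honest workers report stochastic gradients at the current iterate.
    grads = [stochastic_grad(x) for _ in range(W - B)]
    # Byzantine workers report an arbitrary (here, large constant) value.
    grads += [100.0] * B
    # Fusion center aggregates with the median, which ignores the
    # B = 3 outliers as long as B < W / 2.
    grads.sort()
    median = 0.5 * (grads[W // 2 - 1] + grads[W // 2])
    x -= lr * median

print(round(x, 2))  # settles near the local minimizer x = 1
```

With a plain average instead of the median, the three Byzantine reports of `100.0` would dominate every step and the iterate would diverge, which is why the aggregation rule is the crux of this setting.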