On Distributed Stochastic Gradient Algorithms For Global Optimization

The paper considers the problem of network-based computation of global minima in smooth nonconvex optimization problems. It is known that distributed gradient-descent-type algorithms can achieve convergence to the set of global minima by adding slowly decaying Gaussian noise to the gradient updates (annealing), which allows the iterates to escape local minima.
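As a rough illustration of the annealing idea described in the abstract, the sketch below runs a consensus-plus-noisy-gradient recursion on a small network. It is not the algorithm analyzed in the paper: the objective, the mixing matrix W, and the step-size and noise schedules are all illustrative assumptions.

```python
import numpy as np

def grad(x):
    # Hypothetical smooth nonconvex objective: f(x) = x^4 - 3x^2 + x.
    return 4 * x**3 - 6 * x + 1

n = 4  # number of agents
# Doubly stochastic mixing matrix for an assumed ring network.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=n)  # each agent starts from a random point

for k in range(1, 50001):
    alpha = 1.0 / k  # decaying step size (assumed schedule)
    # Slowly decaying noise scale (assumed schedule); the slow decay is
    # what lets iterates escape local minima before the noise vanishes.
    sigma = 1.0 / np.sqrt(np.log(k + 1))
    # Consensus step (average with neighbors), then a noisy gradient step.
    x = W @ x - alpha * grad(x) + alpha * sigma * rng.standard_normal(n)

print(x)  # agents should cluster near the global minimizer of f
```

With the seed fixed as above, all four agents end up near a common point close to the global minimizer of the assumed objective, illustrating how the consensus term drives agreement while the decaying noise drives global search.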