Automating stochastic gradient methods with adaptive batch sizes
- Speaker: Tom Goldstein (University of Maryland, College Park)
- Date & Time: Wednesday 06 September 2017, 09:50 - 10:40
- đ Venue: Seminar Room 1, Newton Institute
Abstract
This talk will address several issues related to training neural networks using stochastic gradient methods. First, we'll discuss the difficulties of training in a distributed environment, and present a new method called centralVR for boosting the scalability of training methods. Then, we'll turn to the issue of automating stochastic gradient descent, and show that learning rate selection can be simplified using “Big Batch” strategies that adaptively choose minibatch sizes.
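To make the "Big Batch" idea concrete, here is a minimal, hypothetical sketch of adaptive batch-size SGD on a toy least-squares problem. The growth rule used below (grow the batch when the sample variance of the minibatch gradients dominates the squared norm of their mean, i.e. when noise swamps signal) is an assumption for illustration, not necessarily the exact test from the talk; the problem setup, step size, and doubling schedule are likewise invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: least-squares loss f(w) = mean_i 0.5 * (x_i . w - y_i)^2
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def per_example_grads(w, idx):
    """Per-example gradients for the samples in idx (shape: len(idx) x d)."""
    residuals = X[idx] @ w - y[idx]
    return residuals[:, None] * X[idx]

w = np.zeros(d)
batch = 10   # initial minibatch size (illustrative choice)
lr = 0.05    # fixed step size (illustrative choice)

for step in range(200):
    idx = rng.choice(n, size=batch, replace=False)
    g = per_example_grads(w, idx)
    g_mean = g.mean(axis=0)
    # Sample variance of the minibatch gradients, summed over coordinates.
    g_var = g.var(axis=0, ddof=1).sum()
    # Hypothetical adaptive-batch test: if the variance of the gradient
    # estimate (~ g_var / batch) exceeds the squared signal, the current
    # minibatch is too noisy to trust, so double the batch size.
    if g_var / batch > g_mean @ g_mean and batch < n:
        batch = min(2 * batch, n)
    w -= lr * g_mean

loss = 0.5 * np.mean((X @ w - y) ** 2)
```

As the iterates approach a minimizer the mean gradient shrinks while the per-example noise does not, so the test fires and the batch grows, which is what lets a fixed learning rate keep working without a decay schedule.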
Series: This talk is part of the Isaac Newton Institute Seminar Series.
Included in Lists
- All CMS events
- bld31
- dh539
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences
- Seminar Room 1, Newton Institute