By Peter Harrington
Gradient Ascent uses the whole dataset on each update. That is fine with 100 examples, but with billions of data points containing thousands of features it is unnecessarily expensive computationally. This article, based on chapter 5 of Machine Learning in Action, discusses Stochastic Gradient Ascent and modifications that will yield better results.
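To illustrate the difference, here is a minimal sketch of Stochastic Gradient Ascent for logistic regression, in the spirit of the chapter's Python code: each weight update uses a single example rather than the whole dataset. The function name, parameters, and toy data below are illustrative, not taken from the book.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_grad_ascent(data, labels, alpha=0.01, num_passes=1):
    """Fit logistic-regression weights one example at a time.

    Unlike batch Gradient Ascent, which computes the gradient over all
    m examples before each update, this performs m cheap updates per pass.
    """
    m, n = data.shape
    weights = np.ones(n)
    for _ in range(num_passes):
        for i in range(m):
            # Prediction and error for a single example
            h = sigmoid(np.dot(data[i], weights))
            error = labels[i] - h
            # Gradient step uses only this one example
            weights = weights + alpha * error * data[i]
    return weights
```

Because each update touches only one example, the cost per step is independent of the dataset size, which is what makes the method practical at scale.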

Stochastic Gradient Ascent (PDF)