Demonstration of the Gradient Descent Algorithm
The animation package has a function named grad.desc(). It provides a visual illustration of minimizing a real-valued function with the gradient descent algorithm. The two examples below show how to use grad.desc().
A simple function
The default objective function in grad.desc() is $f(x, y) = x^2 + 2y^2$. The arrows will take you to the minimum step by step:
library(animation)
par(mar = c(4, 4, 2, 0.1))
grad.desc()
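The iteration being animated is the standard gradient descent update $x_{k+1} = x_k - \gamma \nabla f(x_k)$, where $\gamma$ is the step length. Below is a minimal sketch of that update on the default objective, written in plain R and independent of the package; the starting point and step length here are illustrative choices, not necessarily the package's defaults:
f  <- function(v) v[1]^2 + 2 * v[2]^2    # the default objective f(x, y) = x^2 + 2 y^2
gr <- function(v) c(2 * v[1], 4 * v[2])  # its analytic gradient
v <- c(-3, 3)                            # illustrative starting point
gamma <- 0.05                            # illustrative step length
for (i in 1:50) {
  v <- v - gamma * gr(v)                 # x_{k+1} = x_k - gamma * grad f(x_k)
}
v                                        # ends up close to the minimum at (0, 0)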
When the algorithm fails
This example shows how the gradient descent algorithm can fail when the step length is too large. We try to find a local minimum of a bivariate objective function:
ani.options(nmax = 70)  # allow up to 70 iterations
par(mar = c(4, 4, 2, 0.1))
f2 = function(x, y) sin(1/2 * x^2 - 1/4 * y^2 + 3) * cos(2 * x + 1 - exp(y))
## arguments: objective, plotting range, starting point, step length, tolerance
grad.desc(f2, c(-2, -2, 2, 2), c(-1, 0.5), gamma = 0.3, tol = 1e-04)
## Warning: Maximum number of iterations reached!
As you can see, the arrows eventually get lost. You can replace gamma = 0.3 with a smaller value and retry the function, as shown below.
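For example, with a smaller step length (the value 0.1 is only an illustrative choice; any sufficiently small value should behave similarly):
ani.options(nmax = 70)
grad.desc(f2, c(-2, -2, 2, 2), c(-1, 0.5), gamma = 0.1, tol = 1e-04)  # smaller step length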
References
This article was reproduced from vistat.
Published 24 March 2013