Gradient descent [in constraint satisfaction]

A standard method for finding a minimum of a smooth function f[x] is to use

FixedPoint[(# - a f'[#]) &, x0]
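As an illustrative sketch (in Python rather than the Mathematica above), the fixed-point iteration can be written out directly; the example function f(x) = (x - 3)^2, the step size a and the starting point are all assumptions chosen for illustration:

```python
# Fixed-point gradient descent sketch: iterate x -> x - a*f'(x)
# until successive iterates agree within a tolerance.
# Example function (an assumption): f(x) = (x - 3)^2, whose unique
# minimum is at x = 3, with derivative f'(x) = 2*(x - 3).

def fixed_point(step, x0, tol=1e-9, max_iter=10_000):
    """Iterate x -> step(x) until successive values agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = step(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

a = 0.1                      # step size (assumption)
df = lambda x: 2 * (x - 3)   # f'(x) for f(x) = (x - 3)^2
minimum = fixed_point(lambda x: x - a * df(x), x0=0.0)
print(minimum)  # close to 3
```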

If there are local minima, then which one is reached will depend on the starting point x0. It will not necessarily be the one closest to x0, because of potentially complicated overshooting effects associated with the step size a. Newton's method for finding zeros of f[x] is related and is given by
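To make the dependence on the starting point concrete, here is a small Python sketch; the double-well function f(x) = x^4 - 2x^2, which has local minima at x = -1 and x = +1, is an assumption chosen for illustration:

```python
# Gradient descent on the double well f(x) = x**4 - 2*x**2,
# which has two local minima, at x = -1 and x = +1.
# Its derivative is f'(x) = 4*x**3 - 4*x.

def descend(x, a=0.05, steps=500):
    """Run a fixed number of gradient-descent steps from x."""
    for _ in range(steps):
        x = x - a * (4 * x**3 - 4 * x)
    return x

# Starting points on opposite sides of the hump at x = 0
# flow to different minima.
print(descend(0.5))   # approaches +1
print(descend(-0.5))  # approaches -1
```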

FixedPoint[(# - f[#]/f'[#]) &, x0]
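The same fixed-point style carries over directly; a Python sketch of Newton's method, where the example function f(x) = x^2 - 2 (whose positive zero is the square root of 2) is an assumption chosen for illustration:

```python
# Newton's method as a fixed-point iteration: x -> x - f(x)/f'(x).
# Example (an assumption): f(x) = x**2 - 2, f'(x) = 2*x;
# the positive zero is sqrt(2).

def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Iterate Newton steps from x0 until successive values agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = x - f(x) / df(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)  # close to sqrt(2)
```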