Gradient descent [in constraint satisfaction]
A standard method for finding a minimum in a smooth function f[x] is to use

FixedPoint[# - a f'[#] &, x0]
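For concreteness, here is a minimal sketch (not from the original text): it assumes the double-well example f[x_] := (x^2 - 1)^2, which has local minima at x = -1 and x = 1, an assumed step size a = 0.05, a tolerance-based SameTest to stop the iteration, and a purely illustrative helper name descend.

f[x_] := (x^2 - 1)^2  (* assumed example: double well with minima at x = -1 and x = 1 *)

(* gradient descent map with assumed step size a = 0.05; SameTest stops once successive iterates agree to within 10^-10 *)
descend[x0_] := FixedPoint[# - 0.05 f'[#] &, x0, SameTest -> (Abs[#1 - #2] < 10^-10 &)]

descend /@ {0.3, -0.3}  (* approximately {1., -1.}: different starting points reach different minima *)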
If there are local minima, then which one is reached will depend on the starting point x0. It will not necessarily be the one closest to x0 because of potentially complicated overshooting effects associated with the step size a. Newton's method for finding zeros of f[x] is related and is given by
FixedPoint[# - f[#]/f'[#] &, x0]
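As a similar sketch (again with an assumed example function and tolerance), this finds the positive zero of g[x_] := x^2 - 2; a SameTest tolerance is used because with machine numbers the iterates can end up alternating in the last bit rather than reaching an exact fixed point.

g[x_] := x^2 - 2  (* assumed example: zeros at x = ±Sqrt[2] *)

FixedPoint[# - g[#]/g'[#] &, 1.0, SameTest -> (Abs[#1 - #2] < 10^-10 &)]
(* 1.41421..., i.e. Sqrt[2] *)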