In [[numerical optimization]], the '''nonlinear conjugate gradient method''' generalizes the [[conjugate gradient method]] to [[nonlinear optimization]]. For a quadratic function <math>\displaystyle f(x)</math>:
:: <math>\displaystyle f(x)=\|Ax-b\|^2</math>
The minimum of <math>f</math> is obtained when the [[gradient]] is 0:
:: <math>\nabla_x f=2 A^\top(Ax-b)=0</math>.
Whereas linear conjugate gradient seeks a solution to the linear equation
<math>\displaystyle A^\top Ax=A^\top b</math>, the nonlinear conjugate gradient method is generally
used to find the [[maxima and minima|local minimum]] of a nonlinear function
using its [[gradient]] <math>\nabla_x f</math> alone. It works when the function is approximately quadratic near the minimum, which is the case when the function is twice differentiable at the minimum and the second derivative is non-singular there.
 
Given a function <math>\displaystyle f(x)</math> of <math>N</math> variables to minimize, its gradient <math>\nabla_x f</math> indicates the direction of maximum increase.
One simply starts in the opposite ([[steepest descent]]) direction:
:: <math>\Delta x_0=-\nabla_x f (x_0) </math>
 
with an adjustable step length <math>\displaystyle \alpha</math> and performs a [[line search]] in this direction until it reaches the minimum of <math>\displaystyle f</math>:
:: <math>\displaystyle \alpha_0:= \arg \min_\alpha f(x_0+\alpha \Delta x_0)</math>,
:: <math>\displaystyle x_1=x_0+\alpha_0 \Delta x_0</math>
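
This first steepest-descent step and its line search can be illustrated with a short Python sketch. It is only an illustration under stated assumptions: the objective <code>f</code>, the matrix <code>A</code>, the vector <code>b</code>, and the finite-difference <code>grad</code> helper are chosen here for the example, and the one-dimensional search is delegated to <code>scipy.optimize.minimize_scalar</code> rather than a purpose-built line search.
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize_scalar

def grad(f, x, eps=1e-8):
    """Forward-difference approximation of the gradient of f at x."""
    g = np.empty_like(x)
    fx = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eps
        g[i] = (f(xp) - fx) / eps
    return g

# Illustrative quadratic objective f(x) = ||Ax - b||^2 (A and b chosen arbitrarily)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: np.sum((A @ x - b) ** 2)

x0 = np.zeros(2)
dx0 = -grad(f, x0)                                     # steepest-descent direction
alpha0 = minimize_scalar(lambda a: f(x0 + a * dx0)).x  # line search for alpha_0
x1 = x0 + alpha0 * dx0                                 # first position update
</syntaxhighlight>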
 
After this first iteration in the steepest direction <math>\displaystyle \Delta x_0</math>, the following steps constitute one iteration of moving along a subsequent conjugate direction <math>\displaystyle s_n</math>, where <math>\displaystyle s_0=\Delta x_0</math>:
# Calculate the steepest direction: <math>\Delta x_n=-\nabla_x f (x_n) </math>,
# Compute <math>\displaystyle \beta_n</math> according to one of the formulas below,
# Update the conjugate direction: <math>\displaystyle s_n=\Delta x_n+\beta_n s_{n-1}</math>
# Perform a line search: optimize <math>\displaystyle \alpha_n=\arg \min_{\alpha} f(x_n+\alpha s_n)</math>,
# Update the position: <math>\displaystyle x_{n+1}=x_{n}+\alpha_{n} s_{n}</math>.
With a pure quadratic function the minimum is reached within <math>N</math> iterations (excepting roundoff error), but a non-quadratic function makes slower progress. Subsequent search directions lose conjugacy, so the search direction should be reset to the steepest descent direction at least every <math>N</math> iterations, or sooner if progress stalls. Resetting on every iteration, however, turns the method into [[steepest descent]]. The algorithm stops when it finds the minimum, determined either when no progress is made after a direction reset (i.e. in the steepest descent direction) or when some tolerance criterion is reached.
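
The iteration above can be put into a minimal Python sketch. This is not a robust implementation: the caller-supplied gradient <code>grad_f</code>, the tolerance and iteration limit, and the use of <code>scipy.optimize.minimize_scalar</code> for the line search are assumptions made for illustration, and <math>\beta</math> is computed with the Polak–Ribière formula clipped at zero (one of the choices discussed below).
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize_scalar

def nonlinear_cg(f, grad_f, x0, tol=1e-8, max_iter=200):
    """Sketch of nonlinear conjugate gradient (Polak-Ribiere beta with reset)."""
    x = np.asarray(x0, dtype=float)
    dx = -grad_f(x)        # steepest-descent direction at the starting point
    s = dx.copy()          # first conjugate direction: s_0 = dx_0
    for n in range(max_iter):
        # line search along the current conjugate direction
        alpha = minimize_scalar(lambda a: f(x + a * s)).x
        # position update
        x = x + alpha * s
        # new steepest-descent direction
        dx_new = -grad_f(x)
        if np.linalg.norm(dx_new) < tol:
            break
        # Polak-Ribiere beta, clipped at zero so that a poor direction
        # falls back to steepest descent (an automatic reset)
        beta = max(0.0, dx_new @ (dx_new - dx) / (dx @ dx))
        # conjugate-direction update
        s = dx_new + beta * s
        dx = dx_new
    return x
</syntaxhighlight>
With the quadratic example from the earlier snippet, <code>nonlinear_cg(f, lambda x: 2 * A.T @ (A @ x - b), np.zeros(2))</code> recovers the least-squares solution after a handful of iterations.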
 
Within a linear approximation, the parameters <math>\displaystyle \alpha</math> and <math>\displaystyle \beta</math> are the same as in the
linear conjugate gradient method, but here they are obtained with line searches.
The conjugate gradient method can follow narrow ([[ill-conditioned]]) valleys where the [[steepest descent]] method slows down and follows a criss-cross pattern.
 
Three of the best-known formulas for <math>\displaystyle \beta_n</math> are named Fletcher–Reeves (FR), Polak–Ribière (PR), and Hestenes–Stiefel (HS) after their developers:
* Fletcher–Reeves:
:: <math>\beta_{n}^{FR} = \frac{\Delta x_n^\top \Delta x_n}
{\Delta x_{n-1}^\top \Delta x_{n-1}}
</math>
* Polak–Ribière:
:: <math>\beta_{n}^{PR} = \frac{\Delta x_n^\top (\Delta x_n-\Delta x_{n-1})}
{\Delta x_{n-1}^\top \Delta x_{n-1}}
</math>
* Hestenes–Stiefel:
:: <math>\beta_n^{HS} = -\frac{\Delta x_n^\top (\Delta x_n-\Delta x_{n-1})}
{s_{n-1}^\top (\Delta x_n-\Delta x_{n-1})}
</math>.
 
These formulas are equivalent for a quadratic function, but for nonlinear optimization the preferred formula is a matter of heuristics or taste. A popular choice is <math>\displaystyle \beta=\max\{0,\,\beta^{PR}\}</math>, which provides a direction reset automatically.
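
For reference, the three formulas (and the reset variant) translate directly into code. In this sketch <code>dx</code> and <code>dx_prev</code> stand for <math>\Delta x_n</math> and <math>\Delta x_{n-1}</math>, and <code>s_prev</code> for <math>s_{n-1}</math>; the function names are illustrative, not standard library calls.
<syntaxhighlight lang="python">
import numpy as np

def beta_fr(dx, dx_prev):
    """Fletcher-Reeves."""
    return (dx @ dx) / (dx_prev @ dx_prev)

def beta_pr(dx, dx_prev):
    """Polak-Ribiere."""
    return dx @ (dx - dx_prev) / (dx_prev @ dx_prev)

def beta_hs(dx, dx_prev, s_prev):
    """Hestenes-Stiefel."""
    y = dx - dx_prev
    return -(dx @ y) / (s_prev @ y)

def beta_pr_plus(dx, dx_prev):
    """max{0, beta_PR}: the automatic-reset variant mentioned above."""
    return max(0.0, beta_pr(dx, dx_prev))
</syntaxhighlight>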
 
Newton-based methods such as the [[Newton-Raphson Algorithm]] and [[Quasi-Newton methods]] (e.g., the [[BFGS method]]) tend to converge in fewer iterations, although each iteration typically requires more computation than a conjugate gradient iteration: Newton's method needs the [[Hessian]] (matrix of second derivatives) in addition to the gradient, and quasi-Newton methods build up an approximation to it from gradient information. Quasi-Newton methods also require more memory to operate (see also the limited-memory [[L-BFGS]] method).
 
==External links==
* [http://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf An Introduction to the Conjugate Gradient Method Without the Agonizing Pain] by Jonathan Richard Shewchuk.
* [http://www.nrbook.com/a/bookcpdf.php Numerical Recipes in C - The Art of Scientific Computing], chapter 10, section 6: Conjugate Gradient Methods in Multidimensions; William H. Press, Saul A. Teukolsky, William T. Vetterling, and Brian P. Flannery; Cambridge University Press, 2nd edition (1992).
 
[[Category:Optimization algorithms and methods]]
[[Category:Gradient methods]]
 
[[ru:Метод сопряжённых градиентов]]
 
{{Optimization algorithms}}
