author    | Stanislaw Halik <sthalik@misaki.pl> | 2017-03-25 14:17:07 +0100
committer | Stanislaw Halik <sthalik@misaki.pl> | 2017-03-25 14:17:07 +0100
commit    | 35f7829af10c61e33dd2e2a7a015058e11a11ea0 (patch)
tree      | 7135010dcf8fd0a49f3020d52112709bcb883bd6 /eigen/doc/LeastSquares.dox
parent    | 6e8724193e40a932faf9064b664b529e7301c578 (diff)
update
Diffstat (limited to 'eigen/doc/LeastSquares.dox')
-rw-r--r-- | eigen/doc/LeastSquares.dox | 70
1 file changed, 70 insertions, 0 deletions
diff --git a/eigen/doc/LeastSquares.dox b/eigen/doc/LeastSquares.dox
new file mode 100644
index 0000000..e2191a2
--- /dev/null
+++ b/eigen/doc/LeastSquares.dox
@@ -0,0 +1,70 @@
+namespace Eigen {
+
+/** \eigenManualPage LeastSquares Solving linear least squares systems
+
+This page describes how to solve linear least squares systems using %Eigen. An overdetermined system
+of equations, say \a Ax = \a b, generally has no solution. In this case, it makes sense to search for the
+vector \a x which is closest to being a solution, in the sense that the difference \a Ax - \a b is
+as small as possible. This \a x is called the least squares solution (if the Euclidean norm is used).
+
+The three methods discussed on this page are the SVD decomposition, the QR decomposition and the normal
+equations. Of these, the SVD decomposition is generally the most accurate but the slowest, the normal
+equations are the fastest but least accurate, and the QR decomposition is in between.
+
+\eigenAutoToc
+
+
+\section LeastSquaresSVD Using the SVD decomposition
+
+The \link JacobiSVD::solve() solve() \endlink method in the JacobiSVD class can be directly used to
+solve linear least squares systems. It is not enough to compute only the singular values (the default
+for this class); you also need the singular vectors, but the thin SVD decomposition suffices for
+computing least squares solutions:
+
+<table class="example">
+<tr><th>Example:</th><th>Output:</th></tr>
+<tr>
+  <td>\include TutorialLinAlgSVDSolve.cpp </td>
+  <td>\verbinclude TutorialLinAlgSVDSolve.out </td>
+</tr>
+</table>
+
+This is an example from the page \link TutorialLinearAlgebra Linear algebra and decompositions \endlink.
+
+
+\section LeastSquaresQR Using the QR decomposition
+
+The solve() method in the QR decomposition classes also computes the least squares solution. There are
+three QR decomposition classes: HouseholderQR (no pivoting, so fast but unstable),
+ColPivHouseholderQR (column pivoting, thus a bit slower but more accurate) and FullPivHouseholderQR
+(full pivoting, so slowest and most stable). Here is an example with column pivoting:
+
+<table class="example">
+<tr><th>Example:</th><th>Output:</th></tr>
+<tr>
+  <td>\include LeastSquaresQR.cpp </td>
+  <td>\verbinclude LeastSquaresQR.out </td>
+</tr>
+</table>
+
+
+\section LeastSquaresNormalEquations Using normal equations
+
+Finding the least squares solution of \a Ax = \a b is equivalent to solving the normal equations
+<i>A</i><sup>T</sup><i>Ax</i> = <i>A</i><sup>T</sup><i>b</i>. This leads to the following code:
+
+<table class="example">
+<tr><th>Example:</th><th>Output:</th></tr>
+<tr>
+  <td>\include LeastSquaresNormalEquations.cpp </td>
+  <td>\verbinclude LeastSquaresNormalEquations.out </td>
+</tr>
+</table>
+
+If the matrix \a A is ill-conditioned, then this is not a good method, because the condition number
+of <i>A</i><sup>T</sup><i>A</i> is the square of the condition number of \a A. This means that you
+lose twice as many digits using the normal equations as when using the other methods.
+
+*/
+
+}
\ No newline at end of file
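
For reference, the three approaches documented in the patch can be exercised side by side with a short standalone program. This is a minimal sketch, assuming Eigen 3.x with the dense module available; the matrix dimensions and variable names are illustrative only and are not part of the patch:

```cpp
// Minimal sketch of the three least-squares approaches from the patch.
// Assumes Eigen 3.x; sizes and names are illustrative only.
#include <iostream>
#include <Eigen/Dense>

int main()
{
    // An overdetermined system: more equations (rows) than unknowns (columns).
    Eigen::MatrixXf A = Eigen::MatrixXf::Random(4, 2);
    Eigen::VectorXf b = Eigen::VectorXf::Random(4);

    // 1. SVD: most accurate but slowest. The thin U and V suffice for solve().
    Eigen::VectorXf x_svd =
        A.jacobiSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);

    // 2. QR with column pivoting: between SVD and the normal equations
    //    in both speed and accuracy.
    Eigen::VectorXf x_qr = A.colPivHouseholderQr().solve(b);

    // 3. Normal equations: fastest, but forming A^T * A squares the
    //    condition number of A, so accuracy suffers on ill-conditioned
    //    problems.
    Eigen::VectorXf x_ne =
        (A.transpose() * A).ldlt().solve(A.transpose() * b);

    std::cout << "SVD solution:\n" << x_svd << "\n"
              << "QR solution:\n" << x_qr << "\n"
              << "Normal equations solution:\n" << x_ne << "\n";
}
```

On a well-conditioned random \a A all three solutions should agree to several digits; the spread between them grows with the condition number, which is exactly the accuracy trade-off the page describes.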