| field | value | date |
|---|---|---|
| author | Stanislaw Halik <sthalik@misaki.pl> | 2018-11-12 06:42:35 +0100 |
| committer | Stanislaw Halik <sthalik@misaki.pl> | 2018-11-12 06:42:35 +0100 |
| commit | 407b6208604d2822b1067ac64949e78a9167572b (patch) | |
| tree | 8e4371deef2804e77e2fe6e17158be2536de28da /eigen/doc/TutorialLinearAlgebra.dox | |
| parent | ca2e0fcdcfff03747500344e2522ff330ccafa14 (diff) | |
eigen update
Diffstat (limited to 'eigen/doc/TutorialLinearAlgebra.dox')
-rw-r--r-- | eigen/doc/TutorialLinearAlgebra.dox | 28
1 file changed, 24 insertions(+), 4 deletions(-)
```diff
diff --git a/eigen/doc/TutorialLinearAlgebra.dox b/eigen/doc/TutorialLinearAlgebra.dox
index cb92cee..a727241 100644
--- a/eigen/doc/TutorialLinearAlgebra.dox
+++ b/eigen/doc/TutorialLinearAlgebra.dox
@@ -73,7 +73,7 @@ depending on your matrix and the trade-off you want to make:
         <td>ColPivHouseholderQR</td>
         <td>colPivHouseholderQr()</td>
         <td>None</td>
-        <td>++</td>
+        <td>+</td>
         <td>-</td>
         <td>+++</td>
     </tr>
@@ -86,6 +86,14 @@ depending on your matrix and the trade-off you want to make:
         <td>+++</td>
     </tr>
     <tr class="alt">
+        <td>CompleteOrthogonalDecomposition</td>
+        <td>completeOrthogonalDecomposition()</td>
+        <td>None</td>
+        <td>+</td>
+        <td>-</td>
+        <td>+++</td>
+    </tr>
+    <tr class="alt">
         <td>LLT</td>
         <td>llt()</td>
         <td>Positive definite</td>
@@ -102,14 +110,23 @@ depending on your matrix and the trade-off you want to make:
         <td>++</td>
     </tr>
     <tr class="alt">
+        <td>BDCSVD</td>
+        <td>bdcSvd()</td>
+        <td>None</td>
+        <td>-</td>
+        <td>-</td>
+        <td>+++</td>
+    </tr>
+    <tr class="alt">
         <td>JacobiSVD</td>
         <td>jacobiSvd()</td>
         <td>None</td>
-        <td>- -</td>
+        <td>-</td>
         <td>- - -</td>
         <td>+++</td>
     </tr>
 </table>
+To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
 
 All of these decompositions offer a solve() method that works as in the above example.
 
@@ -183,8 +200,11 @@ Here is an example:
 
 \section TutorialLinAlgLeastsquares Least squares solving
 
-The most accurate method to do least squares solving is with a SVD decomposition. Eigen provides one
-as the JacobiSVD class, and its solve() is doing least-squares solving.
+The most accurate method to do least squares solving is with a SVD decomposition.
+Eigen provides two implementations.
+The recommended one is the BDCSVD class, which scale well for large problems
+and automatically fall-back to the JacobiSVD class for smaller problems.
+For both classes, their solve() method is doing least-squares solving.
 
 Here is an example:
 <table class="example">
```
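For reference, all of the decompositions named in this patch expose the same `solve()` interface that the tutorial text describes. Below is a minimal, illustrative sketch (not part of the patch) of how the two newly documented classes, `BDCSVD` and `CompleteOrthogonalDecomposition`, are used for least-squares solving; it assumes Eigen 3.3 or later, and the matrix sizes and random data are arbitrary.

```cpp
// Minimal sketch (assumes Eigen >= 3.3); sizes and values are arbitrary.
#include <iostream>
#include <Eigen/Dense>

int main()
{
    // Overdetermined system: 4 equations, 2 unknowns, so we seek the
    // least-squares solution x minimizing ||Ax - b||.
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(4, 2);
    Eigen::VectorXd b = Eigen::VectorXd::Random(4);

    // BDCSVD: the SVD implementation the updated text recommends; it falls
    // back to JacobiSVD internally for small problems.
    Eigen::VectorXd x_svd =
        A.bdcSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);

    // CompleteOrthogonalDecomposition: the other decomposition added to the
    // table; its solve() also handles rank-deficient least-squares problems.
    Eigen::VectorXd x_cod = A.completeOrthogonalDecomposition().solve(b);

    std::cout << "BDCSVD solution:\n" << x_svd << "\n\n"
              << "CompleteOrthogonalDecomposition solution:\n" << x_cod << "\n";
}
```

For square, well-conditioned systems the QR- and Cholesky-based rows of the table are generally faster; the SVD-based solvers trade speed for accuracy, which is what the +/- ratings adjusted by this patch summarize.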