In mathematics, particularly linear algebra and numerical analysis, the Gram–Schmidt process (or Gram–Schmidt algorithm) is a way of finding a set of two or more vectors that are perpendicular to each other. By technical definition, it is a method of constructing an orthonormal basis from a set of vectors in an inner product space, most commonly the Euclidean space R^n with the standard inner product. The Gram–Schmidt process takes a finite, linearly independent set of vectors S = {v_1, ..., v_k} and generates an orthogonal set S' = {u_1, ..., u_k} that spans the same subspace as S. The method is named after Jørgen Pedersen Gram and Erhard Schmidt, but Pierre-Simon Laplace had been familiar with it before Gram and Schmidt.

The application of the Gram–Schmidt process to the column vectors of a full column rank matrix yields the QR decomposition: the matrix is decomposed into the product of an orthogonal matrix and a triangular matrix.

The process works by subtracting projections. Writing proj_u(v) = (⟨v, u⟩ / ⟨u, u⟩) u for the projection of a vector v onto a nonzero vector u, the orthogonal vectors are

u_1 = v_1,
u_2 = v_2 − proj_{u_1}(v_2),
u_3 = v_3 − proj_{u_1}(v_3) − proj_{u_2}(v_3),
⋮
u_k = v_k − Σ_{j=1}^{k−1} proj_{u_j}(v_k),

and the orthonormal vectors are e_i = u_i / ‖u_i‖. To check that these formulas yield an orthogonal sequence, first compute ⟨u_1, u_2⟩ by substituting the formula for u_2: the projection term removes exactly the component of v_2 along u_1, so the inner product is zero. The general case ⟨u_i, u_k⟩ = 0 for i < k then follows by induction on k.

The Gram–Schmidt process also applies to a linearly independent countably infinite sequence {v_i}_i. The result is an orthogonal (or orthonormal) sequence {u_i}_i such that for every natural number n, the algebraic span of {v_1, ..., v_n} is the same as the algebraic span of {u_1, ..., u_n}.

If the Gram–Schmidt process is applied to a linearly dependent sequence, it outputs the 0 vector on the i-th step whenever v_i is a linear combination of v_1, ..., v_{i−1}; discarding these zero vectors, the number of vectors output by the algorithm will then be the dimension of the space spanned by the original inputs.

A variant of the Gram–Schmidt process using transfinite recursion applies to a (possibly uncountably) infinite sequence of vectors. The number of output vectors is at most the number of inputs; this inequality can be strict even if the starting set was linearly independent, and the span of the output vectors need not equal the span of the inputs (it is, however, a subspace of the completion of that span).

Further, a parametrized version of the Gram–Schmidt process yields a (strong) deformation retraction of the general linear group GL(n, R) onto the orthogonal group O(n, R).

The Gram–Schmidt process can be stabilized by a small modification; this version is sometimes referred to as modified Gram–Schmidt or MGS. Instead of subtracting from v_k all of its projections at once, the modified process subtracts them one at a time, projecting against the intermediate result at every step: u_k^(1) = v_k − proj_{u_1}(v_k), then u_k^(2) = u_k^(1) − proj_{u_2}(u_k^(1)), and so on. Proceeding in this manner, we find the full set of orthogonal vectors u_1, ..., u_k. In exact arithmetic this yields the same vectors as the classical process, but in floating point arithmetic it accumulates far less rounding error. If orthonormal vectors are desired, then we normalize as we go, so that the denominators in the subtraction formulas turn into ones.

The cost of this algorithm is asymptotically O(nk^2) floating
point operations, where n is the dimensionality of the vectors and k is their number.

The result of the Gram–Schmidt process may also be expressed in a non-recursive formula using determinants. In that formula, each u_k is written as a "formal" determinant, i.e. the matrix contains both scalars and vectors; the meaning of this expression is defined to be the result of a cofactor expansion along the row of vectors. The determinant formula for Gram–Schmidt is computationally (exponentially) slower than the recursive algorithms described above; it is mainly of theoretical interest.

Other orthogonalization algorithms use Householder transformations or Givens rotations. The algorithms using Householder transformations are more stable than the stabilized Gram–Schmidt process. On the other hand, the Gram–Schmidt process produces the j-th orthogonalized vector after the j-th iteration, while orthogonalization using Householder reflections produces all the vectors only at the end.

Yet another alternative is motivated by the use of Cholesky decomposition for inverting the matrix of the normal equations in linear least squares. Let A be a matrix with full column rank, and compute the Cholesky decomposition A^T A = L L^T. Then the columns of the matrix Q = A (L^T)^{-1} are orthonormal and span the same subspace as the columns of the original matrix A. However, the explicit use of the product A^T A makes the algorithm unstable, especially if the product's condition number is large. Nevertheless, this algorithm is used in practice and implemented in some software packages because of its high efficiency and simplicity.

In quantum mechanics there are several orthogonalization schemes with characteristics better suited for certain applications than the original Gram–Schmidt process. Nevertheless, Gram–Schmidt remains a popular and effective algorithm for even the largest electronic structure calculations. Its run-time analysis is similar to that of Gaussian elimination.
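The classical formulas and their stabilized variant can be sketched in a few lines. The following is a minimal illustration, assuming NumPy; the function names and the sample vectors are illustrative, not from the article.

```python
# Sketch of classical vs. modified Gram-Schmidt on the columns of a matrix.
import numpy as np

def classical_gram_schmidt(V):
    """Orthonormalize the columns of V: each u_k subtracts from v_k all
    projections onto the previously computed vectors at once."""
    V = np.array(V, dtype=float)
    n, k = V.shape
    U = np.zeros((n, k))
    for j in range(k):
        u = V[:, j].copy()
        for i in range(j):
            # proj_{u_i}(v_j) = <v_j, u_i> u_i  (u_i is already unit-norm)
            u -= np.dot(V[:, j], U[:, i]) * U[:, i]
        U[:, j] = u / np.linalg.norm(u)
    return U

def modified_gram_schmidt(V):
    """Same vectors in exact arithmetic, but each projection is taken
    against the current, partially orthogonalized vector, which
    accumulates less rounding error in floating point."""
    U = np.array(V, dtype=float)
    n, k = U.shape
    for j in range(k):
        U[:, j] /= np.linalg.norm(U[:, j])
        for i in range(j + 1, k):
            U[:, i] -= np.dot(U[:, i], U[:, j]) * U[:, j]
    return U

# Two linearly independent, non-orthogonal columns (illustrative data).
V = np.array([[3.0, 2.0],
              [1.0, 2.0]])
Q = modified_gram_schmidt(V)
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns are orthonormal
```

Both functions cost O(nk^2) floating point operations; the difference between them only becomes visible in floating point arithmetic on ill-conditioned inputs, where the modified version loses orthogonality much more slowly.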
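The Cholesky-based alternative described above can likewise be sketched briefly. This is a minimal illustration assuming NumPy; the function name and sample matrix are made up for the example.

```python
# Orthogonalization via the Cholesky factor of the Gram matrix A^T A.
import numpy as np

def cholesky_orthogonalize(A):
    """Return Q = A (L^T)^{-1}, where A^T A = L L^T. The columns of Q
    are orthonormal and span the same subspace as the columns of A.
    Forming A^T A explicitly squares the condition number of A, which
    is the source of this method's instability."""
    G = A.T @ A                 # Gram matrix (normal-equations matrix)
    L = np.linalg.cholesky(G)   # G = L L^T, L lower triangular
    # Solve L Q^T = A^T for Q^T rather than inverting L^T explicitly.
    return np.linalg.solve(L, A.T).T

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])      # full column rank
Q = cholesky_orthogonalize(A)
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns are orthonormal
```

The appeal of this scheme is that it consists of one matrix product, one Cholesky factorization, and one triangular solve, all of which are highly efficient; the trade-off is the loss of accuracy when A is ill-conditioned.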
[Figure: the modified Gram–Schmidt process executed on three linearly independent, non-orthogonal vectors of a basis for R^3. The modification is explained in the Numerical Stability section of this article.]