QR Factorization Calculator

Free QR factorization calculator with step-by-step solutions. Decompose matrices using Gram-Schmidt, Householder, or Givens methods. See Q and R matrices instantly.

Calculator options: step-by-step decomposition · show approximate fractions where possible · reduced Q (m×n) for rectangular matrices

Example output (Householder Reflections, 3×3 input): A = Q × R, |det(R)| = 3, Rank = 3

Q (Orthogonal Matrix)

Full Q: 3×3 orthogonal matrix, QᵀQ = I

-0.1231   0.9045   0.4082
-0.4924   0.3015  -0.8165
-0.8616  -0.3015   0.4082

R (Upper Triangular)

Full R: 3×3 upper triangular, zeros below diagonal

-8.124   -9.601   -11.94
 0        0.9045    1.508
 0        0         0.4082

Verification

Numerical checks to confirm the decomposition is correct

QᵀQ ≈ I check

Frobenius norm of QᵀQ − I = 1.493e-10 ✓ Passed

QR ≈ A check

Relative error ∥QR − A∥ / ∥A∥ = 2.695e-10 ✓ Passed

Step-by-Step Solution

See how the decomposition is computed

1. Starting Householder QR decomposition.
2. Column 1: x = [1, 4, 7], ||x|| = 8.124038405, α = -8.124038405. Householder vector u = v / ||v|| with v = x − αe₁.
3. Column 2: x = [-0.08596557002, -0.9004397475], ||x|| = 0.9045340337, α = 0.9045340337. Householder vector u = v / ||v|| with v = x − αe₁.
4. Q and R computed via Householder reflections. Verification: QᵀQ ≈ I, QR ≈ A.
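These step values can be reproduced numerically. A minimal NumPy sketch (assuming the 3×3 input matrix A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]] from Example 1) applies the first reflection explicitly and recovers the column-2 sub-vector:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

# Column 1: reflect x onto the first coordinate axis
x = A[:, 0]
alpha = -np.sign(x[0]) * np.linalg.norm(x)   # -8.124038405...
v = x - alpha * np.eye(3)[0]                 # v = x - alpha*e1
u = v / np.linalg.norm(v)                    # Householder vector
H1 = np.eye(3) - 2.0 * np.outer(u, u)
A1 = H1 @ A                                  # zeros column 1 below the diagonal

# Column 2: the same procedure applies to the trailing sub-block
x2 = A1[1:, 1]
print(alpha)                  # -8.124038405...
print(x2)                     # [-0.08596557 -0.90043975]
print(np.linalg.norm(x2))     # 0.9045340337...
```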

What Is QR Factorization?

Decomposing a matrix into orthogonal and triangular factors

QR factorization (also called QR decomposition) writes a matrix A as the product A = QR, where Q is an orthogonal matrix (QᵀQ = I) and R is an upper triangular matrix. It is a cornerstone algorithm in numerical linear algebra.

The columns of Q form an orthonormal basis for the column space of A, meaning they are perpendicular unit vectors. This property makes QR ideal for solving least squares problems and computing eigenvalues stably.

Core Identity
A = Q × R
Q = orthogonal (QᵀQ = I)
R = upper triangular
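The core identity can be checked numerically in a few lines, sketched here with NumPy's built-in routine (which calls LAPACK's Householder-based factorization):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

Q, R = np.linalg.qr(A)   # A is square here, so Q is a full 3x3 orthogonal matrix

print(np.allclose(Q.T @ Q, np.eye(3)))   # Q is orthogonal: QᵀQ = I
print(np.allclose(Q @ R, A))             # the product recovers A
print(np.allclose(R, np.triu(R)))        # R is upper triangular
```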

Q (Orthogonal)

Orthonormal columns: qᵢ·qⱼ = 0 for i≠j, ||qᵢ|| = 1

R (Upper Triangular)

Non-zero only on and above diagonal

Why QR over LU? QR factorization is more numerically stable and handles rectangular matrices. For least squares problems, QR avoids the explicit formation of AᵀA (which squares the condition number), making it the method of choice.
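To illustrate the least squares use case, a small sketch (the four sample points below are made up for this example): solving Rx = Qᵀb by triangular solve gives the same coefficients as the normal equations, without ever forming AᵀA.

```python
import numpy as np

# Hypothetical overdetermined fit: y ≈ c0 + c1*t at four sample points
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.1, 2.9, 4.2])
A = np.column_stack([np.ones_like(t), t])   # 4x2 design matrix

Q, R = np.linalg.qr(A)             # economy QR: Q is 4x2, R is 2x2
x = np.linalg.solve(R, Q.T @ b)    # solve R x = Qᵀ b

# Same answer as the normal equations, but without squaring the condition number
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
print(np.allclose(x, x_normal))    # True
```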

QR Decomposition Methods Compared

Gram-Schmidt, Householder, and Givens: when to use each

Gram-Schmidt

qⱼ = aⱼ − Σ(qᵢ·aⱼ)qᵢ, then normalize

Orthonormalizes columns one at a time. Intuitive and matches textbook presentations, but can lose orthogonality for ill-conditioned matrices.

Householder

H = I − 2vvᵀ/vᵀv (reflection matrix)

Uses mirror-reflection matrices to zero entire columns. The standard production method: backward stable and the most numerically robust. ★ Recommended.

Givens

G = [[c, s], [−s, c]] (2D rotation)

Zeroes elements one at a time using rotation matrices. Best for sparse matrices and parallel processing. More operations but finer control.

Property      Gram-Schmidt       Householder              Givens
Q matrix      Column-by-column   Product of H matrices    Product of G matrices
Stability     Fragile            Backward stable ★        Stable
Sparsity      Not preserved      Not preserved            Well preserved
Cost (m≥n)    2mn²               2mn² − ⅔n³               3mn² − n³
Parallelism   Sequential         Good                     Excellent
Pedagogical   Excellent          Moderate                 Moderate

Worked Examples

Step-by-step calculations with numerical outputs

Example 1: 3×3 Matrix (All Methods)

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
Gram-Schmidt: Q has orthonormal columns, R is upper triangular
Householder: 2 reflections zero columns 1–2 below diagonal
Givens: 3 rotations (G₂₁, G₃₁, G₃₂) zero sub-diagonal entries
Output: QᵀQ ≈ I with error < 10⁻¹⁰, QR ≈ A with error < 10⁻¹⁰

Summary: 3×3 input matrix · |det(R)| = 3 · Rank = 3 (full rank) · QᵀQ error ≈ 10⁻¹⁵ (orthogonal)

Example 2: 4×2 Rectangular Matrix

A = [[1, 2], [3, 4], [5, 6], [7, 8]] (overdetermined system)
Full QR: Q is 4×4, R is 4×2 with zeros below row 2
Economy QR: Q is 4×2, R is 2×2; more efficient, same A = QR
Rank = 2 (linearly independent columns)
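Both factorizations can be compared directly via NumPy's mode argument (a sketch using the 4×2 matrix above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0],
              [7.0, 8.0]])

# Full QR: square Q, rectangular R with zeros below row 2
Qf, Rf = np.linalg.qr(A, mode='complete')
print(Qf.shape, Rf.shape)    # (4, 4) (4, 2)

# Economy (reduced) QR: compact factors, same product
Qe, Re = np.linalg.qr(A, mode='reduced')
print(Qe.shape, Re.shape)    # (4, 2) (2, 2)

print(np.allclose(Qf @ Rf, A) and np.allclose(Qe @ Re, A))   # True
print(np.linalg.matrix_rank(A))                              # 2
```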

Formulas & Algorithms

How each method computes Q and R

Classical Gram-Schmidt
rᵢⱼ = qᵢ · aⱼ (for i < j), vⱼ = aⱼ − Σᵢ₌₁ʲ⁻¹ rᵢⱼ·qᵢ
rⱼⱼ = ||vⱼ||, qⱼ = vⱼ / rⱼⱼ

Compute one column at a time: project out previous components, normalize. Intuitive but susceptible to loss of orthogonality.
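A direct transcription of these formulas into NumPy (a teaching sketch, not a production routine):

```python
import numpy as np

def classical_gram_schmidt(A):
    """QR via classical Gram-Schmidt: project out previous q's, then normalize."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]    # r_ij = q_i · a_j
            v -= R[i, j] * Q[:, i]         # v_j = a_j - sum r_ij q_i
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]              # q_j = v_j / r_jj
    return Q, R

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 10.0]])
Q, R = classical_gram_schmidt(A)
print(np.allclose(Q @ R, A))               # True
print(np.allclose(Q.T @ Q, np.eye(3)))     # True for this well-conditioned A
```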

Householder Reflections
v = x − αe₁ where α = −sign(x₁)·||x||
H = I − 2vvᵀ/(vᵀv), Q = H₁H₂...Hₙ

Each Householder matrix H reflects the current sub-column onto the first coordinate axis, zeroing everything below in one operation. Backward stable.
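A compact NumPy sketch of this procedure, applying each reflection implicitly rather than forming H as an explicit matrix product (the trick practical implementations use to stay at O(mn²) cost):

```python
import numpy as np

def householder_qr(A):
    """QR via Householder reflections; returns full Q (m x m) and R (m x n)."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(min(m, n)):
        x = R[k:, k]
        alpha = -np.copysign(np.linalg.norm(x), x[0])   # α = -sign(x1)·||x||
        v = x.copy()
        v[0] -= alpha                                   # v = x - α e1
        if np.linalg.norm(v) == 0:
            continue                                    # column already zeroed
        v /= np.linalg.norm(v)
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])     # apply H = I - 2vvᵀ to R
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)     # accumulate Q = H1 H2 ... Hn
    return Q, R

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 10.0]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))
```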

Givens Rotations
r = √(a² + b²), c = a/r, s = −b/r
G = [[c, −s], [s, c]], applied to rows (j, i)

Each Givens rotation zeros a single element using a 2D rotation. Builds Q incrementally. Preserves sparsity patterns and is easy to parallelize.
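A NumPy sketch of the same recipe, sweeping column by column (illustrative, not optimized for sparsity):

```python
import numpy as np

def givens_qr(A):
    """QR via Givens rotations: zero sub-diagonal entries one at a time."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for j in range(n):                    # column by column
        for i in range(j + 1, m):         # zero R[i, j] against pivot R[j, j]
            a, b = R[j, j], R[i, j]
            r = np.hypot(a, b)            # r = sqrt(a^2 + b^2), overflow-safe
            if r == 0:
                continue
            c, s = a / r, -b / r          # c = a/r, s = -b/r
            G = np.array([[c, -s], [s, c]])
            R[[j, i], :] = G @ R[[j, i], :]      # rotate rows j and i
            Q[:, [j, i]] = Q[:, [j, i]] @ G.T    # accumulate Q so Q @ R stays = A
    return Q, R

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 10.0]])
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0))
```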

Tips & Best Practices

When to use each method and how to avoid pitfalls

Which method should I choose?

Householder is the recommended default: it is numerically stable and fast. Use Gram-Schmidt when learning the concept (it matches textbook presentations). Use Givens when working with sparse matrices or implementing in parallel.

How to interpret the verification checks

The Frobenius norm ||QᵀQ − I|| should be near zero (≪10⁻⁸). The relative error ||QR − A|| / ||A|| should also be near zero. If both pass, your decomposition is correct regardless of sign conventions in Q columns.

Economy QR for rectangular matrices

When rows > columns, enable Economy QR to get a compact Q (same size as A) and square R. This is what most numerical libraries (LAPACK, NumPy, MATLAB) do by default.

When Gram-Schmidt fails

For nearly linearly dependent columns, Classical Gram-Schmidt can produce Q matrices that are not orthogonal. If the orthogonality check shows a large error, switch to Householder; it handles these cases reliably.
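This failure mode can be demonstrated with the Läuchli matrix, a standard ill-conditioned test case (not from this page): its columns are nearly parallel, and classical Gram-Schmidt loses orthogonality completely while Householder QR does not.

```python
import numpy as np

eps = 1e-8
# Läuchli matrix: three nearly linearly dependent columns
A = np.array([[1.0, 1.0, 1.0],
              [eps, 0.0, 0.0],
              [0.0, eps, 0.0],
              [0.0, 0.0, eps]])

def cgs(A):
    """Classical Gram-Schmidt (for demonstration only)."""
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        v = A[:, j] - Q[:, :j] @ (Q[:, :j].T @ A[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

Q_cgs = cgs(A)
Q_hh, _ = np.linalg.qr(A)   # LAPACK Householder QR

err_cgs = np.linalg.norm(Q_cgs.T @ Q_cgs - np.eye(3))
err_hh = np.linalg.norm(Q_hh.T @ Q_hh - np.eye(3))
print(err_cgs)   # ~0.7: orthogonality completely lost
print(err_hh)    # ~1e-16: fine
```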

Common Mistakes

Pitfalls to avoid when computing or interpreting QR factorization

Ignoring orthogonality drift

Classical Gram-Schmidt can produce Q that is far from orthogonal for ill-conditioned matrices. Always check QᵀQ ≈ I; if it fails, switch to Householder.

Assuming Q is unique

QR factorization is not unique. Columns of Q can have sign flips, and R diagonal signs compensate. Comparing two valid QR decompositions may show different numbers.
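If a canonical form is needed, the sign ambiguity can be removed by forcing the diagonal of R to be positive (a sketch assuming A has full column rank, so no diagonal entry of R is zero):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
Q, R = np.linalg.qr(A)

# Flip matching signs: D² = I, so Q D · D R = Q R = A still holds
D = np.diag(np.sign(np.diag(R)))    # assumes full rank: no zero on diag(R)
Q2, R2 = Q @ D, D @ R

print(np.all(np.diag(R2) > 0))      # canonical form: positive diagonal
print(np.allclose(Q2 @ R2, A))      # still a valid QR of A
print(np.allclose(Q2.T @ Q2, np.eye(2)))
```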

Confusing full vs economy QR

For an m×n rectangular matrix (m > n), full Q is m×m but only n columns are 'active.' Economy QR gives the compact m×n Q; most real-world applications use this.

Using QR for square linear systems instead of LU

QR costs ~2× more than LU for square matrices. If you just need to solve Ax = b for a well-conditioned square A, LU factorization is faster.

Applications of QR Factorization

Where QR decomposition is used in science and engineering

Least Squares

Solve min||Ax − b|| via R₁x = Q₁ᵀb, where Q₁, R₁ are the economy factors. No explicit AᵀA: avoids squaring the condition number.

QR Algorithm

Iterate Aₖ = QₖRₖ, Aₖ₊₁ = RₖQₖ to converge to eigenvalues. The foundation of LAPACK's eigenvalue routines.
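A minimal sketch of the unshifted iteration (practical implementations add shifts and deflation for speed): each step is a similarity transform, so eigenvalues are preserved while the iterates converge to triangular form.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric, eigenvalues 3 and 1

Ak = A.copy()
for _ in range(50):                      # unshifted QR iteration
    Q, R = np.linalg.qr(Ak)
    Ak = R @ Q                           # similar to A: same eigenvalues

print(np.round(np.diag(Ak), 6))          # [3. 1.]
```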

Linear Systems

For Ax = b, solve Rx = Qᵀb. More stable than LU for ill-conditioned systems at ~2× the cost.

ML & Statistics

Orthogonal initialization of neural network weights. QR decomposition for PCA and dimensionality reduction.



Last updated May 5, 2026