Orthogonal Matrix
An orthogonal matrix is a square matrix whose transpose is equal to its inverse. Let us recall what the transpose of a matrix is. If we write the rows of a matrix as columns (or the columns as rows), the resultant matrix is called the transpose of the matrix. We already know that if the transpose of a matrix is equal to the original matrix, then it is a symmetric matrix. But if the transpose of the matrix is equal to the inverse of the original matrix, then it is called an orthogonal matrix.
But why the name "orthogonal"? Let us look into the definition of the orthogonal matrix along with its properties, determinant, inverse, and a few solved examples.
What is an Orthogonal Matrix?
A square matrix A is orthogonal if and only if its transpose is the same as its inverse. i.e., A^{T} = A^{-1}, where A^{T} is the transpose of A and A^{-1} is the inverse of A. From this definition, we can derive another definition of an orthogonal matrix. Let us see how.
A^{T} = A^{-1}
Premultiply by A on both sides,
AA^{T} = AA^{-1}
We know that AA^{-1} = I, where I is an identity matrix (of the same order as A).
Thus, AA^{T} = I.
Similarly, we can prove A^{T}A = I.
From the above two equations, AA^{T} = A^{T}A = I.
Thus, there are two definitions of an orthogonal matrix which are mentioned below.
Orthogonal Matrix Definition
A square matrix 'A' is said to be "orthogonal" if
 A^{T} = A^{-1} (or)
 AA^{T} = A^{T}A = I
Orthogonal Matrix Example
Let us consider the matrix A = \(\left[\begin{array}{cc}\cos x & -\sin x \\ \\ \sin x & \cos x\end{array}\right]\). Its transpose is A^{T} = \(\left[\begin{array}{cc}\cos x & \sin x \\ \\ -\sin x & \cos x\end{array}\right]\). We will find the product of these two matrices.
\(\begin{align} &A A^{T}\\ &=\left[\begin{array}{cc}\cos x & -\sin x \\ \\ \sin x & \cos x\end{array}\right]\left[\begin{array}{cc}\cos x & \sin x \\ \\ -\sin x & \cos x\end{array}\right] \\ &=\left[\begin{array}{cc}\cos ^{2} x+\sin ^{2} x & \cos x \sin x-\sin x \cos x \\ \\ \sin x \cos x-\cos x \sin x & \sin ^{2} x+\cos ^{2} x\end{array}\right] \\ &=\left[\begin{array}{ll}1 & 0 \\ \\ 0 & 1\end{array}\right] \\ &=I \end{align}\)
Similarly, we can prove A^{T}A = I. Therefore, A is an example of an orthogonal matrix of order 2x2.
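This computation can also be verified numerically. The sketch below (with our own `matmul` and `transpose` helpers, not library calls) multiplies the rotation-type matrix by its transpose for an arbitrary angle and checks the result against the identity:

```python
# Verify numerically that R(x) = [[cos x, -sin x], [sin x, cos x]]
# satisfies R R^T = I for an arbitrary angle x.
import math

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    """Swap rows and columns."""
    return [list(col) for col in zip(*A)]

x = 0.7  # any angle works
R = [[math.cos(x), -math.sin(x)],
     [math.sin(x),  math.cos(x)]]

P = matmul(R, transpose(R))
# Every entry of R R^T should match the identity up to floating-point error.
assert all(abs(P[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
```

Changing `x` to any other value leaves the assertion true, since cos²x + sin²x = 1 for every x.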
Determinant of Orthogonal Matrix
The determinant of an orthogonal matrix is +1 or -1. Let us prove the same here. Consider an orthogonal matrix A. Then by the definition:
AA^{T} = I
Taking determinants on both sides,
det(AA^{T}) = det(I)
We know that the determinant of an identity matrix is 1. Also, for any two matrices A and B, det(AB) = det A · det B. So
det(A) · det(A^{T}) = 1
We know that det(A) = det(A^{T}). So
det(A) · det(A) = 1
[det(A)]^{2} = 1
det(A) = ±1.
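Both signs do occur. As an illustration (a sketch with our own `det2` helper, not a library function), a rotation matrix has determinant +1, while the reflection-type matrix obtained by negating its second row is also orthogonal and has determinant -1:

```python
# det = +1 for a rotation matrix, det = -1 for a reflection matrix;
# both matrices are orthogonal.
import math

def det2(A):
    """Determinant of a 2x2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

x = 1.2
rotation = [[math.cos(x), -math.sin(x)],
            [math.sin(x),  math.cos(x)]]
reflection = [[math.cos(x),  math.sin(x)],
              [math.sin(x), -math.cos(x)]]

assert abs(det2(rotation) - 1) < 1e-12    # det = +1
assert abs(det2(reflection) + 1) < 1e-12  # det = -1
```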
Inverse of Orthogonal Matrix
By the definition of an orthogonal matrix, for any orthogonal matrix A, A^{-1} = A^{T}. We can prove this by using the other definition as well, which is
AA^{T} = A^{T}A = I ... (1)
We know that two matrices A and B are inverses of each other if and only if
AB = BA = I ... (2)
From (1) and (2), it is clear that B = A^{T}. Since B is the inverse of A, B = A^{T} is equivalent to A^{-1} = A^{T}.
Thus, the inverse of an orthogonal matrix is nothing but its transpose.
Orthogonal Matrix in Linear Algebra
Why the name "orthogonal matrix" for it? Let us recall the meaning of "orthogonal" in linear algebra. "Orthogonal" means "perpendicular". Two vectors are said to be orthogonal to each other if and only if their dot product is zero. In an orthogonal matrix, every two distinct rows and every two distinct columns are orthogonal, and the length of every row (vector) and column (vector) is 1. Let us examine this through an example. Consider the orthogonal matrix A = \(\left(\begin{array}{ccc}1 / 3 & 2 / 3 & -2 / 3 \\ -2 / 3 & 2 / 3 & 1 / 3 \\ 2 / 3 & 1 / 3 & 2 / 3\end{array}\right)\) (check that AA^{T} = A^{T}A = I). We know that two vectors are orthogonal if their dot product is 0. Let us find the dot product of the first two rows.
(1/3, 2/3, -2/3) · (-2/3, 2/3, 1/3)
= -2/9 + 4/9 - 2/9
= 0
Thus, the first two rows are orthogonal. Similarly, you can check the dot product of every other pair of rows and every pair of columns. You will get each dot product to be zero.
Also, let us find the magnitude (length) of the first row.
√[(1/3)^{2}+(2/3)^{2}+(-2/3)^{2}] = √1 = 1
Similarly, you can find the length of every row/column to be 1. So an easy way of proving that a matrix is orthogonal is
 to prove that the dot product of every two distinct rows and every two distinct columns is 0, and
 to prove that the magnitude of every row and every column is 1.
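The two checks above can be scripted. Here is a minimal sketch (with our own `dot` helper), applied to the 3x3 matrix from this section written with its signs explicit:

```python
# Check that every pair of distinct rows/columns has dot product 0
# and that every row/column is a unit vector.
from math import isclose, sqrt

A = [[ 1/3,  2/3, -2/3],
     [-2/3,  2/3,  1/3],
     [ 2/3,  1/3,  2/3]]

def dot(u, v):
    """Dot product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

rows = A
cols = [list(c) for c in zip(*A)]  # columns of A

for vecs in (rows, cols):
    for i in range(3):
        # Every pair of distinct vectors should be orthogonal ...
        for j in range(i + 1, 3):
            assert isclose(dot(vecs[i], vecs[j]), 0, abs_tol=1e-12)
        # ... and every vector should have length 1.
        assert isclose(sqrt(dot(vecs[i], vecs[i])), 1, abs_tol=1e-12)
```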
Properties of Orthogonal Matrix
Here are the properties of an orthogonal matrix (A) based upon its definition.
 Transpose and inverse are equal. i.e., A^{-1} = A^{T}.
 The product of A and its transpose is an identity matrix. i.e., AA^{T} = A^{T}A = I.
 Determinant is det(A) = ±1. Thus, an orthogonal matrix is always nonsingular (as its determinant is NOT 0).
 A diagonal matrix whose diagonal elements are all 1 or -1 is always orthogonal.
Example: \(\left[\begin{array}{rrr}-1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1\end{array}\right]\) is orthogonal.
 A^{T} is also orthogonal. Since A^{-1} = A^{T}, A^{-1} is also orthogonal.
 The eigenvalues of A all have absolute value 1; any real eigenvalue of A is ±1, and eigenvectors corresponding to distinct eigenvalues are orthogonal.
 An identity matrix (I) is orthogonal as I · I^{T} = I^{T} · I = I.
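The eigenvalue property can be illustrated with the 2x2 rotation matrix: its eigenvalues are cos x ± i sin x, which are complex in general, yet each has absolute value exactly 1. A small sketch, solving the 2x2 characteristic polynomial directly:

```python
# Eigenvalues of a 2x2 matrix via the quadratic formula on its
# characteristic polynomial: lambda^2 - tr*lambda + det = 0.
import cmath
import math

x = 0.9
A = [[math.cos(x), -math.sin(x)],
     [math.sin(x),  math.cos(x)]]

tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = cmath.sqrt(tr * tr - 4 * det)  # complex square root of the discriminant
eigenvalues = [(tr + disc) / 2, (tr - disc) / 2]

# Both eigenvalues lie on the unit circle: |lambda| = 1.
for lam in eigenvalues:
    assert abs(abs(lam) - 1) < 1e-12
```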
Orthogonal Matrix Applications
Here are some uses/applications of orthogonal matrices.
 Orthogonal matrices are used in multichannel signal processing.
 An orthogonal matrix is used in multivariate time series analysis.
 They are used in many algorithms in linear algebra.
 They are used in QR decomposition.
Important Notes on Orthogonal Matrix:
 A square matrix is orthogonal if its inverse is equal to its transpose.
 If A is orthogonal, then A and A^{T} are inverses of each other.
 The determinant of an orthogonal matrix is either 1 or -1.
 The dot product of any two distinct rows/columns of an orthogonal matrix is always 0.
 Any row/column of an orthogonal matrix is a unit vector.
Examples of Orthogonal Matrix

Example 1: Prove that A = \(\dfrac{1}{7}\left[\begin{array}{ccc}3 & 2 & 6 \\ -6 & 3 & 2 \\ 2 & 6 & -3\end{array}\right]\) is an orthogonal matrix of order 3 and hence find its inverse.
Solution:
Let us find the transpose of the given matrix.
A^{T} = \(\dfrac{1}{7}\left[\begin{array}{ccc}3 & -6 & 2 \\ 2 & 3 & 6 \\ 6 & 2 & -3\end{array}\right]\).
Now, AA^{T} = \(\dfrac{1}{7}\left[\begin{array}{ccc}3 & 2 & 6 \\ -6 & 3 & 2 \\ 2 & 6 & -3\end{array}\right]\) · \(\dfrac{1}{7}\left[\begin{array}{ccc}3 & -6 & 2 \\ 2 & 3 & 6 \\ 6 & 2 & -3\end{array}\right]\)
= \(\dfrac{1}{49} \left[\begin{array}{ccc}49 & 0 & 0 \\ 0 & 49 & 0 \\ 0 & 0 & 49\end{array}\right]\)
= \(\left[\begin{array}{rrr}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{array}\right]\)
= I
In the same way, we can prove that A^{T} A = I.
Thus, A is orthogonal.
Then, A^{-1} = A^{T}, i.e.,
A^{-1} = \(\dfrac{1}{7}\left[\begin{array}{ccc}3 & -6 & 2 \\ 2 & 3 & 6 \\ 6 & 2 & -3\end{array}\right]\).
Answer: We proved A to be orthogonal of order 3x3 and A^{-1} = \(\dfrac{1}{7}\left[\begin{array}{ccc}3 & -6 & 2 \\ 2 & 3 & 6 \\ 6 & 2 & -3\end{array}\right]\).
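As a cross-check of this example (a sketch we add here), exact rational arithmetic with the standard-library `fractions` module avoids any rounding and confirms AA^{T} = I entry by entry:

```python
# Verify Example 1 exactly: A A^T = I for A = (1/7)[[3,2,6],[-6,3,2],[2,6,-3]].
from fractions import Fraction as F

A = [[F(3, 7),  F(2, 7), F(6, 7)],
     [F(-6, 7), F(3, 7), F(2, 7)],
     [F(2, 7),  F(6, 7), F(-3, 7)]]

AT = [list(col) for col in zip(*A)]  # transpose of A

# Compute A · A^T with exact rational arithmetic.
product = [[sum(A[i][k] * AT[k][j] for k in range(3)) for j in range(3)]
           for i in range(3)]

identity = [[F(1) if i == j else F(0) for j in range(3)] for i in range(3)]
assert product == identity  # so A is orthogonal and A^{-1} = A^{T}
```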

Example 2: Is every orthogonal matrix symmetric? Justify with an example.
Solution:
A matrix 'A' is said to be symmetric if A = A^{T}.
Let us consider the matrix A = \(\left(\begin{array}{ccc}1 / 3 & 2 / 3 & -2 / 3 \\ -2 / 3 & 2 / 3 & 1 / 3 \\ 2 / 3 & 1 / 3 & 2 / 3\end{array}\right)\), which is orthogonal. (You can verify that AA^{T} = A^{T}A = I.)
Now, A^{T} = \(\left(\begin{array}{ccc}1 / 3 & -2 / 3 & 2 / 3 \\ 2 / 3 & 2 / 3 & 1 / 3 \\ -2 / 3 & 1 / 3 & 2 / 3\end{array}\right)\).
Clearly, A ≠ A^{T}.
Answer: An orthogonal matrix is NOT always symmetric.

Example 3: If A = \(\left[\begin{array}{llll}0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0\end{array}\right]\) is orthogonal, find (AA^{T})^{-1}.
Solution:
Since A is an orthogonal matrix, AA^{T} = I.
Applying inverse on both sides,
(AA^{T})^{-1} = I^{-1}.
We know that the inverse of an identity matrix is itself. So I^{-1} = I. Substitute this in the above equation:
(AA^{T})^{-1} = I
Answer: (AA^{T})^{-1} = I, which is the identity matrix.
FAQs on Orthogonal Matrix
What is the Definition of Orthogonal Matrix?
A square matrix 'A' is said to be an orthogonal matrix if its inverse is equal to its transpose. i.e., A^{-1} = A^{T}. Alternatively, a matrix A is orthogonal if and only if AA^{T} = A^{T}A = I, where I is the identity matrix.
How to Find Orthogonal Matrix?
We can check whether a square matrix A is orthogonal in multiple ways.
 Check whether A^{-1} = A^{T} (OR)
 Check whether AA^{T} = A^{T}A = I (OR)
 Check whether every two distinct rows and every two distinct columns are perpendicular (by checking that the dot product is 0), and also check that the magnitude of each row and each column is 1.
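The second check above can be bundled into a single predicate. The sketch below implements it in pure Python; the name `is_orthogonal` is our own, not a standard library function:

```python
# Predicate implementing the check A A^T = I for a square matrix
# given as a list of rows.
def is_orthogonal(A, tol=1e-9):
    """Return True if A is square and A·A^T equals the identity within tol."""
    n = len(A)
    if any(len(row) != n for row in A):
        return False  # not a square matrix
    for i in range(n):
        for j in range(n):
            # (A A^T)[i][j] is the dot product of row i and row j.
            entry = sum(A[i][k] * A[j][k] for k in range(n))
            if abs(entry - (1 if i == j else 0)) > tol:
                return False
    return True

assert is_orthogonal([[0, 1], [1, 0]])      # a permutation matrix
assert not is_orthogonal([[1, 1], [0, 1]])  # a shear: not orthogonal
```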
Why the Name Orthogonal Matrix?
In an orthogonal matrix, every two distinct rows and every two distinct columns are orthogonal (i.e., their dot product is 0). Also, the magnitude of every row and every column is 1. i.e., the matrix is made up of orthonormal vectors. Hence the name "orthogonal matrix".
Is Every Diagonal Matrix an Orthogonal Matrix?
No, not every diagonal matrix is orthogonal. A diagonal matrix is orthogonal only if each of its principal diagonal elements is equal to either 1 or -1.
Is an Identity Matrix an Orthogonal Matrix?
Yes, an identity matrix is an orthogonal matrix as its inverse is equal to its transpose (or) the product of the identity matrix and its transpose is equal to the identity matrix.
Is an Orthogonal Matrix Never Symmetric?
No, that is not always true: the identity matrix, for example, is both orthogonal and symmetric. However, an orthogonal matrix need not be symmetric. For example, \(\left[\begin{array}{cc}\cos x & -\sin x \\ \\ \sin x & \cos x\end{array}\right]\) is orthogonal but NOT symmetric (whenever sin x ≠ 0).
What is the Orthogonal Matrix Determinant?
The determinant of an orthogonal matrix is either +1 or -1. For a detailed proof, you can see the "Determinant of Orthogonal Matrix" section of this page.
What is the Inverse of an Orthogonal Matrix?
By the definition of an orthogonal matrix, its inverse is equal to its transpose. i.e., for any orthogonal matrix A, A^{-1} = A^{T}.
Is an Orthogonal Matrix Always Non-Singular?
Yes, an orthogonal matrix is always non-singular as its determinant is equal to ±1. For more information, you can visit the "Determinant of Orthogonal Matrix" section of this page.