
Terminology and Properties of Matrices

Before we proceed, let's get acquainted with some terminology.

(1) Zero matrix:

All elements of the matrix are zero.

(2) Row matrix:

Also called a row vector, such a matrix has only one row.

E.g. \([1\quad -1\quad 2\quad 4]\)

(3) Column matrix:

Also called a column vector, such a matrix has only one column.

E.g. \(\left[ \ \begin{matrix}   \ \ 1  \\   \ \ 3  \\   -1  \\ \end{matrix}\ \right]\)

(4) Diagonal matrix:

All non-diagonal elements are zero.

E.g. \(\left[ \ \begin{matrix}   1 & 0 & 0  \\   0 & -1 & 0  \\   0 & 0 & 2  \\ \end{matrix}\  \right]\)

(5) Upper/lower triangular matrix:

All elements below / above the diagonal are zero.

\(\begin{align}  & \text{E.g.}\,\,\,\left[ \ \begin{matrix}   1 & 2 & -1  \\   0 & 3 & \ \ 4  \\   0 & 0 & \ \ 1  \\
\end{matrix}\  \right]\left[ \ \begin{matrix}   1 & 0 & 0  \\   -2 & 4 & \ \ 0  \\   -1 & 7 & \ \ 1  \\\end{matrix}\  \right] \\& \qquad\ \ \text{upper}\ \qquad\qquad\ \ \text{lower} \\ \end{align}\)

(6) Singular matrix:

A square matrix whose determinant is zero.

(7) Transpose:

For a matrix A, the transpose of A, which we denote \({{A}^{T}}\), is obtained by interchanging the rows and columns of A. For example,

\(A=\left[ \ \begin{matrix}   2 & 4  \\   1 & 1  \\   3 & 7  \\\end{matrix}\  \right]\quad\Rightarrow\qquad {{A}^{T}}=\left[ \ \begin{matrix}   2 & 1 & 3  \\   4 & 1 & 7  \\\end{matrix}\  \right]\)

Note that \({{\left( {{A}^{T}} \right)}^{T}}=A\)
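The row-column interchange is easy to sketch in code. Below is a minimal Python illustration using plain nested lists (no matrix library); `transpose` is just an illustrative helper name, not a standard API.

```python
def transpose(A):
    """Return A^T: entry (i, j) of the result is entry (j, i) of A."""
    rows, cols = len(A), len(A[0])
    return [[A[j][i] for j in range(rows)] for i in range(cols)]

# The example from the text: a 3x2 matrix becomes 2x3.
A = [[2, 4],
     [1, 1],
     [3, 7]]
AT = transpose(A)
print(AT)                  # [[2, 1, 3], [4, 1, 7]]
print(transpose(AT) == A)  # True, illustrating (A^T)^T = A
```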

Example 24

For two matrices A and B whose product is defined, prove that \({{\left( AB \right)}^{T}}={{B}^{T}}{{A}^{T}}.\)

Solution: Let A be of order \(m\times n\) and B of order \(n\times p.\) Then AB is of order \(m\times p.\) Now \({{A}^{T}}\) is of order \(n\times m\ \text{and}\ {{B}^{T}}\) is of order \(p\times n.\) So \({{B}^{T}}{{A}^{T}}\) is of order \(p\times m\) which is the same as the order of \({{\left( AB \right)}^{T}}.\)

Let us understand this using a concrete example:

\[\begin{align}A&=\left[ \begin{matrix}  a & b \\  c & d \\\end{matrix} \right]\quad B=\left[ \ \begin{matrix}  p & q & r \\   x & y & z \\\end{matrix}\ \right]\\\\\Rightarrow AB&=\left[ \ \begin{matrix}  ap+bx & aq+by & ar+bz \\  cp+dx & cq+dy & cr+dz \\\end{matrix}\ \right]\\\\\Rightarrow {{\left( AB \right)}^{T}}&=\left[ \ \begin{matrix}  ap+bx & cp+dx \\  aq+by & cq+dy \\  ar+bz & cr+dz \\\end{matrix}\ \right]\end{align}\]

Also,

\[{{B}^{T}}{{A}^{T}}=\left[ \begin{matrix}   p & x  \\   q & y  \\   r & z  \\\end{matrix} \right]\ \left[ \begin{matrix}   a & c  \\   b & d  \\\end{matrix} \right]\ =\left[ \begin{matrix}   ap+bx & cp+dx  \\   aq+by & cq+dy  \\   ar+bz & cr+dz  \\\end{matrix} \right]\]

Focus on a particular term, say \(ap+bx,\) and note how it is generated at the same position in both \({{\left( AB \right)}^{T}}\ \text{and}\ {{B}^{T}}{{A}^{T}}.\)

Now let us generalize this.

Consider the term at the position \(\left( i,j \right)\) in \({{\left( AB \right)}^{T}},\) say \(t.\) In \(AB,\) this same term is at the position \(\left( j,i \right).\) So \(t\) is generated from \({{R}_{j}}\ \text{in}\ A\ \text{and}\ {{C}_{i}}\ \text{in}\ B\):

\[t=\sum\limits_{k=1}^{n}{{{a}_{jk}}{{b}_{ki}}}\]

Let us now consider the term at the position \(\left( i,j \right)\ \ \text{in}\ \ {{B}^{T}}{{A}^{T}}.\) This will be generated from \({{R}_{i}}\ \text{in}\ {{B}^{T}}\ \text{and}\ {{C}_{j}}\ \text{in}\ {{A}^{T}},\) that is, from \({{C}_{i}}\ \text{in}\ B\ \text{and}\ {{R}_{j}}\ \text{in}\ A,\) which will generate \(t\) again. Thus,

\[{{\left( AB \right)}^{T}}={{B}^{T}}{{A}^{T}}\]
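As a quick sanity check of this identity, the following sketch multiplies a concrete \(2\times 2\) and \(2\times 3\) pair and compares \({{\left( AB \right)}^{T}}\) with \({{B}^{T}}{{A}^{T}}\). The helper names `matmul` and `transpose` are illustrative, not a standard API.

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    """Entry (i, j) of AB is the dot product of row i of A and column j of B."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2],
     [3, 4]]
B = [[5, 6, 7],
     [8, 9, 10]]

lhs = transpose(matmul(A, B))              # (AB)^T, a 3x2 matrix
rhs = matmul(transpose(B), transpose(A))   # B^T A^T, also 3x2
print(lhs == rhs)   # True
```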

Example 25

A square matrix A is said to be symmetric if \(A={{A}^{T}},\) and skew-symmetric if \(A=-{{A}^{T}}.\) Show that every square matrix can be expressed as the sum of a symmetric and a skew-symmetric matrix.

Solution: If A is the given matrix, then A can be written as

\[A=\underbrace{\frac{1}{2}\left( A+{{A}^{T}} \right)}_{\text{Symmetric}}\ \ +\ \ \underbrace{\frac{1}{2}\left( A-{{A}^{T}} \right)}_{\text{Skew - Symmetric}}\]

\(A+{{A}^{T}}\) is symmetric because \({{\left( A+{{A}^{T}} \right)}^{T}}={{A}^{T}}+{{\left( {{A}^{T}} \right)}^{T}}={{A}^{T}}+A.\)

\(A-{{A}^{T}}\) is skew-symmetric because \({{\left( A-{{A}^{T}} \right)}^{T}}={{A}^{T}}-A=-\left( A-{{A}^{T}} \right)\)

For example, let

\[A=\left[ \ \begin{matrix}   1 & -1 & \ \ 0  \\   3 & \ 2 & \ \ 6  \\   2 & \ 4 & -3  \\\end{matrix}\  \right]\]

Then,

\[X=\frac{1}{2}\left( A+{{A}^{T}} \right)=\left[ \ \begin{matrix}   1 & 1 & \ \ 1  \\   1 & \ 2 & \ \ 5  \\   1 & \ 5 & -3  \\\end{matrix}\  \right]\qquad Y=\frac{1}{2}\left( A-{{A}^{T}} \right)=\left[ \ \begin{matrix}   0 & -2 & \ -1  \\   2 & \ 0 & \ \ \ \ 1  \\   1 & \ -1 & \ \ \ \ 0  \\\end{matrix}\  \right]\]

Note that \(X+Y=A\)
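The decomposition can be checked numerically. This sketch computes \(X\) and \(Y\) for the matrix above with exact rational arithmetic (`fractions.Fraction`) and verifies the three claimed properties; `combine` is a hypothetical helper, not a library function.

```python
from fractions import Fraction

def transpose(M):
    return [list(row) for row in zip(*M)]

def combine(A, B, s):
    """Return (A + s*B) / 2 entrywise, exactly."""
    return [[Fraction(A[i][j] + s * B[i][j], 2) for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, -1, 0],
     [3, 2, 6],
     [2, 4, -3]]
AT = transpose(A)
X = combine(A, AT, 1)    # symmetric part (A + A^T)/2
Y = combine(A, AT, -1)   # skew-symmetric part (A - A^T)/2

print(X == transpose(X))                                 # True: X = X^T
print(Y == [[-y for y in row] for row in transpose(Y)])  # True: Y = -Y^T
print([[x + y for x, y in zip(rx, ry)]
       for rx, ry in zip(X, Y)] == A)                    # True: X + Y = A
```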

Example 26

Prove that

(a) \(\widetilde{AB}=\widetilde{B}\widetilde{A}.\)              (b) \({{\left( {{A}^{T}} \right)}^{-1}}={{\left( {{A}^{-1}} \right)}^{T}}.\)

Solution: (a) The solution will involve evaluating \(AB\left( \widetilde{B}\widetilde{A} \right)\ \text{and}\ \widetilde{B}\widetilde{A}\left( AB \right):\)

\[\begin{align}AB\left( \widetilde{B}\widetilde{A} \right)\ &=\ A\left( B\widetilde{B} \right)\widetilde{A}\\&=A\left( \left| B \right|I \right)\widetilde{A}\\&=\left| B \right|\left( A\widetilde{A} \right)\\&=\left| B \right|\left| A \right|I=\left| AB \right|I=AB\left( \widetilde{AB} \right)\end{align}\]

Similarly,

\[\widetilde{B}\widetilde{A}\left( AB \right)=\widetilde{AB}\left( AB \right)\]

This must mean that \(\widetilde{AB}=\widetilde{B}\widetilde{A}.\)
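Taking \(\widetilde{A}\) to be the adjoint of \(A\) (as in the relation \(A\widetilde{A}=\left| A \right|I\) used above), the \(2\times 2\) case has a simple closed form: swap the diagonal entries and negate the off-diagonal ones. That makes part (a) easy to spot-check; the helper names below are illustrative only.

```python
def adj2(M):
    """Adjoint of a 2x2 matrix: adj([[a,b],[c,d]]) = [[d,-b],[-c,a]]."""
    (a, b), (c, d) = M
    return [[d, -b], [-c, a]]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]

# adj(AB) should equal adj(B) adj(A) -- note the reversed order.
print(adj2(matmul(A, B)) == matmul(adj2(B), adj2(A)))   # True
```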

(b) Since

\[\begin{align} &\qquad \qquad A{{A}^{-1}}={{A}^{-1}}A=I, \\ & \qquad \qquad{{\left( A{{A}^{-1}} \right)}^{T}}={{\left( {{A}^{-1}}A \right)}^{T}}=I \\ & \Rightarrow \qquad  {{\left( {{A}^{-1}} \right)}^{T}}{{A}^{T}}={{A}^{T}}{{\left( {{A}^{-1}} \right)}^{T}}=I \\ & \Rightarrow \qquad {{\left( {{A}^{T}} \right)}^{-1}}={{\left( {{A}^{-1}} \right)}^{T}} \\ \end{align}\]
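Part (b) can likewise be spot-checked for an invertible \(2\times 2\) matrix, using the closed-form inverse \({{A}^{-1}}=\frac{1}{\left| A \right|}\widetilde{A}\) and exact arithmetic via `fractions.Fraction`; `inv2` and `transpose` are illustrative helper names.

```python
from fractions import Fraction

def transpose(M):
    return [list(row) for row in zip(*M)]

def inv2(M):
    """Inverse of a 2x2 matrix via (1/det) * adjoint; det assumed nonzero."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

A = [[2, 1],
     [5, 3]]   # det = 1, so A is invertible

# (A^T)^{-1} should equal (A^{-1})^T.
print(inv2(transpose(A)) == transpose(inv2(A)))   # True
```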

TRY YOURSELF

A square matrix A is said to be orthogonal if  \({{A}^{T}}A=I.\) For example,

\[A=\frac{1}{3}\left[ \ \begin{matrix}   -1 & 2 & -2  \\   -2 & 1 & \ 2  \\   \ 2 & 2 & \ 1  \\\end{matrix}\  \right]\ \ \text{and}\ {{A}^{T}}=\frac{1}{3}\left[ \ \begin{matrix}   -1 & -2 & 2  \\   \ \ 2 & 1 & \ 2  \\   -2 & 2 & \ 1  \\\end{matrix}\  \right]\ .\]

Verify that \({{A}^{T}}A=I.\)
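The requested verification takes only a few lines. This sketch builds \(A\) with exact thirds (`fractions.Fraction`) and checks \({{A}^{T}}A=I\); helper names are illustrative.

```python
from fractions import Fraction

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# The orthogonal matrix from the text, with the 1/3 factor kept exact.
A = [[Fraction(x, 3) for x in row]
     for row in [[-1, 2, -2],
                 [-2, 1,  2],
                 [ 2, 2,  1]]]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(matmul(transpose(A), A) == I)   # True
```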

Q1.     Show that the transpose of an orthogonal matrix is also orthogonal.

Q2.     Show that every orthogonal matrix is non-singular.

Q3.     If A is orthogonal, show that \(\left| A \right|=\pm 1.\)

Q4.     Show that the product of two orthogonal matrices is also orthogonal.

Q5.     Show that the inverse of an orthogonal matrix is also orthogonal.
