Determining the left inverse of a non-square matrix

I understand that non-square matrices do not have an inverse, that is, they do not have both a left inverse and a right inverse. Yet, I am fairly certain that it is possible for a non-square matrix to have either a left inverse or (exclusively) a right inverse. Take the example where I want to determine the matrix $P$ for which $$P \left[ \begin{array}{cc} 1 & 1 \\ 1 & i \\ 0 & 1+i \end{array} \right] \left[ \begin{array}{c} \lambda_{1} \\ \lambda_{2} \end{array} \right] = \left[ \begin{array}{c} 1 \\ 0 \\ i \end{array} \right].$$ It is clear that $P$ must be a $3 \times 3$ matrix, since the column matrix on the right side is $3 \times 1$. How can I determine this matrix $P$, the left inverse of $\left[ \begin{array}{cc} 1 & 1 \\ 1 & i \\ 0 & 1+i \end{array} \right]$? The standard methods I know for inverting an $n \times n$ (square) matrix do not seem to work here. Thank you.

asked Feb 12, 2012 at 20:19 by Samuel Reid

Comment: You might look up the "Moore–Penrose pseudoinverse". (Feb 12, 2012 at 20:53)
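Following up on the pseudoinverse suggestion, here is a minimal numerical sketch (assuming NumPy is available) that computes the Moore–Penrose pseudoinverse of the $3\times 2$ matrix from the question and checks that it acts as a left inverse; in the framework of the answer below, this left inverse is a $2\times 3$ matrix.

```python
import numpy as np

# The 3x2 coefficient matrix from the question.
A = np.array([[1, 1],
              [1, 1j],
              [0, 1 + 1j]])

# Moore-Penrose pseudoinverse: since A has full column rank,
# P is a 2x3 matrix with P @ A = I_2.
P = np.linalg.pinv(A)

print(P.shape)                          # (2, 3)
print(np.allclose(P @ A, np.eye(2)))    # True
```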

1 Answer


Let $A$ be an $n\times m$ matrix over a field $\mathbb{F}$ (in your example, $\mathbb{F}=\mathbb{C}$). This matrix represents a linear transformation $L_A\colon\mathbb{F}^m\to\mathbb{F}^n$. A left inverse would correspond to a linear transformation $T\colon\mathbb{F}^n\to\mathbb{F}^m$ such that $T\circ L_A = \mathrm{id}_{\mathbb{F}^m}$. For the composition to be the identity, it is necessary that $L_A$ be one-to-one; in particular, we need $m\leq n$ and for $A$ to be of full rank.

A right inverse would correspond to a linear transformation $U\colon \mathbb{F}^n\to\mathbb{F}^m$ such that $L_A\circ U=\mathrm{id}_{\mathbb{F}^n}$. For the composition to be the identity, we need $L_A$ to be onto; in particular, we need $m\geq n$ and for $A$ to be of full rank.

In other words, a necessary condition for $A$ to have a one-sided inverse is that it be of full rank (that is, $\mathrm{rank}(A)=\min(n,m)$).
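As a quick numerical sanity check of this condition (a sketch assuming NumPy; the matrix is the one from the question), one can compare the rank against the two dimensions:

```python
import numpy as np

A = np.array([[1, 1],
              [1, 1j],
              [0, 1 + 1j]])
n, m = A.shape

r = np.linalg.matrix_rank(A)
print(r == min(n, m))                 # True: A has full rank
if r == min(n, m):
    if m <= n:
        print("full column rank: a left inverse exists")
    if n <= m:
        print("full row rank: a right inverse exists")
```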

In fact, the condition is sufficient as well:

Suppose first that $n\leq m$ and $\mathrm{rank}(A)=n$. Then $L_A$ is onto, so $A\mathbf{e}_1,\ldots,A\mathbf{e}_m$ span $\mathbb{F}^n$; so we pare the set down to a basis. If $i_1\lt\cdots\lt i_n$ are such that $A\mathbf{e}_{i_1},\ldots,A\mathbf{e}_{i_n}$ are a basis for $\mathbb{F}^n$, then define $U\colon\mathbb{F}^n\to \mathbb{F}^m$ by $U(A\mathbf{e}_{i_j}) = \mathbf{e}_{i_j}$ and extend linearly. Since the $A\mathbf{e}_{i_j}$ are a basis, this can be done and defines $U$ uniquely; clearly, $L_A\circ U=\mathrm{id}_{\mathbb{F}^n}$; computing the coordinate matrix of $U$ relative to the standard basis of $\mathbb{F}^n$ gives a right inverse for $A$.
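Here is a sketch of this column-selection construction in NumPy code (the function name `right_inverse` and the greedy way of picking independent columns are my own illustration, not part of the answer):

```python
import numpy as np

def right_inverse(A):
    """Right inverse for a full-row-rank n x m matrix A (n <= m),
    built by the column-selection argument above."""
    n, m = A.shape
    # Greedily pick indices i_1 < ... < i_n whose columns A e_{i_j}
    # form a basis of the codomain.
    idx = []
    for j in range(m):
        if len(idx) == n:
            break
        if np.linalg.matrix_rank(A[:, idx + [j]]) > len(idx):
            idx.append(j)
    S = A[:, idx]                        # invertible n x n submatrix
    U = np.zeros((m, n), dtype=complex)
    U[idx, :] = np.linalg.inv(S)         # U(A e_{i_j}) = e_{i_j}; other coordinates 0
    return U                             # satisfies A @ U = I_n

# Example: a full-row-rank 2 x 3 matrix.
B = np.array([[1, 0, 2], [0, 1, 1]], dtype=complex)
print(np.allclose(B @ right_inverse(B), np.eye(2)))   # True
```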

Next, suppose that $n\geq m$ and $\mathrm{rank}(A)=m$. Then $L_A$ is one-to-one, so $A\mathbf{e}_1,\ldots, A\mathbf{e}_m$ are linearly independent. Complete them to a basis for $\mathbb{F}^n$, say $A\mathbf{e}_1,\ldots,A\mathbf{e}_m,\mathbf{x}_{m+1},\ldots,\mathbf{x}_n$, and define $T\colon \mathbb{F}^n\to\mathbb{F}^m$ by $T(A\mathbf{e}_i)=\mathbf{e}_i$, and $T(\mathbf{x}_j)$ arbitrary (say, $T(\mathbf{x}_j)=\mathbf{0}$). Then $T\circ L_A=\mathrm{id}_{\mathbb{F}^m}$; computing the coordinate matrix of $T$ relative to the standard basis of $\mathbb{F}^n$ gives a left inverse for $A$.
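And here is a matching sketch of this basis-completion construction, applied to the matrix from the question (again, the function name `left_inverse`, the choice of completing with standard basis vectors, and sending the added vectors to $\mathbf{0}$ are illustrative conventions):

```python
import numpy as np

def left_inverse(A):
    """Left inverse for a full-column-rank n x m matrix A (n >= m),
    built by completing A e_1, ..., A e_m to a basis and sending the
    added vectors to 0, as above."""
    n, m = A.shape
    B = A.astype(complex)
    # Complete the columns of A to a basis using standard basis vectors.
    for j in range(n):
        if B.shape[1] == n:
            break
        e = np.zeros((n, 1), dtype=complex)
        e[j, 0] = 1
        if np.linalg.matrix_rank(np.hstack([B, e])) > B.shape[1]:
            B = np.hstack([B, e])
    # First m rows of B^{-1}: sends A e_i to e_i and the added vectors to 0.
    return np.linalg.inv(B)[:m, :]

A = np.array([[1, 1],
              [1, 1j],
              [0, 1 + 1j]])
T = left_inverse(A)
print(T.shape)                          # (2, 3)
print(np.allclose(T @ A, np.eye(2)))    # True
```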

Note, moreover, that if $n\neq m$, then there are many different one-sided inverses for $A$ (when $A$ has full rank); so one should not talk about "the" left (or right) inverse of $A$.
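To see this non-uniqueness concretely, here is a short sketch (assuming NumPy) that produces two different left inverses of the matrix from the question, both satisfying $PA=I_2$; the second is obtained by adding to each row a multiple of a vector $w$ with $wA=0$ (a vector from the left null space of $A$):

```python
import numpy as np

A = np.array([[1, 1],
              [1, 1j],
              [0, 1 + 1j]])

P0 = np.linalg.pinv(A)                 # one particular left inverse (2 x 3)

# Any row vector w with w @ A = 0 can be added to the rows of P0.
U, s, Vh = np.linalg.svd(A)
w = U[:, 2].conj()                     # spans the left null space of A
P1 = P0 + np.outer([1, 2], w)          # a different left inverse

print(np.allclose(P0 @ A, np.eye(2)))  # True
print(np.allclose(P1 @ A, np.eye(2)))  # True
print(np.allclose(P0, P1))             # False
```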