An [[Orthogonal Matrix]] is a type of [[Matrix|Square Matrix]] whose corresponding [[Linear Transformation]] preserves angles and lengths (it does not scale vectors).
A [[Matrix]] is [[Orthogonal]] [[Bijective|if and only if]] its [[Inverse Matrices|Inverse]] equals its [[Matrix Transpose|Transpose]].
$\huge A^{-1} = A^{T} $
>[!info]- Proof
>Assume the columns $u_{1},\dots,u_{n}$ of $A$ are [[Orthonormal Basis|orthonormal]], i.e. $u_{i}^{\intercal}u_{j}=0$ for $i\neq j$ and $\lVert u_{i}\rVert = 1$.
>$\begin{align}
>A &= \begin{bmatrix} u_{1}&u_{2}&\cdots& u_{n}\end{bmatrix} \\
>A^{\intercal} &= \begin{bmatrix} u_{1}^{\intercal}\\u_{2}^{\intercal}\\ \vdots\\ u_{n}^{\intercal}\end{bmatrix} \\
>A^{\intercal}A &=\begin{bmatrix}
> u_{1}^{\intercal}u_{1} & u_{1}^{\intercal}u_{2} & \cdots \\
> u_{2}^{\intercal}u_{1} & u_{2}^{\intercal}u_{2} & \cdots \\
>\vdots & \vdots & \ddots
>\end{bmatrix} \\
>&= \begin{bmatrix}
>\lVert u_{1} \rVert^{2} & 0 & \cdots \\
>0 & \lVert u_{2}\rVert^{2} & \cdots \\
>\vdots & \vdots & \ddots
>\end{bmatrix} \\
>&= I
>\end{align}$
>Hence $A^{\intercal} = A^{-1}$.

>[!tip]
>An equivalent definition: $A$ is an [[Orthogonal Matrix]] when its [[Vector|Column Vectors]] $A_{1},A_{2},\dots$ form an [[Orthonormal Basis]].
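This can be checked numerically. The sketch below (the $4\times 4$ size and the random seed are arbitrary choices, not part of the note) builds an orthogonal matrix from a QR decomposition and confirms that its inverse equals its transpose.

```python
import numpy as np

# The Q factor of a QR decomposition has orthonormal columns,
# so it is an orthogonal matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# For an orthogonal matrix, the inverse equals the transpose.
assert np.allclose(np.linalg.inv(Q), Q.T)
```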
[[Universal Quantifier|For any]] two [[Vector|Vectors]] $\vec u, \vec v \in \R^{n}$, the [[Vector Magnitude|Magnitude]] of each [[Vector]] is unchanged when [[Matrix Product|multiplied]] by the matrix, and the angle $\theta$ between $\vec u$ and $\vec v$ is equal to the angle between $A\vec u$ and $A\vec v$.
$\large\begin{align}
\forall \vec u, \vec v &\in \R^{n}\\
\lvert \lvert \vec u \rvert \rvert &= \lvert \lvert A\vec u \rvert \rvert \\
\lvert \lvert \vec v \rvert \rvert &= \lvert \lvert A\vec v \rvert \rvert \\
\theta_{\angle \vec u, \vec v} &= \theta_{\angle A\vec u, A\vec v}
\end{align} $
A more concise way of expressing this relationship is to say that the [[Dot Product]] of the two [[Vector|Vectors]] equals the [[Dot Product]] of the transformed [[Vector|Vectors]].
$\huge \vec u \cdot \vec v = (A\vec u) \cdot (A\vec v) $
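A quick numerical sanity check of this identity (the dimensions, seed, and variable names below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthogonal matrix
u = rng.standard_normal(3)
v = rng.standard_normal(3)

# Dot products are preserved: (Qu)·(Qv) = u·(Q^T Q)v = u·v.
assert np.isclose(u @ v, (Q @ u) @ (Q @ v))
# In particular, magnitudes are preserved.
assert np.isclose(np.linalg.norm(u), np.linalg.norm(Q @ u))
```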
The product of [[Orthogonal Matrix|orthogonal matrices]] is another orthogonal matrix. Succinctly, [[Orthogonal|orthogonality]] of matrices is [[Closure|closed]] under composition.
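The closure property can be verified directly; a minimal sketch, with arbitrary sizes and seed:

```python
import numpy as np

rng = np.random.default_rng(2)
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthogonal
B, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthogonal
C = A @ B  # matrix product = composition of the two transformations

# C^T C = B^T (A^T A) B = B^T B = I, so C is orthogonal as well.
assert np.allclose(C.T @ C, np.eye(3))
```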
### [[Gram Matrix]]
The [[Gram Matrix]] of an [[Orthogonal Matrix]] is the [[Identity Matrix]].
>[!tldr] Proof that the [[Gram Matrix]] is the [[Identity Matrix]].
>$\large
>\begin{align}
>G_{A} &= AA^T \\
>&= AA^{-1} \\
>&= I
>\end{align}
>$
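The derivation above can be mirrored numerically (a sketch; the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # orthogonal matrix

# The Gram matrix A A^T collapses to A A^{-1} = I.
G = A @ A.T
assert np.allclose(G, np.eye(4))
```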
### Column Vectors
[[Universal Quantifier|For any]] [[Orthogonal Matrix]] $A \in M_{n\times n}$, every [[Vector|Column Vector]] $\vec A_{1}, \dots, \vec A_{n}$ must be a [[Unit Vector]] and must be [[Orthogonal]] to every other [[Vector|Column Vector]].
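Checking the columns one at a time makes this concrete (an illustrative sketch, not part of the note):

```python
import numpy as np

rng = np.random.default_rng(5)
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthogonal matrix

for i in range(3):
    # Each column is a unit vector...
    assert np.isclose(np.linalg.norm(A[:, i]), 1.0)
    for j in range(i + 1, 3):
        # ...and is orthogonal to every other column.
        assert np.isclose(A[:, i] @ A[:, j], 0.0)
```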
### [[Determinant]]
The [[Determinant]] of an [[Orthogonal Matrix]] must be $1$ or $-1$ (note: this is not a [[Bijective]] [[Proposition|Statement]]; a [[Determinant]] of $\pm 1$ does not imply orthogonality).
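Both halves of this claim can be demonstrated numerically. The shear matrix below is one standard counterexample for the converse (chosen here for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # orthogonal matrix

# An orthogonal matrix has determinant +1 or -1.
assert np.isclose(abs(np.linalg.det(Q)), 1.0)

# The converse fails: this shear has determinant 1 but is not orthogonal.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
assert np.isclose(np.linalg.det(S), 1.0)
assert not np.allclose(S.T @ S, np.eye(2))
```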
### [[Eigenvector|Eigenvectors]]
[[Universal Quantifier|For All]] [[Eigenvalue|Eigenvalues]] of an [[Orthogonal Matrix]] $A$, the [[Absolute Value]] (modulus, since the [[Eigenvalue|Eigenvalues]] may be [[Complex Number|complex]]) of the [[Eigenvalue]] must be $1$.
$\huge \, \lvert \lambda_i \rvert = 1 $
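A numerical check of the eigenvalue property (size and seed are arbitrary; note that a real orthogonal matrix generally has complex eigenvalues on the unit circle):

```python
import numpy as np

rng = np.random.default_rng(7)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # orthogonal matrix

eigvals = np.linalg.eigvals(Q)  # generally complex
# Every eigenvalue has modulus 1.
assert np.allclose(np.abs(eigvals), 1.0)
```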