$\huge \begin{align*}
A \vec x &= \lambda \vec x
\end{align*}$
$\vec x$ is an *Eigenvector* of a square [[Matrix]] $A$, and $\lambda$ is an [[Eigenvalue]] of $A$.
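As a quick numerical check (a minimal sketch using NumPy; the matrix `A` below is an arbitrary example, not from this note), `np.linalg.eig` returns the [[Eigenvalue|Eigenvalues]] and [[Eigenvector|Eigenvectors]] of $A$, and each pair satisfies $A\vec x = \lambda \vec x$:
```python
import numpy as np

# An arbitrary 2x2 example matrix (assumption, for illustration only)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    # Verify the defining property A x = lambda x (up to floating-point error)
    assert np.allclose(A @ x, lam * x)
    print(f"lambda = {lam:.4f}, x = {x}")
```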
#### Fixed Points ($\lambda=1$)
Every non-zero [[Fixed Points|Fixed Point]] of a [[Matrix]] [[Function|Transformation]] $A$ is an [[Eigenvector]] of $A$ with an [[Eigenvalue]] of $1$.
$
\begin{align}
A\vec x &= \vec x\\
A\vec x - \vec x &= \vec 0\\
(A-I)\vec x &= \vec 0 \\
\left[\begin{array}{c|c} A-I & \vec 0 \end{array}\right]
\end{align}
$
This means that [[Existential Quantifier|there exists]] a non-zero fixed point of $A$ [[Biconditional|if and only if]] any of the following equivalent conditions holds (see the sketch after this list):
- The column [[Vector|Vectors]] of $A-I$ are [[Linear Dependence|Linearly Dependent]]
- $\operatorname{rank}(A-I) < n$ (where $A$ is $n \times n$)
- $A-I$ is *not* [[Inverse Matrices|Invertible]]
- The [[Determinant]] of $A-I$ is zero: $\det(A-I)=0$
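A minimal sketch of this check with NumPy (the `nonzero_fixed_point` helper and the example matrices are assumptions for illustration, not part of this note): it tests whether $A-I$ is singular via its singular values and, if it is, returns a basis vector of the null space of $A-I$ as a non-zero fixed point.
```python
import numpy as np

def nonzero_fixed_point(A, tol=1e-10):
    """Return a non-zero fixed point of A if one exists, else None.

    A non-zero fixed point exists iff A - I is singular; any vector in
    the null space of A - I is then fixed by A.
    """
    n = A.shape[0]
    M = A - np.eye(n)
    # The null space of M is spanned by the right-singular vectors whose
    # singular values are (numerically) zero.
    _, s, Vt = np.linalg.svd(M)
    null_mask = s <= tol
    if not null_mask.any():
        return None           # rank(A - I) == n, only the zero vector is fixed
    x = Vt[null_mask][0]      # one basis vector of the null space of A - I
    assert np.allclose(A @ x, x)
    return x

# A shear fixes every vector on the x-axis (example matrix, chosen for illustration)
shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
print(nonzero_fixed_point(shear))      # e.g. [1., 0.] up to sign

# A rotation by 90 degrees has no non-zero fixed point
rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])
print(nonzero_fixed_point(rotation))   # None
```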