The Logistic [[Function]] (or [[Logistic Function|Sigmoid Function]]) $\sigma(x)$ is a [[Real Numbers|real]]-valued function widely used in [[Probability]] and [[Machine Learning]].
$\huge \sigma(x) = \frac{1}{1+e^{-x}} $
$\huge
\sigma(x) = \frac{1}{2}\left(1 + \tanh\left( \frac{x}{2} \right)\right) $
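Both forms give the same values, which is easy to check numerically. A minimal Python sketch (the `sigmoid` helper is illustrative, written to avoid overflow for large negative inputs):

```python
import math

def sigmoid(x):
    """Logistic function, written to avoid overflow for large |x|."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)  # safe: x < 0, so exp(x) < 1
    return z / (1.0 + z)

# the tanh form agrees with the exponential form
for x in (-5.0, -1.0, 0.0, 2.5):
    assert abs(sigmoid(x) - 0.5 * (1 + math.tanh(x / 2))) < 1e-12
```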
### Properties
$\huge \begin{align}
\sigma(0) &= \frac{1}{2} \\
\lim_{ x \to \infty } \sigma(x) &= 1\\
\lim_{ x \to -\infty } \sigma(x) &= 0 \\
\end{align}$
Reflection identity:
$\huge \sigma(-x) = 1 - \sigma(x)
$
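The reflection identity says the curve is symmetric about the point $(0, \tfrac{1}{2})$; a quick numeric check (Python sketch):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# sigma(-x) + sigma(x) = 1 for all x
for x in (-3.0, 0.0, 0.7, 10.0):
    assert abs(sigmoid(-x) - (1 - sigmoid(x))) < 1e-12
```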
[[Derivative]]:
$\huge \begin{align}
\sigma'(x) &= \frac{e^{-x}}{(1+e^{-x})^{2}} \\
&= \frac{1}{1+e^{-x}} \cdot
\frac{e^{-x}}{1+e^{-x}} \\
&= \sigma(x) \sigma(-x) \\
&= \sigma(x)(1-\sigma(x))
\end{align}
$
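The closed form $\sigma'(x)=\sigma(x)(1-\sigma(x))$ can be sanity-checked against a central finite difference (a Python sketch; names are illustrative):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # derivative via the identity sigma' = sigma * (1 - sigma)
    s = sigmoid(x)
    return s * (1 - s)

# compare against a central finite difference
h = 1e-6
for x in (-2.0, 0.0, 1.5):
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
    assert abs(numeric - sigmoid_prime(x)) < 1e-8
```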
### Modification
$\huge \begin{align}
\sigma(x;a,b) &= \frac{1}{1+e^{-b(x-a)}}
\end{align}$
where $a$ determines the 'center' (the point mapped to $\tfrac{1}{2}$) and $b$ determines the steepness of the transition.
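A small sketch of the shifted and scaled form (parameter names match the text above):

```python
import math

def sigmoid(x, a=0.0, b=1.0):
    """Logistic curve centered at a; b controls the steepness."""
    return 1.0 / (1.0 + math.exp(-b * (x - a)))

# the center a always maps to 1/2, regardless of b
assert abs(sigmoid(3.0, a=3.0, b=10.0) - 0.5) < 1e-12
# larger b gives a sharper transition around the center
assert sigmoid(3.5, a=3.0, b=10.0) > sigmoid(3.5, a=3.0, b=1.0)
```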
This extends to inputs of any dimension:
$\huge
\sigma(\vec{\mathbf{x}};\vec c) =
\frac{1}{
1+ e^{-(\vec c \cdot \vec{\mathbf{x}})}
}
$
where $\vec c$ is a [[Vector]] of constants (e.g. $\vec c=\begin{pmatrix}a\\b\end{pmatrix}$).
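The vector form is just the logistic function applied to a dot product $\vec c \cdot \vec x$ (a Python sketch; `sigmoid_vec` is an illustrative name):

```python
import math

def sigmoid_vec(x, c):
    """Logistic function applied to the dot product c . x."""
    z = sum(ci * xi for ci, xi in zip(c, x))
    return 1.0 / (1.0 + math.exp(-z))

# with c = [0.5, -0.25] and x = [1, 2], c . x = 0, so the output is 1/2
assert abs(sigmoid_vec([1.0, 2.0], [0.5, -0.25]) - 0.5) < 1e-12
```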
>[!info]
>To fit this model to a dataset, you can treat it as a two-variable [[Function]] (of $a,b$) and use something like [[Gradient Descent]].
>See Likelihood ![[202601261555]]
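A minimal gradient-descent fit of $a,b$, minimizing the cross-entropy (negative log-likelihood). A sketch with made-up toy data and a plain loop in place of a library optimizer; it uses the fact that for cross-entropy loss, $\partial L/\partial z = \sigma(z) - y$ where $z = b(x-a)$:

```python
import math

def sigmoid(z):
    """Numerically stable logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def fit_sigmoid(xs, ys, lr=0.1, steps=2000):
    """Fit a, b in sigma(b * (x - a)) by gradient descent on the
    average cross-entropy loss. Hyperparameters are illustrative."""
    a, b = 0.0, 1.0
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(b * (x - a))
            err = p - y               # d(loss)/dz for cross-entropy
            ga += err * (-b) / n      # chain rule: dz/da = -b
            gb += err * (x - a) / n   # chain rule: dz/db = x - a
        a -= lr * ga
        b -= lr * gb
    return a, b

# toy data: labels flip from 0 to 1 around x = 2
xs = [0.0, 1.0, 1.5, 2.5, 3.0, 4.0]
ys = [0, 0, 0, 1, 1, 1]
a, b = fit_sigmoid(xs, ys)
```

After fitting, the learned center $a$ lands between the two label groups, so every training point falls on the correct side of $\sigma = \tfrac{1}{2}$.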