The PDF can be expressed compactly in terms of this distance as $f(\mathbf{x}) = (2\pi)^{-k/2} |\boldsymbol{\Sigma}|^{-1/2} \exp(-D^2/2)$. Surfaces of constant density are ellipsoids defined by $D^2 = c$, and the squared Mahalanobis distance follows a chi-squared distribution: $D^2 \sim \chi^2_k$. This property is used for multivariate outlier detection -- a point is flagged as an outlier if $D^2$ exceeds the $\chi^2_k$ critical value at the desired significance level.
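The chi-squared test described above can be sketched in a few lines of NumPy/SciPy. This is a minimal illustration, not a reference implementation: the data, the significance level, and the use of the sample mean and covariance (rather than known population parameters) are all assumptions made for the example.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

# Sample data: 500 points from a 2-D normal, plus one obvious outlier.
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 2.0]], size=500)
X = np.vstack([X, [8.0, 8.0]])

# Estimate mu and Sigma from the sample.
mu = X.mean(axis=0)
Sigma_inv = np.linalg.inv(np.cov(X, rowvar=False))

# Squared Mahalanobis distance: D^2 = (x - mu)^T Sigma^{-1} (x - mu).
diff = X - mu
d2 = np.einsum("ij,jk,ik->i", diff, Sigma_inv, diff)

# Flag points whose D^2 exceeds the chi^2_k critical value at alpha = 0.01.
k = X.shape[1]
threshold = chi2.ppf(0.99, df=k)
outliers = np.where(d2 > threshold)[0]
```

With `alpha = 0.01` and `k = 2` the threshold is about 9.21, so the injected point at (8, 8) is flagged while the bulk of the sample is not. Note that at a 1% significance level roughly 1% of genuinely normal points will also exceed the threshold, so flagged points are candidates for inspection, not confirmed anomalies.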
**Marginal distributions.** Any subset of variables from a multivariate normal is itself multivariate normal. If the full vector is partitioned as $\mathbf{X} = (\mathbf{X}_a, \mathbf{X}_b)^T$ with corresponding partitioned mean and covariance: