Multivariate Gaussian distribution - Covariance vs linear dependence

From Prof. Andrew Ng's lecture on the multivariate Gaussian distribution: covariance measures the linear dependence between features, and in that case we might use the multivariate Gaussian model with a covariance matrix. But it is also said that if features are redundant (for example, x1 = 2 * x2, so a linear dependence clearly exists between them), the covariance matrix is not invertible and the multivariate Gaussian model with that covariance matrix cannot be used. To me, these statements look contradictory.



Question: What is the difference between covariance as a measure of linear dependency and an exact linear dependency between features?

Topic gaussian anomaly-detection clustering machine-learning

Category Data Science


Assuming the data follows a multivariate Gaussian distribution, the covariance matrix is a measure of the pairwise linear relationships between the features.
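A minimal sketch of this (assuming NumPy; the coefficients and sample size are arbitrary choices for illustration): two features that are partially, but not perfectly, linearly related produce a covariance matrix with a non-zero off-diagonal entry that is still invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=10_000)
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=10_000)  # partial linear relationship plus noise
X = np.column_stack([x1, x2])

cov = np.cov(X, rowvar=False)
print(cov)                 # off-diagonal entry near 0.8 reflects the linear dependence
print(np.linalg.det(cov))  # determinant is clearly positive, so the matrix is invertible
```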

If one feature is an exact linear combination of the other features, the data can still be assumed to follow a multivariate Gaussian distribution. However, because of that linear dependence the covariance matrix is singular, so its inverse cannot be computed, and the multivariate Gaussian density (which requires the inverse of the covariance matrix) cannot be evaluated.
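A sketch of the degenerate case from the question (again assuming NumPy, with x1 = 2 * x2 exactly): the covariance matrix becomes singular, its determinant is zero, and inverting it fails.

```python
import numpy as np

rng = np.random.default_rng(0)
x2 = rng.normal(size=10_000)
x1 = 2.0 * x2                     # exact linear dependence between the features
X = np.column_stack([x1, x2])

cov = np.cov(X, rowvar=False)
print(np.linalg.det(cov))         # ~0: the covariance matrix is singular
try:
    np.linalg.inv(cov)
except np.linalg.LinAlgError as err:
    print("inverse fails:", err)  # singular matrix, so the Gaussian density is undefined
```

So covariance as "linear dependence" refers to how strongly features co-vary; it only breaks the model when the dependence is exact (redundant features), which is what makes the matrix singular.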
