What is the difference between the Standard Normal Distribution and Mean Normalization approaches to feature scaling?

The tag feature-scaling seems to suggest that one of the scaling methods is the Standard Normal Distribution. Further, I read an answer on this site saying that Mean Normalization is a form of feature scaling.

What is the difference between the two approaches to scaling?

Note: I suspect the statistical and mathematical senses of normalization differ.

Topic normalization feature-scaling sampling statistics definitions

Category Data Science


The terms standardization and normalization are often used interchangeably. Strictly speaking, however, they refer to distinct feature transformations.

Normalization

Normalization, also called feature scaling, usually means rescaling the data to lie between 0 and 1. There are many ways to achieve this; one common approach is min-max scaling:

$x' = \frac{x - x_{min}}{x_{max} - x_{min}}$
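As a minimal sketch, the formula above can be applied directly with NumPy (the feature values here are made-up illustrative data):

```python
import numpy as np

# Hypothetical example feature values
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Min-max normalization: rescale each value into [0, 1]
x_norm = (x - x.min()) / (x.max() - x.min())

print(x_norm)  # [0.   0.25 0.5  0.75 1.  ]
```

The smallest value maps to 0 and the largest to 1; every other value lands proportionally in between.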

Standardization

Standardization transforms the feature to have a mean of 0 and a standard deviation of 1. This is also called z-scoring and can be achieved by

$x_i' = \frac{x_i - \bar{x}}{s}$

where $\bar{x}$ is the mean of the feature and $s$ is the standard deviation of the feature.
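Standardization can be sketched the same way; `ddof=1` selects the sample standard deviation $s$ used in the formula above (the input values are made-up illustrative data):

```python
import numpy as np

# Hypothetical example feature values
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# z-scoring: subtract the mean, divide by the sample standard deviation
x_std = (x - x.mean()) / x.std(ddof=1)

print(round(x_std.mean(), 10))       # 0.0
print(round(x_std.std(ddof=1), 10))  # 1.0
```

Unlike min-max normalization, the result is not bounded to [0, 1]: the values are centered at 0, and outliers can produce z-scores well beyond ±1.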
