Handling bias inputs during normalization

Suppose I have an input matrix $\mathbf X \in \mathbb R^{(D+1)\times N}$, where $N$ is the number of samples, $D$ is the dimension of an input vector $x$, and the extra dimension is for the bias, with all bias entries equal to $1$. If I want to normalize all inputs by subtracting the mean and dividing by the standard deviation, how should I handle the bias entries? Should they stay as $1$?
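For reference, here is a minimal sketch of the setup I have in mind, assuming the bias row is the last row of $\mathbf X$ and that only the $D$ feature rows are standardized (the row names and shapes below are my own illustration, not from any particular library). Note that including the bias row in the standardization would mean dividing by a standard deviation of zero, since every bias entry is identical.

```python
import numpy as np

# Hypothetical example: D = 3 features, N = 5 samples, bias row of 1s appended last
D, N = 3, 5
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(D, N)), np.ones((1, N))])  # shape (D+1, N)

# Standardize only the D feature rows; mean/std are taken across samples (axis=1)
features = X[:D]                                   # (D, N) feature block
mean = features.mean(axis=1, keepdims=True)        # per-feature mean, shape (D, 1)
std = features.std(axis=1, keepdims=True)          # per-feature std, shape (D, 1)

X_norm = X.copy()
X_norm[:D] = (features - mean) / std               # bias row X_norm[D] stays all 1s
```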

Topic: bias normalization

Category: Data Science
