What is the correct equation for the LR decision boundary?

I read that the equation of the perceptron decision boundary is given as follows: $$w^Tx-w_0=0$$

This can be derived as follows:

Assuming $w$ is a unit vector (we can multiply the equation above by a constant so that $w$ becomes a unit vector and the equation still holds), by the definition of the dot product, $$w^Tx=\Vert w\Vert\, p_{x\rightarrow w}=1\cdot w_0$$ where $p_{x\rightarrow w}$ is the scalar projection of $x$ onto $w$ and $\Vert w\Vert$ is the magnitude of $w$.

This gives: $$w^Tx-w_0=0$$
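
To convince myself of this, I put together a small numerical check (a sketch assuming NumPy; the unit vector $w$, the offset $w_0$, and the way the boundary point is constructed are my own made-up example, not from the text I read):

```python
import numpy as np

w = np.array([0.6, 0.8])          # unit vector: ||w|| = 1
w0 = 2.0                          # assumed offset of the boundary from the origin

# Any point on the boundary is w0 * w plus some component orthogonal to w,
# so its scalar projection onto w should be exactly w0.
t = 1.5                           # arbitrary step along the boundary
orth = np.array([-w[1], w[0]])    # direction orthogonal to w
x = w0 * w + t * orth             # a point on the decision boundary

print(np.dot(w, x))               # ~2.0, i.e. equals w0
print(np.dot(w, x) - w0)          # ~0.0, so w^T x - w_0 = 0 holds
```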

But I have also come across texts and videos specifying it as $$w^Tx\color{red}{+}w_0=0$$

Q1. Is the $\color{red}{+}$ sign for $w_0$ correct and the negative sign incorrect? Or is it just a convention for writing the linear equation?
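
Here is a quick check I tried (again a sketch with NumPy; setting $b=-w_0$ is my own assumption about how the two forms might relate), which seems to show that both forms describe the same hyperplane once the sign of the offset is absorbed into the bias:

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([0.6, 0.8])
w0 = 2.0
b = -w0                            # hypothetical bias used in the "+" form

for _ in range(5):
    x = rng.normal(size=2)         # arbitrary test points
    lhs_minus = np.dot(w, x) - w0  # the "-" convention
    lhs_plus = np.dot(w, x) + b    # the "+" convention with b = -w0
    assert np.isclose(lhs_minus, lhs_plus)
print("both forms agree once the sign of the offset moves into the bias")
```

If this is right, the two equations differ only in the sign convention for the bias term, but I would like confirmation.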

Q2. I also have a new doubt: why does $p_{x\rightarrow w}=w_0$ hold?

Tags: bias, perceptron, logistic-regression, neural-network
