Regarding Keras mean_squared_error losses

I am trying to run a RandomizedSearchCV for a simple regression task: essentially two inputs mapped to one output.
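For reference, a minimal sketch of this kind of search. Since the exact model and data are not shown, scikit-learn's `MLPRegressor` stands in for the Keras model here, and the data, parameter grid, and all names are assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in data: two inputs, one output (an assumption)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Hypothetical search space
param_distributions = {
    "hidden_layer_sizes": [(8,), (16,), (32,)],
    "alpha": [1e-4, 1e-3, 1e-2],
}

search = RandomizedSearchCV(
    MLPRegressor(max_iter=2000, random_state=0),
    param_distributions,
    n_iter=5,
    scoring="neg_mean_squared_error",  # higher (closer to 0) is better
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_score_)  # negative: the negated cross-validated MSE
```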

Upon inspecting the resulting models, it appears that the 'best' model has a mean_squared_error of around -8.3, while other 'non-best' models had a mean_squared_error close to 0 (around 0.004).

In this case, is the optimized model selection correct? Why does it select a model whose mean_squared_error is not close to zero?

Shouldn't the best model have an MSE close to zero? I understand that the optimization seeks to minimize the loss, but how should I interpret this?
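For context on the sign: scikit-learn's `"neg_mean_squared_error"` scorer, commonly used with RandomizedSearchCV for regression, reports the *negated* MSE, so scores are always ≤ 0 and higher (closer to zero) is better. A minimal sketch of that convention, using a toy linear fit:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import get_scorer

# The scorer negates MSE so that "greater is better" holds for all scorers
scorer = get_scorer("neg_mean_squared_error")

# Toy data a linear model fits perfectly
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 2.0])

model = LinearRegression().fit(X, y)
score = scorer(model, X, y)
print(score)  # ~0.0 for a near-perfect fit; never positive
```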

Topic neural keras regression python machine-learning

Category Data Science
