Why is my loss so high?

I am struggling to understand why I am getting such a high loss/val_loss on my training. I am training a regression network. I've normalized the input data to the range -1 to 1 and left the output data unaltered; its range is approximately -100 to 100.

I chose to normalize the input so that I could use tanh as the activation function, since tanh outputs values within that same range.
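
For reference, a minimal sketch of that kind of min-max scaling (assuming NumPy; the array X below is an illustrative stand-in for the raw audio frames, not the actual data):

    import numpy as np

    # Illustrative stand-in for the raw audio data: 1000 frames of 400 samples
    X = np.random.randint(-32768, 32767, size=(1000, 400)).astype(np.float32)

    # Min-max scale into [-1, 1] using the global min/max of the training data
    x_min, x_max = X.min(), X.max()
    X_scaled = 2.0 * (X - x_min) / (x_max - x_min) - 1.0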

The neural network consists of 3 layers.

    from keras.models import Sequential
    from keras.layers import Dense
    from keras.layers.advanced_activations import ELU

    print("Model definition!")
    model = Sequential()

    # First hidden layer (a PReLU advanced activation was also tried here)
    model.add(Dense(output_dim=400, input_dim=400, init='normal', activation='tanh'))

    # Second hidden layer
    model.add(Dense(output_dim=400, init='normal', activation='tanh'))

    # A third 400-unit tanh hidden layer was tried and commented out:
    # model.add(Dense(output_dim=400, init='normal', activation='tanh'))

    # Output layer: 13 MFCC coefficients. ELU is an advanced-activation
    # layer in Keras 1, so it is added as its own layer rather than passed
    # as the activation argument.
    model.add(Dense(output_dim=13, init='normal'))
    model.add(ELU(alpha=100))
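
The compile/fit step isn't shown in the question; presumably it looks something like the following, using the old Keras 1 API (a sketch assuming mean squared error as the loss; X_scaled and Y stand for the normalized inputs and the MFCC targets):

    model.compile(loss='mse', optimizer='adam')
    model.fit(X_scaled, Y, nb_epoch=20, batch_size=32, validation_split=0.1)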

The network maps audio samples to MFCC features. The samples are the ones I've normalized to the aforementioned range.
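
For context, 13-coefficient MFCC targets can be computed along these lines (a sketch assuming librosa; the actual feature pipeline isn't shown in the question):

    import numpy as np
    import librosa

    # Illustrative: one second of audio at 16 kHz
    y = np.random.uniform(-1.0, 1.0, size=16000).astype(np.float32)

    # 13 MFCCs per frame; transpose so each row is one frame's 13-value target
    mfcc = librosa.feature.mfcc(y=y, sr=16000, n_mfcc=13).T  # shape (n_frames, 13)

Raw MFCCs, the 0th coefficient in particular, routinely reach magnitudes on the order of 100, which is consistent with the target range described above.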

Why am I getting these results?

Am I doing anything that is obviously wrong?

Normalizing the output range to ±1: [second loss/val_loss plot, not reproduced here]

Why is val_loss lower than the training loss?

Topic: loss-function, hyperparameter, neural-network

Category: Data Science


As your two training curves show, neural networks are sensitive to the scale of both the features and the targets. With targets ranging over roughly ±100, even a reasonable fit produces a large absolute loss; if the loss is mean squared error, an average error of just 20 per coefficient already yields an MSE of 400. Normalizing the targets as well as the features puts the loss on an interpretable scale and generally makes learning easier.
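
A minimal sketch of that target scaling, with the inverse transform applied to predictions (assuming NumPy; Y below is a stand-in for the raw MFCC target matrix, not the asker's actual data):

    import numpy as np

    # Stand-in for the raw MFCC targets, roughly in [-100, 100]
    Y = np.random.uniform(-100.0, 100.0, size=(1000, 13))

    # Fit the min/max scaling on the training targets only
    y_min, y_max = Y.min(axis=0), Y.max(axis=0)
    Y_scaled = 2.0 * (Y - y_min) / (y_max - y_min) - 1.0

    # Map network outputs back to the original MFCC range after prediction
    def inverse_scale(y_pred):
        return (y_pred + 1.0) / 2.0 * (y_max - y_min) + y_min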
