How do I get the loss function graph?

I used mini-batch gradient descent to train the model, but I am unable to get a proper loss graph. The loss graph always shows up as a straight line. I know there is something wrong, but would anyone be able to guide me?

import numpy as np
import matplotlib.pyplot as plt

error = []

for epoch in range(epochs):
    for i in range(0, x_train.shape[0], minibatch_size):
        # the end index of a slice is exclusive, so no -1 is needed
        x_mini = x_train[i:i + minibatch_size, :]
        y_mini = y_train[i:i + minibatch_size, :]

        # feed forward
        # layer 1
        in1 = x_mini @ w1 + b1
        out1 = sigmoid(in1)

        # layer 2
        in2 = out1 @ w2 + b2
        out2 = sigmoid(in2)

    if epoch % 10 == 0:  # record the loss every 10 epochs
        error.append(0.5 * np.power(y_mini - out2, 2).mean())

z = np.arange(len(error))
plt.plot(z, error, label='train', color='red')
plt.legend(loc='best')
plt.show()

I wanted to plot the loss before and after training, but I can't seem to get a proper one.

Topic matplotlib mini-batch-gradient-descent gradient-descent deep-learning python

Category Data Science


The code you provided only performs the forward pass of the neural network; there is no backpropagation step that uses the gradients of the loss to update the parameters. This means your network parameters never change, i.e. your model is not learning anything. The model will always output the same value for a given input in your training data, so the loss always stays the same, which is why you see a flat line.
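To illustrate, here is a minimal sketch of what the missing update step could look like, assuming the same two-layer sigmoid network and 0.5 * MSE loss as in your code. The data shapes, layer sizes, learning rate, and epoch count below are made up for the example; substitute your own.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# toy data and parameters (shapes chosen arbitrarily for illustration)
x_train = rng.normal(size=(64, 3))
y_train = rng.uniform(size=(64, 1))
w1 = rng.normal(scale=0.5, size=(3, 4)); b1 = np.zeros((1, 4))
w2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros((1, 1))

lr, minibatch_size, epochs = 0.5, 16, 200
error = []

for epoch in range(epochs):
    for i in range(0, x_train.shape[0], minibatch_size):
        x_mini = x_train[i:i + minibatch_size]   # slice end is exclusive, no -1
        y_mini = y_train[i:i + minibatch_size]

        # feed forward (same as in the question)
        out1 = sigmoid(x_mini @ w1 + b1)
        out2 = sigmoid(out1 @ w2 + b2)

        # backpropagate the 0.5*(y - out2)^2 loss
        delta2 = (out2 - y_mini) * out2 * (1 - out2)   # dL/d(in2)
        delta1 = (delta2 @ w2.T) * out1 * (1 - out1)   # dL/d(in1)

        # gradient-descent parameter updates: this is the missing piece
        n = x_mini.shape[0]
        w2 -= lr * out1.T @ delta2 / n
        b2 -= lr * delta2.mean(axis=0, keepdims=True)
        w1 -= lr * x_mini.T @ delta1 / n
        b1 -= lr * delta1.mean(axis=0, keepdims=True)

    # record the full-training-set loss once per epoch
    out = sigmoid(sigmoid(x_train @ w1 + b1) @ w2 + b2)
    error.append(0.5 * np.power(y_train - out, 2).mean())
```

With the parameter updates in place, `error` decreases over the epochs, so plotting it with `plt.plot(error)` gives a falling curve instead of a straight line.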
