How do I get the loss function graph?
I used mini-batch gradient descent to train the model, but I am unable to get a proper loss graph: the loss always shows up as a straight line. I know something is wrong, but would anyone be able to guide me?
import numpy as np
import matplotlib.pyplot as plt

error = []
for epoch in range(epochs):
    for i in range(0, x_train.shape[0], minibatch_size):
        # the end index of a slice is exclusive, so no -1 is needed
        x_mini = x_train[i:i + minibatch_size, :]
        y_mini = y_train[i:i + minibatch_size, :]

        # feed forward
        # layer 1
        in1 = x_mini @ w1 + b1
        out1 = sigmoid(in1)
        # layer 2
        in2 = out1 @ w2 + b2
        out2 = sigmoid(in2)

        if epoch % 10 == 0:  # record the loss every 10 epochs
            error.append(0.5 * np.power(y_mini - out2, 2).mean())

z = np.arange(len(error))
plt.plot(z, error, label='train', color='red')
plt.legend(loc='best')
plt.show()
I wanted to see the loss graph before and after training, but I can't seem to get a proper one.
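For completeness, here is a minimal self-contained version of the kind of loop I'm aiming for. The data, layer sizes, and learning rate are hypothetical placeholders, and it includes the weight-update (backpropagation) step that my snippet above omits, since the forward pass alone never changes `w1`, `w2`, `b1`, or `b2`:

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: 200 samples, 3 features, binary target (hypothetical)
x_train = rng.normal(size=(200, 3))
y_train = (x_train.sum(axis=1, keepdims=True) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# small 3-4-1 network with random initial weights (hypothetical sizes)
hidden = 4
w1 = rng.normal(scale=0.5, size=(3, hidden)); b1 = np.zeros((1, hidden))
w2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros((1, 1))

lr, epochs, minibatch_size = 0.5, 50, 32
errors = []
for epoch in range(epochs):
    for i in range(0, x_train.shape[0], minibatch_size):
        x_mini = x_train[i:i + minibatch_size]  # slice end is exclusive
        y_mini = y_train[i:i + minibatch_size]

        # feed forward
        out1 = sigmoid(x_mini @ w1 + b1)
        out2 = sigmoid(out1 @ w2 + b2)

        # backward pass for MSE loss with sigmoid activations
        d2 = (out2 - y_mini) * out2 * (1 - out2)
        d1 = (d2 @ w2.T) * out1 * (1 - out1)

        # gradient-descent weight updates (averaged over the mini-batch)
        w2 -= lr * out1.T @ d2 / len(x_mini)
        b2 -= lr * d2.mean(axis=0, keepdims=True)
        w1 -= lr * x_mini.T @ d1 / len(x_mini)
        b1 -= lr * d1.mean(axis=0, keepdims=True)

    # record the full-dataset loss once per epoch
    pred = sigmoid(sigmoid(x_train @ w1 + b1) @ w2 + b2)
    errors.append(0.5 * np.mean((y_train - pred) ** 2))
```

Plotting `errors` against `np.arange(len(errors))` with `plt.plot` should then give a curve that actually decreases over the epochs instead of a flat line.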