Where do the gradients that I use to adjust the weights come from?

I have a question about the loss function and the gradient.

I'm following the fastai course (https://github.com/fastai/fastbook), and at the end of the 4th chapter I found myself wondering:

Where do the gradients that I use to adjust the weights come from?

I understand that the loss function is being differentiated. But which one? Can I see it? Or is it hidden under the hood of PyTorch?
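To make my question concrete, here is a minimal standalone sketch of how I understand gradients to appear (the quadratic "loss" below is something I made up so the gradient can be checked by hand; it is not the loss from the chapter):

import torch

# a parameter tensor that PyTorch tracks for gradients
w = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# a made-up scalar "loss" built from w: sum(w**2)
loss = (w ** 2).sum()

# backward() differentiates the loss with respect to every tracked
# tensor and writes the result into its .grad attribute
loss.backward()

print(w.grad)  # tensor([2., 4., 6.]), i.e. d(sum(w**2))/dw = 2*w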

Code of the step function:

def step(self):
    # gradient descent update: nudge each parameter against its
    # gradient, scaled by the learning rate
    self.w.data -= self.w.grad.data * self.lr
    self.b.data -= self.b.grad.data * self.lr

So specifically, I'm interested in where w.grad and b.grad come from.
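For context, the surrounding training code in the chapter looks roughly like this (my paraphrase from memory, so details may be slightly off; mnist_loss and calc_grad are the book's names):

import torch

def mnist_loss(predictions, targets):
    # distance between the sigmoid-squashed predictions and the
    # 0/1 targets, averaged into a single scalar
    predictions = predictions.sigmoid()
    return torch.where(targets == 1, 1 - predictions, predictions).mean()

def calc_grad(xb, yb, model):
    preds = model(xb)
    loss = mnist_loss(preds, yb)
    # this is the call that differentiates the loss and fills in
    # .grad for every tensor created with requires_grad=True
    loss.backward()

So if I read this right, the gradients used in step() come from whatever loss tensor backward() was called on, but I'd like to confirm that and to understand where that derivative is actually computed.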

Tags: fastai, pytorch, loss-function, gradient-descent
