Derivative of a custom loss function with the logistic function
I have a custom loss function with $\mu, p, o, u, v$ as variables, where $\sigma$ is the logistic function.
I need to differentiate this loss function. Since it has multiple variables, do I need to use the softmax function, which is the generalization of the logistic function?
$$L = -\frac{1}{N}\sum_{(i,j) \in S} a_j\left\{ y_{i,j}\log\left[\sigma\left(\mu + p_i + o_j + u_i^{T}v_j\right)\right] + \left(1 - y_{i,j}\right)\log\left[1 - \sigma\left(\mu + p_i + o_j + u_i^{T}v_j\right)\right] \right\}$$
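Writing $z_{i,j} = \mu + p_i + o_j + u_i^{T}v_j$ and using the identity $\sigma'(z) = \sigma(z)\left(1 - \sigma(z)\right)$, here is my attempt at two of the partial derivatives (please correct me if the algebra is off):

$$\frac{\partial L}{\partial \mu} = -\frac{1}{N}\sum_{(i,j)\in S} a_j\left(y_{i,j} - \sigma(z_{i,j})\right), \qquad \frac{\partial L}{\partial u_i} = -\frac{1}{N}\sum_{j:\,(i,j)\in S} a_j\left(y_{i,j} - \sigma(z_{i,j})\right)v_j$$

and analogously for $p_i$, $o_j$, and $v_j$.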
As far as I understand, this is a multivariate gradient descent problem, so I need a separate update rule for each variable (taking the partial derivative with respect to that variable while treating the others as constants). But how do I merge them into one update rule, or is there maybe only one update rule from the beginning? I got pretty confused once it became a multivariable problem.
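For concreteness, here is a minimal sketch of what I mean by per-parameter updates applied simultaneously. The function and variable names are my own assumptions (not from any particular library), and I assume full-batch updates with a learning rate `lr`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(mu, p, o, U, V, S, Y, a, lr=0.01):
    """One full-batch step: accumulate every partial derivative first,
    then update all parameters together (simultaneous update)."""
    N = len(S)
    grad_mu = 0.0
    grad_p = np.zeros_like(p)
    grad_o = np.zeros_like(o)
    grad_U = np.zeros_like(U)
    grad_V = np.zeros_like(V)
    for (i, j) in S:
        z = mu + p[i] + o[j] + U[i] @ V[j]
        # dL/dz for this (i, j) term of the sum
        err = a[j] * (sigmoid(z) - Y[i, j]) / N
        grad_mu += err
        grad_p[i] += err
        grad_o[j] += err
        grad_U[i] += err * V[j]
        grad_V[j] += err * U[i]
    # simultaneous update: no parameter sees another's new value this step
    mu -= lr * grad_mu
    p -= lr * grad_p
    o -= lr * grad_o
    U -= lr * grad_U
    V -= lr * grad_V
    return mu, p, o, U, V
```

Is this the right way to think about it, i.e. there is really just one descent step that happens to touch every parameter at once?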
Topic: softmax, mathematics, gradient-descent, logistic-regression
Category: Data Science