Derivative of a custom loss function built on the logistic function

I have a custom loss function with $\mu, p, o, u, v$ as variables, where $\sigma$ is the logistic function.

I need to take the derivative of this loss function. Because there are multiple variables in the loss, do I need to use the softmax function, which is the generalization of the logistic function?

$L = -\frac{1}{N}\sum_{(i,j) \in S} a_j \left\{ y_{i,j} \log\left[\sigma(\mu + p_i + o_j + u_i^{T} v_j)\right] + (1 - y_{i,j}) \log\left[1 - \sigma(\mu + p_i + o_j + u_i^{T} v_j)\right] \right\}$

As far as I understand, this is a multivariate gradient descent problem, so I need to write a separate update rule for each variable (taking the partial derivative with respect to that variable while treating the others as constants). But how do I merge them into one update rule, or is there only one update rule from the beginning? I got pretty confused once it became a multivariable problem.

Topic: softmax, mathematics, gradient-descent, logistic-regression

Category: Data Science


You don't need the softmax here: the loss is a (weighted) binary cross-entropy with a single logistic output per $(i,j)$ pair, so the plain logistic function is enough. Softmax would only come in if each observation had more than two classes. There is also no single merged update rule; each parameter gets its own update using the partial derivative of the loss with respect to that parameter, and all updates are applied in the same iteration, i.e. (with learning rate $\eta$):

$\mu \leftarrow \mu - \eta\,{\partial L}/{\partial \mu}$
$p_i \leftarrow p_i - \eta\,{\partial L}/{\partial p_i}$
$o_j \leftarrow o_j - \eta\,{\partial L}/{\partial o_j}$
$u_i \leftarrow u_i - \eta\,{\partial L}/{\partial u_i}$
$v_j \leftarrow v_j - \eta\,{\partial L}/{\partial v_j}$
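As a sketch of the derivation (assuming $S$ is the set of observed $(i,j)$ pairs), write $z_{i,j} = \mu + p_i + o_j + u_i^{T} v_j$; the chain rule with $\sigma'(z) = \sigma(z)\,(1 - \sigma(z))$ collapses every gradient to a shared residual $y_{i,j} - \sigma(z_{i,j})$:

$$\frac{\partial L}{\partial \mu} = -\frac{1}{N}\sum_{(i,j)\in S} a_j\,\big(y_{i,j} - \sigma(z_{i,j})\big)$$
$$\frac{\partial L}{\partial p_i} = -\frac{1}{N}\sum_{j:(i,j)\in S} a_j\,\big(y_{i,j} - \sigma(z_{i,j})\big), \qquad \frac{\partial L}{\partial o_j} = -\frac{1}{N}\sum_{i:(i,j)\in S} a_j\,\big(y_{i,j} - \sigma(z_{i,j})\big)$$
$$\frac{\partial L}{\partial u_i} = -\frac{1}{N}\sum_{j:(i,j)\in S} a_j\,\big(y_{i,j} - \sigma(z_{i,j})\big)\,v_j, \qquad \frac{\partial L}{\partial v_j} = -\frac{1}{N}\sum_{i:(i,j)\in S} a_j\,\big(y_{i,j} - \sigma(z_{i,j})\big)\,u_i$$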

If the direct derivative is not obvious, apply the chain rule; here the identity $\sigma'(z) = \sigma(z)\,(1 - \sigma(z))$ makes the gradients straightforward.
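Putting the updates together, here is a minimal pure-Python SGD sketch for this loss. The toy data in `S`, the learning rate `eta`, the latent dimension `k`, and all initial values are illustrative assumptions, not taken from the question:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical toy setup: 2 rows, 2 columns, latent dimension 2.
# mu: global bias, p[i]: row bias, o[j]: column bias,
# u[i], v[j]: latent vectors, a[j]: per-column weight (all assumed).
random.seed(0)
n_i, n_j, k = 2, 2, 2
mu = 0.0
p = [0.0] * n_i
o = [0.0] * n_j
u = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_i)]
v = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_j)]
a = [1.0] * n_j
S = [(0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 1, 1)]  # observed (i, j, y_ij)
N = len(S)
eta = 0.5  # learning rate (assumed)

def loss():
    total = 0.0
    for i, j, y in S:
        z = mu + p[i] + o[j] + sum(ui * vj for ui, vj in zip(u[i], v[j]))
        s = sigmoid(z)
        total += a[j] * (y * math.log(s) + (1 - y) * math.log(1 - s))
    return -total / N

l0 = loss()
for epoch in range(500):
    for i, j, y in S:
        z = mu + p[i] + o[j] + sum(ui * vj for ui, vj in zip(u[i], v[j]))
        # Shared residual appearing in every partial derivative.
        err = a[j] * (y - sigmoid(z)) / N
        mu += eta * err
        p[i] += eta * err
        o[j] += eta * err
        u_old = u[i][:]  # snapshot so v's update uses the old u
        for d in range(k):
            u[i][d] += eta * err * v[j][d]
            v[j][d] += eta * err * u_old[d]
l1 = loss()
print(f"loss before: {l0:.4f}, after: {l1:.4f}")
```

All five parameter groups are updated within the same pass over the data, each with its own partial derivative; that is the "merge" — one iteration applies every update rule, not one combined formula.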
