Custom loss function for regression

I am trying to write a custom loss function for a machine learning regression task. What I want to accomplish is the following:

  • Reward high predictions when the target is also high
  • Punish high predictions when the target is low
  • Ignore low predictions when the target is low
  • Ignore low predictions when the target is high

All ideas are welcome; pseudocode or Python code works fine for me.

This is what I have tried so far. It does not work very well; I think that is because it only takes high predictions into account, not high targets:

import numpy as np
import torch
from fastai.torch_core import flatten_check  # fastai utility used to flatten and shape-check the tensors

def mae_high(inp, targ):
    inp, targ = flatten_check(inp, targ)
    # threshold at the median of the current batch of predictions
    thresh = np.percentile(inp.detach().numpy(), 50)
    # keep only the samples whose prediction is above the threshold
    mask = inp > thresh
    high_preds = torch.masked_select(inp, mask)
    high_targ = torch.masked_select(targ, mask)
    return torch.abs(high_preds - high_targ).mean()

Tags: loss, loss-function, deep-learning, python, machine-learning

Category: Data Science


I am not exactly sure what you want to achieve or in what kind of setting, but there are some well-known loss functions you might have a look at.

One option is the Huber loss, which is quadratic for small residuals and linear for large ones, so very large residuals do not dominate the fit and the predictions tend to come out more balanced. It is essentially a mix of L1 and L2 loss.
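For concreteness, here is a minimal PyTorch sketch of the Huber loss (recent PyTorch versions also ship it as torch.nn.HuberLoss); delta is the usual transition point between the quadratic and linear regimes and is something you would tune:

import torch

def huber_loss(pred, targ, delta=1.0):
    # quadratic (L2-like) for small residuals, linear (L1-like) for large ones
    resid = pred - targ
    abs_resid = resid.abs()
    quadratic = 0.5 * resid ** 2
    linear = delta * (abs_resid - 0.5 * delta)
    return torch.where(abs_resid <= delta, quadratic, linear).mean()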

Another, more flexible loss function is the "fair loss", which as far as I remember can be tuned to some extent through its constant (it is not well documented).
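There does not seem to be a standard PyTorch implementation, but the usual formulation is c^2 * (|r|/c - log(1 + |r|/c)); a sketch, with the constant c as the tunable knob mentioned above:

import torch

def fair_loss(pred, targ, c=1.0):
    # roughly quadratic near zero, grows almost linearly for large residuals;
    # c controls where that transition happens
    abs_resid = (pred - targ).abs()
    return (c ** 2 * (abs_resid / c - torch.log(1 + abs_resid / c))).mean()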

If Huber is just the opposite of what you want, you could try to "flip it around" and assign the L1 logic to "low" values and the L2 logic to "high" values, as sketched below.
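Here is a rough sketch of that idea; the threshold on the targets and the down-weighting factor for the low side are assumptions you would need to tune for your data:

import torch

def flipped_loss(pred, targ, thresh=0.0, low_weight=0.1):
    # L2 on samples with "high" targets so their errors dominate the loss,
    # down-weighted L1 on "low" targets so they are largely ignored
    resid = pred - targ
    high = targ > thresh
    return torch.where(high, resid ** 2, low_weight * resid.abs()).mean()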

I tried to implement both Huber Loss and Fair Loss in XGBoost.
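As an illustration of how that plugs into XGBoost's Python API, here is a sketch of a custom objective. I used the pseudo-Huber variant because XGBoost expects a gradient and a strictly positive hessian; delta is again an assumed hyperparameter:

import numpy as np
import xgboost as xgb

def pseudo_huber_objective(preds, dtrain, delta=1.0):
    # gradient and hessian of the pseudo-Huber loss, a smooth
    # approximation of Huber whose hessian never hits zero
    resid = preds - dtrain.get_label()
    scale = 1.0 + (resid / delta) ** 2
    grad = resid / np.sqrt(scale)
    hess = 1.0 / scale ** 1.5
    return grad, hess

# usage sketch: booster = xgb.train(params, dtrain, num_boost_round=200, obj=pseudo_huber_objective)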
