Which Neural Network or Gradient Boosting framework is the simplest for Custom Loss Functions?

I need to implement a custom loss function.

The function is relatively simple:

$$-\sum_{i=1}^{m} \left(O_{1,i} \cdot y_i - 1\right) \cdot \operatorname{ReLU}\!\left(O_{1,i} \cdot \hat{y}_i - 1\right)$$

where $O$ is an external attribute specific to each observation.
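In an autograd framework such as PyTorch this loss can be written directly, with no hand-derived gradients. A minimal sketch (function and variable names are my own; `O`, `y`, and `y_hat` are assumed to be 1-D tensors of the same length):

```python
import torch

def custom_loss(y_hat, y, O):
    """Sketch of the stated loss:
    -sum_i (O_i * y_i - 1) * ReLU(O_i * y_hat_i - 1)
    Autograd differentiates this automatically during backprop."""
    return -torch.sum((O * y - 1.0) * torch.relu(O * y_hat - 1.0))
```

Because every operation here is differentiable (ReLU almost everywhere), `loss.backward()` works out of the box, which is why a neural-network library is usually the path of least resistance for custom objectives.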

I was initially working with LightGBM, but every tutorial I found required supplying the gradient and the Hessian by hand. If there is a way to add a custom loss without this, please correct me.
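For completeness, here is what the hand-derived version would look like as a LightGBM-style custom objective (a sketch under my own assumptions: `O` is the per-row attribute array, and the objective follows the scikit-learn API signature `objective(y_true, y_pred)`). The gradient with respect to $\hat{y}_i$ is $-(O_i y_i - 1)\,O_i$ where the ReLU is active and $0$ elsewhere; note the loss is piecewise linear in $\hat{y}$, so the true Hessian is zero almost everywhere, which LightGBM's Newton step does not tolerate well:

```python
import numpy as np

def make_objective(O):
    """Hypothetical factory: O is the external attribute, one value per row."""
    def objective(y_true, y_pred):
        active = (O * y_pred - 1.0) > 0.0           # where ReLU(O*y_pred - 1) > 0
        grad = -(O * y_true - 1.0) * O * active     # dL/d y_pred, 0 when ReLU inactive
        # True second derivative is 0 (piecewise-linear loss); a small
        # positive constant is a common workaround, not the exact Hessian.
        hess = np.full_like(y_pred, 1.0)
        return grad, hess
    return objective
```

This zero-Hessian issue is one practical reason an autograd-based library can be simpler here than a gradient-boosting one.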

Otherwise I am open to using other libraries: PyTorch/fastai, TensorFlow/Keras, CatBoost, etc. would all be fine.
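In Keras the wrinkle is that a loss function only receives `(y_true, y_pred)`, so the per-row attribute $O$ has to be smuggled in somehow. One common trick (sketched below under my own naming; this is not the only option, `model.add_loss` is another) is to pack $O$ into `y_true` as an extra column:

```python
import tensorflow as tf

def packed_loss(y_true_packed, y_pred):
    """Assumes y_true_packed has shape (batch, 2): column 0 is the label y,
    column 1 is the external attribute O for that row."""
    y = y_true_packed[:, 0:1]
    O = y_true_packed[:, 1:2]
    return -tf.reduce_sum((O * y - 1.0) * tf.nn.relu(O * y_pred - 1.0))
```

You would then pass `loss=packed_loss` to `model.compile(...)` and feed `np.stack([y, O], axis=1)` as the training target.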

Eventually I would like to get my hands dirty with the math, but first I would like to get the model running.

Tags: lightgbm, gradient, loss-function, neural-network, python

Category: Data Science
