Is there a gradient descent-based optimization algorithm that works with non-linear constraints?

I have a function to optimize with approximately 200 parameters and one constraint: the sum of squares of the parameters must equal one.
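
For concreteness, collecting the parameters into a vector $x$ and letting $f$ stand in for the actual objective, the problem is:

$$\min_{x \in \mathbb{R}^{200}} f(x) \quad \text{subject to} \quad \sum_{i=1}^{200} x_i^2 = 1.$$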

This problem can be solved using Lagrange multipliers, and my intuition tells me that methods which do this must be readily available.
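
Spelling that out, the Lagrangian would be

$$\mathcal{L}(x, \lambda) = f(x) + \lambda \left( \sum_{i=1}^{200} x_i^2 - 1 \right),$$

and a stationary point satisfies $\nabla f(x) + 2\lambda x = 0$ together with the constraint.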

If I had a choice, I would prefer an algorithm available in JuMP.jl.
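
To illustrate what I have in mind, here is a minimal JuMP sketch using the Ipopt solver (an interior-point method that uses gradient information and supports nonlinear equality constraints). The quadratic objective here is only a placeholder for the real function:

```julia
using JuMP, Ipopt

n = 200
model = Model(Ipopt.Optimizer)

# Start from a feasible point on the unit sphere.
@variable(model, x[1:n], start = 1 / sqrt(n))

# The constraint: sum of squares of the parameters equals one.
@constraint(model, sum(x[i]^2 for i in 1:n) == 1)

# Placeholder objective -- substitute the actual function to optimize here.
@objective(model, Min, sum((x[i] - i / n)^2 for i in 1:n))

optimize!(model)
x_opt = value.(x)
```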

Tags: gradient-descent, julia

Category: Data Science
