Can XGBoost support vector outputs?
I am interested in fitting data (regression rather than classification) whose individual targets are vectors, using an XGBoost-type model. However, Python's xgboost.XGBRegressor currently only supports scalar targets.
Looking at the original paper defining the algorithm, it seems we could extend their method to vector-valued targets in a fairly direct way: Paper here
Following their notation, if one assumes that $f_t(x_i)$ is a vector in $\mathbb{R}^k$, I think the multi-dimensional analogue of equation (6) would be something like:
$$\tilde{\mathcal{L}}^{(t)}(q) = - \frac{1}{2} \sum_{j=1}^T \left( \left[ \sum_{i \in I_j} \nabla l_i \right]^T \left( \sum_{i \in I_j} H_i + \lambda I \right)^{-1} \left[ \sum_{i \in I_j} \nabla l_i \right] \right) + \gamma T$$
where $l$ is the loss function, with $k \times 1$ gradient $\nabla l_i$ and $k \times k$ Hessian $H_i$ evaluated at data point $i$.
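To make the formula concrete, here is a small NumPy sketch (my own illustration, not taken from the paper or any library) of the per-leaf computation it implies: the optimal leaf vector $w_j^* = -\left(\sum_{i \in I_j} H_i + \lambda I\right)^{-1} \sum_{i \in I_j} \nabla l_i$, the vector analogue of the paper's equation (5), together with that leaf's contribution to the structure score above.

```python
import numpy as np

def optimal_leaf_weight(grads, hessians, lam=1.0):
    """Vector-valued leaf weight for one leaf I_j (illustrative sketch).

    grads:    (n, k) array of per-point gradients for the points in the leaf
    hessians: (n, k, k) array of per-point Hessians
    lam:      L2 regularization strength lambda
    """
    G = grads.sum(axis=0)        # summed gradient, shape (k,)
    H = hessians.sum(axis=0)     # summed Hessian, shape (k, k)
    A = H + lam * np.eye(G.shape[0])
    # w* = -(H + lambda*I)^{-1} G
    w = -np.linalg.solve(A, G)
    # This leaf contributes -score to the objective above (before the gamma*T term):
    # score = (1/2) G^T (H + lambda*I)^{-1} G
    score = 0.5 * G @ np.linalg.solve(A, G)
    return w, score
```

With $H_i = I$ for every point (as for squared-error loss) and $\lambda = 0$, this reduces to the familiar scalar case applied coordinate-wise, which is a useful sanity check on the algebra.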
Does this extension work mathematically? And has anyone implemented such an algorithm anywhere?
Topic: theory, xgboost, optimization
Category: Data Science