What is meant by "distributed" for a gradient boosting library?
I am reading the XGBoost documentation, which states that XGBoost is "an optimized distributed gradient boosting library."
What is meant by "distributed" in this context?
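For context, here is a minimal sketch of the kind of multi-worker training I think the docs are referring to, using XGBoost's Dask integration (`xgboost.dask`). The cluster setup and data here are placeholders I made up for illustration; I am not sure this is exactly what "distributed" means in the documentation, which is what I am asking.

```python
from dask.distributed import Client, LocalCluster
import dask.array as da
import xgboost as xgb

# Start a local "cluster" of worker processes; on real hardware the
# address of a multi-machine Dask scheduler would go here instead.
cluster = LocalCluster(n_workers=2)
client = Client(cluster)

# Partitioned data: each chunk can live on a different worker/machine.
X = da.random.random((100_000, 20), chunks=(10_000, 20))
y = da.random.randint(0, 2, size=(100_000,), chunks=(10_000,))

# DaskDMatrix keeps the data on the workers instead of gathering it locally.
dtrain = xgb.dask.DaskDMatrix(client, X, y)

# Each worker works on its own partition, and the workers synchronize
# so that a single boosted model is produced at the end.
output = xgb.dask.train(
    client,
    {"objective": "binary:logistic", "tree_method": "hist"},
    dtrain,
    num_boost_round=10,
)
booster = output["booster"]  # the single, shared boosted model
```

Is splitting the data and the training work across machines like this what "distributed" refers to, or does it mean something else?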
Have a nice day
Topic: boosting, xgboost, distributed
Category: Data Science