
XG Boost Regression
The XGBoost Regression process builds predictive models using the extreme gradient boosting machine learning method, which accommodates numerous, potentially correlated predictor variables. Computations are performed using the XGBoost Python package, and both linear and tree boosters can be fitted in this process.¹ A complete guide to the XGBoost parameters is available in the XGBoost documentation.
What do I need?
One SAS data set is required: an input data set with one column per predictor variable (feature) and one column for the response variable (target).
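For readers more familiar with Python, the required layout is analogous to the following DataFrame, with one column per feature plus a single response column (the column names here are illustrative, not required):

```python
# Hypothetical sketch of the expected input layout:
# one column per predictor (feature) and one response column (target).
import pandas as pd

data = pd.DataFrame({
    "Feature1": [0.1, 0.4, 0.9],
    "Feature2": [1.2, 0.8, 0.3],
    "Response": [10.0, 12.5, 9.8],
})

features = data.drop(columns=["Response"])  # predictor matrix
target = data["Response"]                   # response vector
print(features.shape, target.shape)
```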
The output generated by this process is summarized in a Tabbed report. Refer to the XG Boost Regression output documentation for detailed descriptions and guides to interpreting your results.

Chen, T., and Guestrin, C. (2016). XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 785-794. doi:10.1145/2939672.2939785