XGBoost is a distributed, efficient gradient boosting algorithm based on decision trees (CART) that can be applied to classification, regression, ranking, and other tasks. Compared with the general GBDT algorithm, XGBoost has the following advantages:

- Its objective-function optimization uses the second derivative of the loss function with respect to the function being learned, while GBDT uses only first-order information.
- It penalizes the weights of the leaf nodes, which amounts to adding a regularization term that helps prevent overfitting.
- It supports column sampling: similar to random forests, it samples features when building each tree, which speeds up training and often improves accuracy.
- Similar to a learning rate, after each tree is learned its weights are shrunk, reducing that tree's influence and leaving more room for later trees to learn.
- Tree construction offers both an exact algorithm and an approximate algorithm; the approximate algorithm buckets each feature using its weighted quantiles.
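To make the first two advantages concrete, here is a minimal sketch (not the actual XGBoost source) of how the optimal leaf weight and leaf score follow from the second-order expansion of the loss: with per-example gradients g_i and Hessians h_i and an L2 penalty lambda on leaf weights, the optimal weight of a leaf is w* = -G / (H + lambda), where G and H are the sums of g_i and h_i over the examples in that leaf. The function names and the toy data below are illustrative assumptions.

```python
# Sketch of XGBoost's second-order leaf optimization (illustrative, not the real source).
# For squared-error loss l(y, p) = (y - p)^2 / 2: g = p - y and h = 1.

def leaf_weight(grads, hess, lam=1.0):
    """Optimal leaf weight w* = -G / (H + lambda)."""
    G, H = sum(grads), sum(hess)
    return -G / (H + lam)

def leaf_score(grads, hess, lam=1.0):
    """A leaf's contribution to the (negated) objective: G^2 / (2 * (H + lambda)).
    Split gain compares this score for left + right children against the parent."""
    G, H = sum(grads), sum(hess)
    return G * G / (2.0 * (H + lam))

# Toy leaf: current predictions are all 0.0, targets are [1, 2, 3], squared error.
preds, ys = [0.0, 0.0, 0.0], [1.0, 2.0, 3.0]
g = [p - y for p, y in zip(preds, ys)]   # first-order info:  [-1, -2, -3]
h = [1.0 for _ in ys]                    # second-order info: all 1 for squared error
w = leaf_weight(g, h, lam=1.0)           # -(-6) / (3 + 1) = 1.5
```

Note how lambda appears in the denominator: a larger penalty shrinks every leaf weight toward zero, which is exactly the regularization effect described above. GBDT, using only g, has no such closed-form second-order weight.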