Basically the idea is: when measuring the goodness of a tree-node split (while learning a regression tree), add a tunable constant penalty whenever the splitting feature has not already been used by any existing tree.
It seems quite simple, and it could be a nice alternative for people who use feature importance to select features.
The technique is described in http://alicezheng.org/papers/gbfs.pdf
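To make the proposal concrete, here is a minimal sketch (plain Python, not any library's actual split-finding code) of the penalized gain: the usual variance-reduction gain of a regression-tree split, minus a tunable constant when the candidate feature has not been used by any previously built tree. The function names and the `penalty` parameter are illustrative, not from the paper.

```python
def sse(values):
    """Sum of squared errors around the mean (regression impurity)."""
    if not values:
        return 0.0
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def penalized_split_gain(y_left, y_right, feature, used_features, penalty):
    """Variance-reduction gain of a split, minus a constant penalty
    if `feature` has not been used by any earlier tree in the ensemble."""
    gain = sse(y_left + y_right) - sse(y_left) - sse(y_right)
    if feature not in used_features:
        gain -= penalty  # tunable constant: discourages introducing new features
    return gain

# Same candidate split, evaluated on an already-used vs. a new feature:
used = {"age"}
g_old = penalized_split_gain([1.0, 1.2], [3.0, 3.1], "age", used, penalty=0.5)
g_new = penalized_split_gain([1.0, 1.2], [3.0, 3.1], "income", used, penalty=0.5)
# g_new trails g_old by exactly the penalty, so a split on an unused
# feature wins only if its raw gain exceeds the best used-feature gain
# by more than the penalty.
```

Tuning `penalty` trades off sparsity against accuracy: a large value makes the booster reuse a small feature subset, which is what makes this usable as a feature-selection method.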