Implement Gradient Boosted Feature Selection #508

Closed

tomtung opened this issue Sep 19, 2015 · 1 comment

tomtung commented Sep 19, 2015

As described in http://alicezheng.org/papers/gbfs.pdf.

The idea seems simple: when measuring the goodness of a tree-node split (while learning a regression tree), add a tunable constant penalty if the feature used for splitting hasn't already been used by any existing tree.

This seems quite simple to implement, and could be a nice alternative for people who currently use feature importance to select features.
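For illustration, here is a minimal sketch of what that penalized split criterion might look like. All names (`base_gain`, `used_features`, `lambda_`) are hypothetical, not XGBoost internals; `base_gain` stands in for whatever split criterion the booster already computes.

```python
# Sketch of GBFS-style split-gain penalization (hypothetical names,
# not XGBoost internals). A split on a feature the ensemble hasn't
# used yet is charged a constant cost lambda_; reusing an already
# selected feature is free.

def penalized_gain(base_gain: float, feature: int,
                   used_features: set[int], lambda_: float) -> float:
    """Discount the split gain if `feature` is new to the ensemble."""
    if feature in used_features:
        return base_gain           # feature already paid for: no penalty
    return base_gain - lambda_     # new feature: subtract the constant cost

# After a split on `feature` is actually chosen, record it so that
# later splits on the same feature incur no penalty:
#   used_features.add(feature)
```

This mirrors the paper's intuition: a feature's selection cost is charged once, so later trees are nudged toward reusing features the ensemble has already picked, keeping the selected feature set small.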


tqchen commented Jan 15, 2016

Let us move roadmap issues to #574 so they can be tracked more easily.

tqchen closed this as completed Jan 15, 2016
lock bot locked as resolved and limited conversation to collaborators Oct 26, 2018