Issues on multiclass classification classifier with xgboost #7688
https://github.com/dmlc/xgboost/blob/master/demo/guide-python/sklearn_examples.py
That means users need to pass the Python arguments as keyword arguments:
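As an illustration of what keyword-only arguments mean, here is a minimal sketch. The function name and defaults below are invented for the example and are not the real XGBoost signature; `@_deprecate_positional_args` enforces the same behavior (arguments must be passed by keyword) on the sklearn wrapper.

```python
# Illustrative sketch of keyword-only arguments, analogous to what
# @_deprecate_positional_args enforces. Not the real XGBoost signature.
def make_classifier(*, objective="multi:softprob", num_class=3):
    # Every argument after the bare * must be passed by keyword.
    return {"objective": objective, "num_class": num_class}

# Keyword call works:
params = make_classifier(objective="multi:softprob", num_class=4)

# Positional call is rejected with a TypeError:
try:
    make_classifier("multi:softprob", 4)
except TypeError:
    pass
```

So the decorator does not remove the multi-class functionality; it only requires that parameters be named explicitly.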
Apologies for the ambiguity. What I was trying to say is that XGBoost won't support multi-target multi-class classification: that's when the data contains multiple output targets, and each target is a classification problem with multiple classes. Multi-class means your data has only one target, which has multiple classes. Multi-label means you have multiple targets, and each target has two classes.
Thank you so much for the quick response and clarification! Two follow-up questions:
1-vs-rest
Correct. The feature is pretty basic at the moment. You can achieve the same result using sklearn meta estimators.
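A minimal sketch of that workaround using scikit-learn's `OneVsRestClassifier` meta estimator. `LogisticRegression` stands in for `XGBClassifier` so the snippet runs without xgboost installed; in practice you would pass `XGBClassifier()` as the base estimator instead. The dataset is synthetic and purely illustrative.

```python
# Sketch: one-vs-rest multi-class classification via a sklearn meta
# estimator. Swap LogisticRegression for xgboost.XGBClassifier in practice.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Tiny synthetic 3-class problem.
X, y = make_classification(
    n_samples=60, n_features=8, n_informative=4,
    n_classes=3, random_state=0,
)

# OneVsRestClassifier fits one binary classifier per class.
clf = OneVsRestClassifier(LogisticRegression(max_iter=200)).fit(X, y)
pred = clf.predict(X)
```

The meta estimator trains one binary model per class, which is the same 1-vs-rest decomposition discussed above.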
Hi @trivialfis
Thank you so much for your help!
@PanZiwei No, you cannot use
How about the parameter? So the imbalanced dataset can only be learned in the
Hi,
I am interested in developing a multi-class classifier with XGBoost, but I am still a newcomer. I would really appreciate your help with the following issues.
It seems that when the number of classes is greater than 2, the wrapper modifies the obj parameter to multi:softmax. But since the code is under @_deprecate_positional_args, does that mean the multi-class classifier is no longer available with current XGBoost?
xgboost/python-package/xgboost/sklearn.py
Lines 1365 to 1369 in e56d177
How can I use the latest stable XGBoost (v1.5.2 by far) to develop the multi-class classifier? Do you think it makes sense to wrap a single-output model with sklearn.multioutput.MultiOutputClassifier, something like
classifier = MultiOutputClassifier(XGBClassifier())
?
You mentioned in Initial support for multi-output regression. #7309 (comment) that "I think for multi-target classification models, xgboost needs a new interface and potentially lots of refactoring. I will focus on regression for now. Thank you for joining the discussion and feel free to test the new feature. ;-)"
Can you be more specific about the term multi-target? Does it mean multi-class classification, multi-label classification, or multi-output classification?
Thank you so much for your help.
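To make the `MultiOutputClassifier` idea from the question concrete, here is a minimal sketch on multi-label data (several binary targets per sample). `LogisticRegression` stands in for `XGBClassifier` so the snippet runs without xgboost installed, and the dataset is synthetic and purely illustrative.

```python
# Sketch: MultiOutputClassifier fits one estimator per output column,
# turning a multi-label problem into several binary ones. In practice the
# base estimator could be xgboost.XGBClassifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.RandomState(0)
X = rng.normal(size=(50, 6))
# Three binary labels, each derived from one feature for illustration.
Y = (X[:, :3] > 0).astype(int)

clf = MultiOutputClassifier(LogisticRegression()).fit(X, Y)
pred = clf.predict(X)  # shape (n_samples, n_outputs)
```

Note this covers the multi-label / multi-output case; a plain multi-class problem (one target, many classes) does not need the wrapper at all.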