Tested the xgb.cv() method against various scoring metrics on both CPU and GPU for classification models. Using the xgboost 1.4.0 release from pip produced unexpected results with metric='auc'. Other metrics such as ['aucpr', 'logloss', 'map'] appear to be consistent.
Running the same code under xgboost 1.3.3 produced consistent CPU/GPU results with metric='auc', as expected.
For the binary case (this issue), the result differs between runs. I narrowed it down to the argsort: if the radix sort is stable, then my guess is that a floating-point computation somewhere introduces the instability.
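Both suspects can be illustrated in isolation. This is a hypothetical numpy sketch, not XGBoost's actual GPU kernel: an unstable sort can permute tied prediction scores differently between runs, and even with a stable sort, floating-point addition is not associative, so accumulating the same terms in a different order can change the result:

```python
import numpy as np

# 1) Tied prediction scores: only a *stable* sort guarantees the same
#    permutation on every run; numpy's default quicksort does not.
scores = np.float32([0.5, 0.5, 0.5, 0.3])
order = np.argsort(scores, kind="stable")  # deterministic among ties

# 2) Floating-point addition is not associative: summing identical terms
#    in a different order can give a different float32 result.
vals = np.float32([1e8, 1.0, -1e8, 1.0])

def accumulate(idx):
    total = np.float32(0.0)
    for i in idx:
        total = np.float32(total + vals[i])
    return total

a = accumulate([0, 1, 2, 3])  # 1e8 + 1 loses the 1 in float32
b = accumulate([0, 2, 1, 3])  # cancel 1e8 first, then add both 1s
print(a, b)  # 1.0 2.0
```

If the reduction order over tied entries varies between runs, this non-associativity alone is enough to produce run-to-run AUC differences.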
Attached is a notebook and sample data to reproduce the issue:
xgb_cv_v1.4.0_auc_metric_on_gpu_broken.zip