
Add use_rmm flag to global configuration #6656

Merged: 12 commits merged into dmlc:master on Mar 9, 2021
Conversation

hcho3 (Collaborator) commented Jan 28, 2021

Closes #6297. The user should explicitly inform XGBoost when RMM is in use; in that case, XGBoost should not allocate from the CUB allocator.
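The `use_rmm` flag is exposed through XGBoost's global configuration API, which allocator code can consult to decide whether to skip the CUB pool. Below is a minimal, self-contained sketch of that global-config pattern; the function and variable names mirror the public API but the internals are illustrative, not XGBoost's actual implementation.

```python
from contextlib import contextmanager

# Process-wide configuration store (illustrative; XGBoost keeps this in C++).
_global_config = {"use_rmm": False}

def set_config(**kwargs):
    """Set global configuration flags, rejecting unknown keys."""
    for key, value in kwargs.items():
        if key not in _global_config:
            raise ValueError(f"Unknown config key: {key}")
        _global_config[key] = value

def get_config():
    """Return a snapshot of the current global configuration."""
    return dict(_global_config)

@contextmanager
def config_context(**kwargs):
    """Temporarily override configuration, restoring it on exit."""
    saved = dict(_global_config)
    set_config(**kwargs)
    try:
        yield
    finally:
        _global_config.update(saved)
```

With the real library, this PR's flag is set the same way, e.g. `xgboost.set_config(use_rmm=True)` globally or `with xgboost.config_context(use_rmm=True): ...` for a scoped override; when the flag is on, XGBoost avoids drawing from its CUB allocator so that RMM owns device memory.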

@hcho3 hcho3 mentioned this pull request Jan 28, 2021
@@ -8,7 +8,7 @@ dependencies:
   - pyyaml
   - cpplint
   - pylint
-  - numpy
+  - numpy=1.19.4
hcho3 (Collaborator, Author):
This is a workaround, as the latest shap package appears to break with NumPy 1.19.5:
https://xgboost-ci.net/blue/organizations/jenkins/xgboost/detail/PR-6656/4/pipeline/221

        try:
>           from .. import _cext
E   ImportError: numpy.core.multiarray failed to import

hcho3 (Collaborator, Author):

@trivialfis Have you run into this issue? What's your take on it?

Member:

Never seen it before.

> What's your take on it?

@RAMitchell worked on the C extension before, so he may have better insight.

hcho3 (Collaborator, Author):

Using shap from conda-forge also fixes the issue, but unfortunately conda-forge has an outdated version of shap.

codecov-io commented Mar 5, 2021

Codecov Report

Merging #6656 (f850b12) into master (5ae7f99) will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff           @@
##           master    #6656   +/-   ##
=======================================
  Coverage   81.83%   81.83%           
=======================================
  Files          13       13           
  Lines        3809     3809           
=======================================
  Hits         3117     3117           
  Misses        692      692           


Resolved review threads: demo/rmm_plugin/rmm_mgpu_with_dask.py; doc/parameter.rst (outdated)
hcho3 merged commit 366f3cb into dmlc:master on Mar 9, 2021
hcho3 deleted the rmm_global_config branch on March 9, 2021 22:53
Linked issue closed by this pull request: Add an explicit option to use RMM

3 participants