
Commit

[doc] Fix typo. [skip ci] (#7311)
trivialfis authored Oct 12, 2021
1 parent 0bd8f21 commit 406c70b
Showing 1 changed file with 5 additions and 5 deletions.
10 changes: 5 additions & 5 deletions doc/tutorials/categorical.rst
@@ -50,7 +50,7 @@ can plot the model and calculate the global feature importance:
# Get a graph
graph = xgb.to_graphviz(clf, num_trees=1)
# Or get a matplotlib axis
-ax = xgb.plot_tree(reg, num_trees=1)
+ax = xgb.plot_tree(clf, num_trees=1)
# Get feature importances
clf.feature_importances_
@@ -60,8 +60,8 @@ idea is create dataframe with category feature type, and tell XGBoost to use ``g
with parameter ``enable_categorical``. See `this demo
<https://github.com/dmlc/xgboost/blob/master/demo/guide-python/categorical.py>`_ for a
worked example using categorical data with ``scikit-learn`` interface. For using it with
-the Kaggle tutorial dataset, see `<this demo
-https://github.com/dmlc/xgboost/blob/master/demo/guide-python/cat_in_the_dat.py>`_
+the Kaggle tutorial dataset, see `this demo
+<https://github.com/dmlc/xgboost/blob/master/demo/guide-python/cat_in_the_dat.py>`_
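
The hunk above documents enabling XGBoost's experimental categorical support. As a minimal sketch (the column names and data are illustrative, not from the commit), the key step is giving the DataFrame columns pandas ``category`` dtype before passing ``enable_categorical=True``:

```python
# Illustrative sketch (not part of this commit): preparing data for
# XGBoost's experimental categorical support via the scikit-learn
# interface. The fit() call is commented out because, as of XGBoost 1.5,
# categorical splits required tree_method="gpu_hist" (GPU training).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
X = pd.DataFrame({
    # The "category" dtype is what marks a column as categorical for XGBoost.
    "color": pd.Categorical(rng.choice(["red", "green", "blue"], size=100)),
    "size": rng.normal(size=100),
})
y = rng.integers(0, 2, size=100)

assert X["color"].dtype.name == "category"

# import xgboost as xgb
# clf = xgb.XGBClassifier(tree_method="gpu_hist", enable_categorical=True)
# clf.fit(X, y)
```

The linked demo scripts show the full training flow; this sketch only covers the data-preparation side, which runs without a GPU.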


**********************
@@ -114,5 +114,5 @@ Next Steps
**********

As of XGBoost 1.5, the feature is highly experimental and have limited features like CPU
-training is not yet supported. Please see `<this issue>
-https://github.com/dmlc/xgboost/issues/6503`_ for progress.
+training is not yet supported. Please see `this issue
+<https://github.com/dmlc/xgboost/issues/6503>`_ for progress.
