From 406c70ba0e831babce4855d48793df6924e21cbf Mon Sep 17 00:00:00 2001
From: Jiaming Yuan
Date: Tue, 12 Oct 2021 19:10:18 +0800
Subject: [PATCH] [doc] Fix typo. [skip ci] (#7311)

---
 doc/tutorials/categorical.rst | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/doc/tutorials/categorical.rst b/doc/tutorials/categorical.rst
index 6ee724d45b6a..9598512e34d0 100644
--- a/doc/tutorials/categorical.rst
+++ b/doc/tutorials/categorical.rst
@@ -50,7 +50,7 @@ can plot the model and calculate the global feature importance:
   # Get a graph
   graph = xgb.to_graphviz(clf, num_trees=1)
   # Or get a matplotlib axis
-  ax = xgb.plot_tree(reg, num_trees=1)
+  ax = xgb.plot_tree(clf, num_trees=1)
   # Get feature importances
   clf.feature_importances_
 
@@ -60,8 +60,8 @@ idea is create dataframe with category feature type, and tell XGBoost to use ``g
 with parameter ``enable_categorical``.  See `this demo
 `_ for a
 worked example using categorical data with ``scikit-learn`` interface.  For using it with
-the Kaggle tutorial dataset, see `
-`_
+the Kaggle tutorial dataset, see `this demo
+`_
 
 
 **********************
@@ -114,5 +114,5 @@ Next Steps
 **********
 
 As of XGBoost 1.5, the feature is highly experimental and have limited features like CPU
-training is not yet supported.  Please see `
-https://github.com/dmlc/xgboost/issues/6503`_ for progress.
+training is not yet supported.  Please see `this issue
+<https://github.com/dmlc/xgboost/issues/6503>`_ for progress.
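
For context, a minimal sketch of the usage the corrected documentation snippet describes: training on a pandas ``category`` column through the scikit-learn interface with ``enable_categorical``, then plotting the fitted classifier ``clf`` (not the undefined ``reg`` that this patch removes). The toy data and the ``tree_method="gpu_hist"`` setting are illustrative assumptions, not part of the patch; as the patched text notes, in XGBoost 1.5 the feature required GPU training.

.. code-block:: python

  # Illustrative sketch only; assumes XGBoost 1.5 with a CUDA device,
  # since CPU training for categorical data was not yet supported,
  # plus matplotlib and graphviz for the plotting call.
  import pandas as pd
  import xgboost as xgb

  # A toy frame with one categorical feature; the "category" dtype is what
  # lets XGBoost detect the feature type when enable_categorical=True.
  X = pd.DataFrame({
      "cat": pd.Series(["a", "b", "c", "a"] * 25, dtype="category"),
      "num": range(100),
  })
  y = [0, 1, 0, 1] * 25

  clf = xgb.XGBClassifier(tree_method="gpu_hist", enable_categorical=True)
  clf.fit(X, y)

  # The call this patch fixes: plot the fitted classifier, clf, not reg.
  ax = xgb.plot_tree(clf, num_trees=1)
  # Global feature importances, as in the surrounding doc snippet.
  clf.feature_importances_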