diff --git a/doc/tutorials/categorical.rst b/doc/tutorials/categorical.rst
index 6ee724d45b6a..9598512e34d0 100644
--- a/doc/tutorials/categorical.rst
+++ b/doc/tutorials/categorical.rst
@@ -50,7 +50,7 @@ can plot the model and calculate the global feature importance:
 
   # Get a graph
   graph = xgb.to_graphviz(clf, num_trees=1)
   # Or get a matplotlib axis
-  ax = xgb.plot_tree(reg, num_trees=1)
+  ax = xgb.plot_tree(clf, num_trees=1)
   # Get feature importances
   clf.feature_importances_
@@ -60,8 +60,8 @@ idea is create dataframe with category feature type, and tell XGBoost to use ``g
 with parameter ``enable_categorical``. See `this demo
 `_
 for a worked example using categorical data with ``scikit-learn`` interface. For using it with
-the Kaggle tutorial dataset, see `
-`_
+the Kaggle tutorial dataset, see `this demo
+`_
 
 
 **********************
@@ -114,5 +114,5 @@ Next Steps
 **********
 
 As of XGBoost 1.5, the feature is highly experimental and have limited features like CPU
-training is not yet supported. Please see `
-https://github.com/dmlc/xgboost/issues/6503`_ for progress.
+training is not yet supported. Please see `this issue
+<https://github.com/dmlc/xgboost/issues/6503>`_ for progress.
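
For readers following along, here is a minimal end-to-end sketch of the snippet the first hunk corrects. The toy dataframe, the ``rng`` seed, and the hyperparameters are illustrative assumptions, not part of the tutorial; only ``clf``, ``enable_categorical``, ``xgb.to_graphviz``, and ``xgb.plot_tree`` come from the patched document:

.. code-block:: python

    import numpy as np
    import pandas as pd
    import xgboost as xgb

    # Hypothetical toy data: one categorical column (pandas "category" dtype)
    # and one numeric column.
    rng = np.random.default_rng(0)
    X = pd.DataFrame(
        {
            "cat": pd.Series(rng.choice(["a", "b", "c"], size=100), dtype="category"),
            "num": rng.normal(size=100),
        }
    )
    y = rng.integers(0, 2, size=100)

    # As of XGBoost 1.5, categorical support requires the GPU tree method;
    # this will not run without a CUDA-capable device (see the linked issue).
    clf = xgb.XGBClassifier(tree_method="gpu_hist", enable_categorical=True)
    clf.fit(X, y)

    # Get a graph (a graphviz Source object).
    graph = xgb.to_graphviz(clf, num_trees=1)
    # Or get a matplotlib axis; note `clf`, not `reg`, as fixed above.
    ax = xgb.plot_tree(clf, num_trees=1)
    # Get feature importances.
    print(clf.feature_importances_)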
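
The second hunk's advice (a ``category``-typed dataframe plus ``enable_categorical``) also applies outside the ``scikit-learn`` wrapper. A sketch under the same assumptions (``X`` and ``y`` as defined above), using the native ``xgboost.DMatrix`` and ``xgboost.train`` interface, which accept the flag as well:

.. code-block:: python

    import xgboost as xgb

    # Tell the DMatrix to accept pandas "category" columns directly.
    dtrain = xgb.DMatrix(X, label=y, enable_categorical=True)

    # gpu_hist is required for categorical splits in 1.5; CPU training is
    # not yet supported, per the Next Steps section.
    booster = xgb.train(
        {"tree_method": "gpu_hist", "objective": "binary:logistic"},
        dtrain,
        num_boost_round=10,
    )

    # The returned Booster can be visualized the same way.
    graph = xgb.to_graphviz(booster, num_trees=1)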