# Decision Tree
## Description

This class implements a basic Decision Tree classifier. Decision Trees are conceptually simple classifiers that perform well even on complex classification tasks.
Decision Trees partition the feature space into a set of rectangular regions, classifying a new datum by finding which region it belongs to.
The Decision Tree algorithm is part of the GRT classification modules.
## Advantages

The Decision Tree algorithm is a good choice for the classification of static postures and other non-temporal pattern recognition tasks. The main advantage of a Decision Tree is that the model is particularly fast at classifying new input samples. The GRT Decision Tree algorithm also enables you to define your own custom Decision Tree Nodes (the logic/model used to decide how the tree should split the data at each node), giving you a lot of flexibility in the classification tasks the Decision Tree can be applied to.
## Disadvantages

The main limitation of the Decision Tree algorithm is that very large models will frequently overfit the training data. To mitigate overfitting, experiment with the maximum depth of the Decision Tree, or use the GRT Random Forest algorithm to combine multiple Decision Trees into a single ensemble classifier.
## Training Data Format

You should use the ClassificationData data structure to train the Decision Tree classifier.
## Example Code

Decision Tree Example