Decision Tree Classifier Python Code Example

In this post, you will learn how to train a decision tree classifier machine learning model using Python. The following topics will be covered:

  • What is a decision tree?
  • Decision tree Python code sample

What Is a Decision Tree?

Simply speaking, the decision tree algorithm breaks the data points into decision nodes, resulting in a tree structure. Each decision node represents a question based on which the data is split further into two or more child nodes. The tree keeps growing until the data points at a child node are pure (all of them belong to one class). The criterion used to pick the most informative questions is information gain; a small worked sketch of this computation appears after the figure. The diagram below represents a sample decision tree.

Fig 1. Sample Decision tree
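To make the information gain criterion concrete, here is a small illustrative sketch (not from the original post) that computes the entropy of a toy set of class labels and the information gain of one candidate split:

# Illustrative sketch: entropy and information gain for a toy binary split.
import numpy as np

def entropy(labels):
    # Entropy = -sum(p * log2(p)) over the class proportions
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

parent = np.array([0, 0, 0, 0, 1, 1, 1, 1])   # mixed parent node
left = np.array([0, 0, 0, 1])                  # left child after a split
right = np.array([0, 1, 1, 1])                 # right child after a split

# Information gain = parent entropy - weighted average of child entropies
gain = entropy(parent) \
       - (len(left) / len(parent)) * entropy(left) \
       - (len(right) / len(parent)) * entropy(right)
print('Information gain of this split: %.3f' % gain)

A split with higher information gain produces purer child nodes; at each node the algorithm greedily picks the question with the highest gain.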

Training a machine learning model with the decision tree classification algorithm amounts to finding the decision boundaries, i.e. the sequence of splits the tree uses to separate the classes.

Decision trees build complex decision boundaries by dividing the feature space into rectangles. Here is a sample of how the decision boundaries look after a model trained with the decision tree algorithm classifies the Sklearn IRIS data points. The feature space consists of two features, namely petal length and petal width. The code sample is given further below.

Fig 2. Decision boundaries created by a decision tree classifier

Decision Tree Python Code Sample

Here is the code sample which can be used to train a decision tree classifier.
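The original listing is not reproduced here, so the following is a minimal sketch of what such training code could look like. It assumes the Sklearn IRIS dataset with petal length and petal width as the two features; names such as clf_tree and the max_depth=4 setting are illustrative choices, not values from the original post.

# Minimal sketch: train a decision tree classifier on the Sklearn IRIS dataset
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Load the IRIS dataset; columns 2 and 3 are petal length and petal width
iris = datasets.load_iris()
X = iris.data[:, [2, 3]]
y = iris.target

# Split the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# Train the decision tree classifier; max_depth=4 is an illustrative choice
clf_tree = DecisionTreeClassifier(criterion='gini', max_depth=4, random_state=1)
clf_tree.fit(X_train, y_train)

# Evaluate accuracy on the held-out test set
print('Test accuracy: %.3f' % clf_tree.score(X_test, y_test))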

Visualizing Decision Tree Model Decision Boundaries

Here is the code which can be used to create the decision tree boundaries shown in Fig 2. Note that the mlxtend package is used for plotting the decision boundaries.
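Since the original listing is not shown, here is a minimal sketch assuming the clf_tree model and the X_train / y_train arrays from the training sketch above; plot_decision_regions comes from the mlxtend.plotting module.

# Minimal sketch: plot the rectangular decision regions learned by the tree
import matplotlib.pyplot as plt
from mlxtend.plotting import plot_decision_regions

plot_decision_regions(X_train, y_train, clf=clf_tree)
plt.xlabel('petal length [cm]')
plt.ylabel('petal width [cm]')
plt.title('Decision boundaries of the decision tree classifier')
plt.legend(loc='upper left')
plt.show()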

Visualizing Decision Tree in the Tree Structure

Here is the code which can be used to visualize the tree structure created as part of training the model. The plot_tree function from the sklearn.tree module is used to draw the tree:
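Below is a minimal sketch of such a call, assuming the clf_tree model from the training sketch above; the feature and class names are those of the IRIS dataset.

# Minimal sketch: draw the trained decision tree with sklearn's plot_tree
import matplotlib.pyplot as plt
from sklearn import tree

# A larger figure size keeps the node text readable
fig, ax = plt.subplots(figsize=(10, 10))
tree.plot_tree(clf_tree,
               feature_names=['petal length (cm)', 'petal width (cm)'],
               class_names=['setosa', 'versicolor', 'virginica'],
               filled=True,
               ax=ax)
plt.show()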

Here is how the tree looks after it is drawn using the above code. Note the use of plt.subplots(figsize=(10, 10)) to create a larger diagram of the tree; otherwise, the rendered tree is very small.

Fig 3. Decision tree visualization

In a follow-up article, you will learn how to draw nicer visualizations of a decision tree using a dedicated package. You will also learn some key concepts related to the decision tree classifier, such as information gain (entropy, Gini impurity, etc.).
