Decision Trees, Random Forests, AdaBoost & XGBoost in Python
In this free online course, learn about the methods involved in decision trees and ensemble techniques in Python. Publisher: Start-Tech Academy
Do you want to become an expert at using decision trees to solve complex business problems? This course covers the steps to take when solving business problems with decision trees in Python. Business analysts and data scientists use tree-based decision models widely because they are easy to interpret and lead to a clear decision point, which makes them well suited to presenting business concepts to people who are not enthusiastic about numbers and complex mathematical models. A decision tree is a business decision-support tool that uses a tree-like model of decisions and their possible consequences, including event outcomes, resource costs and utilities. It is a way of displaying an algorithm that contains only conditional control statements. The course also discusses the essential libraries in Python and the process of building a machine learning model.
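The idea that a decision tree is "an algorithm that contains only conditional control statements" can be made concrete with a short sketch. The scenario, thresholds and function name below are purely illustrative, not taken from the course:

```python
# A decision tree is ultimately nested conditional control statements.
# Hypothetical example: deciding whether to approve a small business loan.
# All thresholds are invented for illustration.

def approve_loan(credit_score, annual_revenue, years_in_business):
    """Toy decision tree: each `if` is an internal node, each return a leaf."""
    if credit_score >= 680:
        if annual_revenue >= 100_000:
            return "approve"
        return "approve" if years_in_business >= 3 else "review"
    return "review" if annual_revenue >= 250_000 else "decline"

print(approve_loan(720, 150_000, 1))  # prints: approve
```

Tree-learning algorithms effectively discover these thresholds and branch orderings from data rather than having an analyst hand-code them.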
First, you will be introduced to the three ways of opening a Jupyter Notebook: the Anaconda Navigator, the Anaconda Prompt and the Command Prompt. You will then learn that regression trees suit continuous quantitative target variables, while classification trees suit discrete categorical target variables. You will work through the steps of building a regression tree and gain the ability to manipulate decision trees and interpret their results more effectively than someone who knows only the technical mechanics of building decision trees in Python. You will also discover that the recursive binary splitting top-down approach is a greedy approach: at each step of the tree-building process, the best split is made at that particular step rather than looking ahead and picking a split that would lead to a better tree later. The course then discusses missing-value treatment and dummy-variable creation in Python.
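The regression-versus-classification distinction and dummy-variable creation can be sketched in a few lines. This assumes scikit-learn and pandas (not named in this paragraph, though the course is Python-based), and the data here is synthetic:

```python
# Sketch, assuming scikit-learn and pandas; the housing data is synthetic.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, DecisionTreeClassifier

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "sqft": rng.uniform(500, 3000, 200),
    "city": rng.choice(["Austin", "Boston"], 200),   # qualitative predictor
})
df["price"] = 100 * df["sqft"] + rng.normal(0, 5000, 200)       # continuous target
df["expensive"] = (df["price"] > df["price"].median()).astype(int)  # discrete target

# Dummy-variable creation for the categorical column.
X = pd.get_dummies(df[["sqft", "city"]], drop_first=True)

# Regression tree: continuous quantitative target.
reg = DecisionTreeRegressor(max_depth=3).fit(X, df["price"])
# Classification tree: discrete categorical target.
clf = DecisionTreeClassifier(max_depth=3).fit(X, df["expensive"])
print(reg.predict(X[:1]), clf.predict(X[:1]))
```

The same feature matrix feeds both models; only the target type changes which kind of tree is appropriate.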
Next, you will learn that when running the code in Python, you need to specify stopping criteria for the decision tree to control its size and avoid the overfitting problem. The three ways of controlling a decision tree's growth are specifying the minimum observations required at an internal node, setting the minimum observations required at a leaf node, and limiting the tree's maximum depth. You will gain insight into the three main prediction models based on decision trees: bagging, random forests and boosting. You will then see that decision trees mirror human decision-making more closely than other regression and classification approaches, and that they can easily handle qualitative predictors without the need to create dummy variables. Finally, you will study how to evaluate a model's performance in Python, plot a decision tree, analyse data and apply ensemble techniques. This course will be of interest to data scientists, executives and students interested in learning about decision trees. Why wait? Start this course today and become a decision tree and problem-solving expert.
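In scikit-learn, the three growth controls and the three ensemble families described above map onto named parameters and estimator classes. A minimal sketch, using a synthetic dataset rather than any dataset from the course:

```python
# Sketch, assuming scikit-learn; the dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# The three stopping criteria that control tree growth:
tree = DecisionTreeClassifier(
    min_samples_split=20,  # minimum observations at an internal node
    min_samples_leaf=10,   # minimum observations at a leaf node
    max_depth=4,           # maximum depth of the tree
).fit(X, y)

# The three tree-based ensembles: bagging, random forest, boosting.
bag = BaggingClassifier(n_estimators=50, random_state=0).fit(X, y)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
boost = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)
print(bag.score(X, y), rf.score(X, y), boost.score(X, y))
```

`BaggingClassifier` bags decision trees by default, and XGBoost (from the course title) is a separate library implementing gradient boosting; `AdaBoostClassifier` stands in here for the boosting family.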
Machine Learning and Decision Trees
Machine Learning and Decision Trees - Learning Outcomes
Setting Up Python
Diverse Libraries in Python
Simple Decision Trees
Creation and Pruning of Decision Trees
Simple Classification Tree
Machine Learning and Decision Trees - Lesson Summary
Ensemble Techniques and Data Analysis
Ensemble Techniques and Data Analysis - Learning Outcomes
Bagging and Random Forest Techniques
Boosting Ensemble Technique
Outlier Treatment in Python
Ensemble Techniques and Data Analysis - Lesson Summary
Upon successful completion of this course, you should be able to:
- Describe the different types of string functions in Python.
- Outline the components of a comprehensive data dictionary.
- Explain the different types of decision trees.
- Discuss the methods of controlling the growth of decision trees.
- List the steps in the decision tree building process.
- Explain the three types of ensemble methods used in decision trees.
- Discuss the steps in the data exploration process.
- Explain the three methods of treating outliers.
- Describe the measures of central tendency.
All Alison courses are free to enrol, study and complete. To successfully complete this Certificate course and become an Alison Graduate, you need to achieve 80% or higher in each course assessment. Once you have completed this Certificate course, you have the option to acquire an official Certificate, which is a great way to share your achievement with the world. Your Alison Certificate is:
Ideal for sharing with potential employers - include it in your CV, professional social media profiles and job applications
An indication of your commitment to continuously learn, upskill and achieve high results
An incentive for you to continue empowering yourself through lifelong learning
Alison offers 3 types of Certificates for completed Certificate courses:
Digital Certificate - a downloadable Certificate in PDF format, immediately available to you when you complete your purchase
Certificate - a physical version of your officially branded and security-marked Certificate, posted to you with FREE shipping
Framed Certificate - a physical version of your officially branded and security-marked Certificate in a stylish frame, posted to you with FREE shipping
All Certificates are available to purchase through the Alison Shop. For more information on purchasing Alison Certificates, please visit our FAQs. If you decide not to purchase your Alison Certificate, you can still demonstrate your achievement by sharing your Learner Record or Learner Achievement Verification, both of which are accessible from your Dashboard. For more details on our Certificate pricing, please visit our Pricing Page.