Data Science: Supervised Machine Learning in Python
Introduction and Review
Introduction and Outline (9:33)
Special Offer! Get the VIP version of this course (1:14)
Review of Important Concepts (3:27)
Where to get the Code and Data (2:09)
How to Succeed in this Course (3:13)
K-Nearest Neighbor
K-Nearest Neighbor Concepts (5:02)
KNN in Code with MNIST (7:41)
When KNN Can Fail (3:49)
KNN for the XOR Problem (2:05)
KNN for the Donut Problem (2:36)
Naive Bayes and Bayes Classifiers
Naive Bayes (9:00)
Naive Bayes Handwritten Example (3:28)
Naive Bayes in Code with MNIST (5:56)
Non-Naive Bayes (4:04)
Bayes Classifier in Code with MNIST (2:03)
Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA) (6:07)
Generative vs Discriminative Models (2:47)
Decision Trees
Decision Tree Basics (4:58)
Information Entropy (3:58)
Maximizing Information Gain (7:58)
Choosing the Best Split (4:02)
Decision Tree in Code (13:10)
Perceptrons
Perceptron Concepts (7:07)
Perceptron in Code (5:26)
Perceptron for MNIST and XOR (3:16)
Perceptron Loss Function (4:01)
Practical Machine Learning
Hyperparameters and Cross-Validation (4:15)
Feature Extraction and Feature Selection (3:54)
Comparison to Deep Learning (4:40)
Multiclass Classification (3:20)
Sci-Kit Learn (9:02)
Regression with Sci-Kit Learn is Easy (5:50)
Building a Machine Learning Web Service
Building a Machine Learning Web Service Concepts (4:11)
Building a Machine Learning Web Service Code (6:12)
Conclusion
What’s Next? Support Vector Machines and Ensemble Methods (e.g. Random Forest) (2:50)
Appendix
How to install Numpy, Scipy, Matplotlib, and Sci-Kit Learn (17:22)
How to Code by Yourself (part 1) (15:54)
How to Code by Yourself (part 2) (9:23)
How to Succeed in this Course (Long Version) (10:24)
BONUS: Where to get discount coupons and FREE deep learning material (5:31)