Learning Neural Networks with TensorFlow
The Dataset Driven Approach to Building Neural Networks with TensorFlow
The Course Overview (4:26)
Solving Public Datasets (2:02)
Why We Use Docker and Installation Instructions (2:52)
Our Code, in a Jupyter Notebook (5:07)
Understanding TensorFlow (14:05)
The Iris Dataset – Your First Neural Network
The Iris Dataset (6:05)
The Human Brain and How to Formalize It (11:46)
Backpropagation (12:04)
Overfitting - Why We Split Our Train and Test Data (9:34)
Predicting the Ground Energy State of Molecules
Ground State Energies of 16,242 Molecules (7:31)
First Approach - Easy Layer Building (10:17)
Preprocessing Data (10:03)
Understanding the Activation Function (10:21)
The Importance of Hyperparameters (8:25)
Recognizing Written Digits with the MNIST Dataset
Images of Written Digits (6:22)
Dense Layer Approach (6:54)
Convolution and Pooling Layers (11:35)
Convolution and Pooling Layers (Continued) (7:26)
From Activations to Probabilities – the Softmax Function (4:55)
Optimization and Loss Functions (10:26)
Analyzing Celebrity Faces
Large-Scale CelebFaces Attributes (CelebA) Dataset (8:10)
Building an Input Pipeline in TensorFlow (11:20)
Building a Convolutional Neural Network (9:01)
Batch Normalization (7:42)
Understanding What Your Network Learned – Visualizing Activations (15:57)