Learning Neural Networks with TensorFlow
The Dataset-Driven Approach to Building Neural Networks with TensorFlow
The Course Overview
Solving Public Datasets
Why We Use Docker and Installation Instructions
Our Code, in a Jupyter Notebook
Understanding TensorFlow
The Iris Dataset – Your First Neural Network
The Iris Dataset
The Human Brain and How to Formalize It
Backpropagation
Overfitting – Why We Split Our Train and Test Data
Predicting the Ground Energy State of Molecules
Ground State Energies of 16,242 Molecules
First Approach – Easy Layer Building
Preprocessing Data
Understanding the Activation Function
The Importance of Hyperparameters
Recognizing Written Digits with the MNIST Dataset
Images of Written Digits
Dense Layer Approach
Convolution and Pooling Layers
Convolution and Pooling Layers (Continued)
From Activations to Probabilities – the Softmax Function
Optimization and Loss Functions
Analyzing Celebrity Faces
Large-Scale CelebFaces Attributes (CelebA) Dataset
Building an Input Pipeline in TensorFlow
Building a Convolutional Neural Network
Batch Normalization
Understanding What Your Network Learned – Visualizing Activations