Deep Learning with TensorFlow 2.0
Introduction
Welcome to Machine Learning (0:30)
Welcome to the Course
What does the course cover? (3:07)
Neural networks intro
Introduction to neural networks (4:09)
Training the model (2:54)
Types of machine learning (3:43)
The linear model (3:08)
The linear model. Multiple inputs (2:25)
The linear model. Multiple inputs and multiple outputs (4:25)
Graphical representation (1:47)
The objective function (1:27)
L2-norm loss (2:04)
Cross-entropy loss (3:55)
One-parameter gradient descent (6:33)
N-parameter gradient descent (6:08)
Download the Course Notes
Setting up the environment
Setting up the environment - Do not skip, please! (0:50)
Why Python and why Jupyter (4:53)
Installing Anaconda (3:03)
Jupyter Dashboard - Part 1 (2:27)
Jupyter Dashboard - Part 2 (5:14)
Installing TensorFlow 2.0 (5:02)
Download Shortcuts for Jupyter
Installing packages - exercise
Installing packages - solution
Minimal example
Outline (3:06)
Generating the data (optional) (4:58)
Initializing the variables (3:25)
Training the model (8:15)
Exercises
Introduction to TensorFlow 2
TensorFlow outline (3:28)
TensorFlow 2 intro (2:32)
A note on coding in TensorFlow (0:58)
Types of file formats in TensorFlow and data handling (2:34)
Model layout - inputs, outputs, targets, weights, biases, optimizer and loss (5:48)
Interpreting the result and extracting the weights and bias (4:09)
Customizing your model (2:51)
Exercises
Deep nets overview
The layer (1:52)
What is a deep net? (2:18)
Really understand deep nets (4:58)
Why do we need non-linearities? (2:59)
Activation functions (3:37)
Softmax activation (3:23)
Backpropagation (3:12)
Backpropagation - intuition (3:02)
Download the Course Notes
Backpropagation Mathematics (optional)
Backpropagation mathematics
Overfitting
Underfitting and overfitting (3:51)
Underfitting and overfitting. A classification example (1:52)
Train vs validation (3:22)
Train vs validation vs test (2:30)
N-fold cross validation (3:07)
Early stopping - motivation and types (4:54)
Initialization
Initialization (2:32)
Types of simple initializations (2:47)
Xavier's initialization (2:45)
Optimizers
SGD&Batching (3:24)
Local minima pitfalls (2:02)
Momentum (2:30)
Learning rate schedules (4:25)
Learning rate schedules. A picture (1:32)
Adaptive learning rate schedules (4:08)
Adaptive moment estimation (2:39)
Preprocessing
Preprocessing (2:51)
Basic preprocessing (1:17)
Standardization (4:31)
Dealing with categorical data (2:15)
One-hot vs binary (3:39)
Deeper example
The dataset (2:25)
How to tackle the MNIST dataset (2:44)
Importing the relevant libraries and loading the data (2:11)
Preprocess the data - create a validation dataset and scale the data (4:43)
Preprocess the data - scale the test data
Preprocess the data - shuffle and batch the data (6:30)
Preprocess the data - shuffle and batch the data
Outline the model (4:54)
Select the loss and the optimizer (2:05)
Learning (5:38)
MNIST - Exercises
MNIST - Solutions
Testing the model (3:56)
Business case
Exploring the dataset and identifying predictors (7:54)
Outlining the business case solution (1:31)
Balancing the dataset (3:39)
Preprocessing the data (11:32)
Preprocessing exercise
Load the preprocessed data (3:23)
Load the preprocessed data - Exercise
Learning and interpreting the result (4:15)
Setting an early stopping mechanism (5:01)
Setting an early stopping mechanism - Exercise
Testing the model (1:23)
Final exercise
Conclusion
Summary (3:41)
What's more out there (1:47)
An overview of CNNs (4:55)
An overview of RNNs (2:50)
Non-NN approaches (3:52)