Neural Network Mathematics: Exploring the Math behind Gradient Descent

Because neural networks comprise thousands of neurons and interconnections, it is safe to assume that training one involves millions of computations. This is where a general-purpose optimization algorithm called gradient descent comes in. Use this course to gain an intuitive, visual understanding of how gradient descent and the gradient vector work. As you advance, examine three neural network activation functions (ReLU, sigmoid, and hyperbolic tangent) and two variants of the ReLU function (Leaky ReLU and ELU), and learn how these ReLU variants help deal with deep neural network training issues. Finally, implement a neural network from scratch using TensorFlow and basic Python. When you're done, you'll be able to illustrate the mathematical intuition behind neural networks and be prepared to tackle more complex machine learning problems.
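To make the core idea concrete before the course begins: gradient descent repeatedly nudges a parameter against the gradient of a loss, following the update rule w ← w − η·f′(w). Here is a minimal sketch in plain Python, assuming a toy loss f(w) = (w − 3)² whose minimum sits at w = 3; the starting point and learning rate are illustrative choices, not values from the course.

```python
# Minimize f(w) = (w - 3)^2; its gradient is f'(w) = 2 * (w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0              # arbitrary starting point (assumption)
learning_rate = 0.1  # step size eta (assumption)

for step in range(50):
    w -= learning_rate * grad(w)  # update rule: w <- w - eta * f'(w)

print(w)  # converges toward the minimum at w = 3
```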
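The five activation functions the course covers all have short closed forms, so a small NumPy sketch can preview them; the alpha defaults below are common conventions rather than values taken from the course materials.

```python
import numpy as np

def relu(x):
    # Passes positive inputs through, zeros out the rest.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes inputs into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into the range (-1, 1).
    return np.tanh(x)

def leaky_relu(x, alpha=0.01):
    # Small slope alpha for x < 0 keeps gradients from dying.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Smooth negative branch that saturates at -alpha.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```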
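As a taste of the final project, the sketch below shows one way gradient descent drives training in TensorFlow: a GradientTape records the forward pass, and each parameter is moved against its gradient. The toy data, shapes, and hyperparameters are assumptions for illustration, not the course's actual exercise.

```python
import tensorflow as tf

# Hypothetical toy data: learn y = 2x + 1 from a handful of points.
xs = tf.constant([[0.0], [1.0], [2.0], [3.0]])
ys = tf.constant([[1.0], [3.0], [5.0], [7.0]])

w = tf.Variable(tf.random.normal([1, 1]))
b = tf.Variable(tf.zeros([1]))

learning_rate = 0.05
for step in range(200):
    with tf.GradientTape() as tape:
        preds = xs @ w + b                        # forward pass
        loss = tf.reduce_mean((preds - ys) ** 2)  # mean squared error
    grads = tape.gradient(loss, [w, b])
    # Gradient descent update: move each parameter against its gradient.
    w.assign_sub(learning_rate * grads[0])
    b.assign_sub(learning_rate * grads[1])

print(w.numpy(), b.numpy())  # should approach 2.0 and 1.0
```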