We have seen earlier that the entire purpose of training is to optimize weights so as to reduce the loss. Let’s look in detail at the different optimization functions we can use with neural networks.
What are Optimizers?
We have seen an optimizer already in our Linear Regression blog, called “SGD”. As described earlier, the purpose of an optimizer is to update the weights of the NN in order to reduce the loss function.
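To make that concrete, here is a minimal sketch of what a plain SGD update does to a single weight. This is illustrative pseudo-math in plain Python, not Keras’ actual implementation; the toy loss and learning rate are assumptions chosen for the example.

```python
# Plain SGD on one weight: step opposite to the loss gradient,
# scaled by the learning rate lr.
def sgd_step(w, grad, lr=0.01):
    return w - lr * grad

# Toy loss L(w) = (w - 3)**2 with gradient 2*(w - 3);
# repeated steps drive w toward the minimum at w = 3.
w = 0.0
for _ in range(1000):
    w = sgd_step(w, 2 * (w - 3))
print(round(w, 4))  # prints 3.0
```

Every optimizer in Keras is some refinement of this basic idea: take the gradient of the loss and use it to nudge the weights downhill.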
There are many different types of optimizers available in Keras. It’s important to know what they are and their pros/cons: https://keras.io/optimizers/
But in the current case we will use an optimizer called “Adam”. To use it in our NN, we simply do this:
# import the Adam optimizer from Keras
from keras.optimizers import Adam
model.compile(Adam(lr=.0001), ....... )
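To get a feel for what Adam is doing under the hood, here is a rough single-weight sketch of its update rule, written in plain Python. This is not Keras’ actual implementation; the hyperparameter defaults (beta_1=0.9, beta_2=0.999, epsilon=1e-8) match Keras’ documented Adam defaults, and `lr` matches the value used above.

```python
# One Adam update step on a single scalar weight. Adam keeps running
# estimates of the gradient's mean (m) and uncentered variance (v),
# corrects their startup bias, and scales the step per-parameter.
def adam_step(w, grad, m, v, t, lr=0.0001,
              beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    m = beta_1 * m + (1 - beta_1) * grad       # 1st-moment (mean) estimate
    v = beta_2 * v + (1 - beta_2) * grad ** 2  # 2nd-moment (variance) estimate
    m_hat = m / (1 - beta_1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta_2 ** t)
    w = w - lr * m_hat / (v_hat ** 0.5 + epsilon)  # scaled parameter update
    return w, m, v

# Toy loss L(w) = w**2 with gradient 2*w: the weight creeps toward 0.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 101):
    w, m, v = adam_step(w, 2 * w, m, v, t)
```

Notice that because the step is normalized by the gradient’s running magnitude, each parameter effectively gets its own adaptive learning rate, which is a big part of why Adam is such a popular default.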
To understand the different types of optimizers, I recommend reading this:
https://hackernoon.com/some-state-of-the-art-optimizers-in-neural-networks-a3c2ba5a5643
Also, to understand optimizers in general, read this further