deepblink.optimizers module¶
Optimizers are used to update weight parameters in a neural network.
The learning rate defines what step sizes are taken during one iteration of training. This file contains functions to return standard or custom optimizers.
deepblink.optimizers.adam(learning_rate: float)[source]¶
Keras’ adam optimizer with a specified learning rate.