deepblink.optimizers module

Optimizers are used to update weight parameters in a neural network.

The learning rate defines the step size taken during one iteration of training. This module contains functions that return standard or custom optimizers.
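For illustration, a hedged usage sketch assuming deepblink and TensorFlow are installed; the one-layer Dense model is a hypothetical stand-in for a real network:

    import tensorflow as tf
    from deepblink.optimizers import adam

    # Hypothetical minimal model, used only to show where the returned optimizer goes.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer=adam(learning_rate=1e-4), loss="mse")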

deepblink.optimizers.adam(learning_rate: float)

Keras’ Adam optimizer with a specified learning rate.
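A minimal sketch of what this wrapper plausibly returns, assuming the TensorFlow Keras backend (the exact implementation in deepblink may differ):

    import tensorflow as tf

    def adam(learning_rate: float) -> tf.keras.optimizers.Optimizer:
        # Adam with Keras defaults for beta_1, beta_2, and epsilon.
        return tf.keras.optimizers.Adam(learning_rate=learning_rate)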

deepblink.optimizers.amsgrad(learning_rate: float)

Keras’ Adam optimizer with the AMSGrad variant enabled and a specified learning rate.
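AMSGrad has no standalone Keras class; it is switched on through the amsgrad flag of Adam. A hedged sketch of the likely wrapper:

    import tensorflow as tf

    def amsgrad(learning_rate: float) -> tf.keras.optimizers.Optimizer:
        # Adam variant that keeps the maximum of past squared gradients (AMSGrad).
        return tf.keras.optimizers.Adam(learning_rate=learning_rate, amsgrad=True)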

deepblink.optimizers.rmsprop(learning_rate: float)

Keras’ RMSprop optimizer with a specified learning rate.
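A comparable sketch for the RMSprop wrapper, again assuming the TensorFlow Keras backend:

    import tensorflow as tf

    def rmsprop(learning_rate: float) -> tf.keras.optimizers.Optimizer:
        # RMSprop with Keras defaults for rho, momentum, and epsilon.
        return tf.keras.optimizers.RMSprop(learning_rate=learning_rate)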