Step Decay
Step Decay is a learning rate schedule used in machine learning to adjust the learning rate during training. It reduces the learning rate by a fixed multiplicative factor at regular intervals, or "steps." This helps the model converge more effectively by allowing larger updates early in training and progressively smaller updates as training proceeds.
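As a minimal sketch, the schedule can be written in closed form: the initial rate is multiplied by the drop factor once for every completed interval. The function and parameter names below (step_decay, drop_factor, epochs_per_drop) are illustrative, not a standard API:

```python
import math

def step_decay(initial_lr, drop_factor, epochs_per_drop, epoch):
    """Learning rate at a given epoch under step decay.

    The rate is multiplied by drop_factor once every
    epochs_per_drop epochs, i.e.
    lr = initial_lr * drop_factor ** floor(epoch / epochs_per_drop).
    """
    return initial_lr * drop_factor ** math.floor(epoch / epochs_per_drop)
```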
For example, if the initial learning rate is set to 0.1, it might be reduced by a factor of 10 to 0.01 after a set number of epochs, and to 0.001 after the next interval. This staged decrease helps prevent overshooting the optimal solution, improving the performance of optimizers like Stochastic Gradient Descent and enhancing the training of neural networks.
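Using the sketch above with the numbers from this example, and assuming for illustration a drop by a factor of 10 every 10 epochs:

```python
# Rate stays at 0.1 for epochs 0-9, drops to 0.01 for epochs 10-19,
# then to 0.001 at epoch 20 (rounded to suppress float noise).
for epoch in (0, 9, 10, 19, 20):
    print(epoch, round(step_decay(0.1, 0.1, 10, epoch), 6))
```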