Batch Gradient Descent
Batch Gradient Descent is an optimization algorithm used in machine learning to minimize a loss function. In each iteration it computes the gradient of the loss with respect to the model's parameters over the entire training dataset and then takes a single update step. Because every training example contributes to every update, the gradient is exact rather than estimated, which tends to produce smooth, stable convergence.
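To make the full-dataset update concrete, here is a minimal sketch for a linear model trained with mean squared error. The function name, learning rate, and iteration count are illustrative choices, not part of any particular library's API.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Fit weights w for a linear model y ~ X @ w by minimizing MSE.

    X: (n_samples, n_features) design matrix
    y: (n_samples,) target vector
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_iters):
        # Gradient of the MSE loss computed over ALL n_samples at once;
        # this full pass per update is what makes the method "batch".
        grad = (2.0 / n_samples) * X.T @ (X @ w - y)
        w -= lr * grad
    return w
```

Note that `X @ w - y` touches every row of the dataset on every iteration, which is exactly the cost discussed next.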
However, because each update requires a full pass over the dataset, Batch Gradient Descent becomes slow and memory-intensive as datasets grow, and it is impractical for online or real-time applications where data arrives continuously. This motivated alternatives such as Stochastic Gradient Descent, which updates the parameters using one example at a time, and Mini-Batch Gradient Descent, which updates using small random subsets, as sketched below.
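For contrast, here is a hedged sketch of the mini-batch variant under the same linear-model assumptions as above; the batch size and seed are arbitrary illustrative values.

```python
import numpy as np

def minibatch_gradient_descent(X, y, lr=0.1, n_iters=1000, batch_size=32, seed=0):
    """Same objective as batch_gradient_descent, but each step uses
    only a small random subset of the data, trading gradient accuracy
    for much cheaper, lower-memory updates."""
    n_samples, n_features = X.shape
    rng = np.random.default_rng(seed)
    w = np.zeros(n_features)
    for _ in range(n_iters):
        # Sample a mini-batch instead of scanning the whole dataset.
        idx = rng.choice(n_samples, size=min(batch_size, n_samples), replace=False)
        Xb, yb = X[idx], y[idx]
        grad = (2.0 / len(idx)) * Xb.T @ (Xb @ w - yb)
        w -= lr * grad
    return w
```

The only change from the batch version is the gradient's data source: a random subset per step rather than the full dataset, which is why the per-iteration cost no longer scales with the total number of examples.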