Cross Entropy
Cross entropy is a measure from information theory that quantifies how well one probability distribution approximates another. It is commonly used in machine learning, particularly in classification tasks, to evaluate how closely a model's predicted probability distribution matches the actual distribution of the labels. Lower cross entropy indicates better model performance, and the value is minimized when the two distributions coincide.
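For discrete distributions, with p denoting the true distribution and q the predicted one (the symbols p and q are introduced here only for illustration), the standard definition is:

```latex
H(p, q) = -\sum_{x} p(x) \log q(x)
```

By Gibbs' inequality, H(p, q) is never smaller than the entropy H(p) of the true distribution, with equality exactly when q = p.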
In practical terms, when training models such as neural networks, cross entropy serves as a loss function. Because of the logarithm, a confident but wrong prediction incurs a very large loss, so the optimization process is pushed hardest to correct exactly those mistakes; a minimal sketch of this computation follows below. This makes cross entropy a crucial component in tasks such as image recognition and natural language processing.
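A minimal sketch of the loss computation, assuming one-hot targets and NumPy (the function name and example values are hypothetical, chosen only for illustration):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Average cross entropy between one-hot targets and predicted class probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0)                  # avoid log(0)
    return -np.sum(y_true * np.log(y_pred), axis=1).mean()

# Two samples from a 3-class problem
y_true = np.array([[1, 0, 0],
                   [0, 1, 0]])
y_pred = np.array([[0.7, 0.2, 0.1],   # fairly confident and correct -> small loss
                   [0.1, 0.2, 0.7]])  # confident and wrong          -> large loss
print(cross_entropy(y_true, y_pred))
```

In this sketch the second sample contributes most of the total loss, which illustrates why confident mistakes are penalized so heavily during training.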