Cross-entropy is a measure of the difference between two probability distributions for a given random variable or set of events. It quantifies the average number of bits (or nats) needed to encode events drawn from the true distribution when using a code optimized for the other distribution. In machine learning, cross-entropy is commonly used as a loss function for classification tasks, particularly in logistic regression and neural networks. It measures the performance of a model whose outputs are probability values between 0 and 1. Lower cross-entropy indicates better model performance, because the predicted probability distribution is closer to the actual distribution.
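
Concretely, the multi-class cross-entropy loss averages -sum(p(x) * log q(x)) over the samples, where p is the true (often one-hot) distribution and q is the model's predicted distribution. The following is a minimal NumPy sketch of that computation; the function name, epsilon clipping, and example values are illustrative, not taken from the original text.

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Average cross-entropy between one-hot targets and predicted probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Example: two samples in a 3-class problem
y_true = np.array([[1, 0, 0],
                   [0, 1, 0]])
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])
print(cross_entropy(y_true, y_pred))  # about 0.29; lower means predictions are closer to the targets
```

If the model instead assigned probability 1.0 to each correct class, the loss would be 0, which is the behavior described above: the closer the predicted distribution is to the actual one, the lower the cross-entropy.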