Cross-entropy

Cross-entropy is a measure of the difference between two probability distributions for a given random variable or set of events. It quantifies the average amount of information (in bits or nats) needed to encode events drawn from the true distribution when using a coding scheme optimized for the predicted one. In machine learning, cross-entropy is commonly used as a loss function for classification tasks, particularly in logistic regression and neural networks, where it measures the performance of a model whose output is a probability between 0 and 1. Lower cross-entropy indicates better model performance, as it means the predicted probability distribution is closer to the actual distribution.
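For discrete distributions, cross-entropy is defined as H(p, q) = −Σₓ p(x) log q(x), where p is the true distribution and q is the predicted one. As a minimal illustration of this formula (not part of the original entry, assuming NumPy is available; the function name and example values are hypothetical), the sketch below compares a confident correct prediction with a confident wrong one:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) between a true distribution p
    and a predicted distribution q (arrays summing to 1)."""
    q = np.clip(q, eps, 1.0)           # avoid log(0)
    return -np.sum(p * np.log(q))

# Example: a 3-class problem where the true label is class 0.
p_true = np.array([1.0, 0.0, 0.0])     # one-hot true distribution
q_good = np.array([0.8, 0.1, 0.1])     # mostly correct prediction
q_bad  = np.array([0.1, 0.2, 0.7])     # confidently wrong prediction

print(cross_entropy(p_true, q_good))   # ~0.22 (low loss)
print(cross_entropy(p_true, q_bad))    # ~2.30 (high loss)
```

With a one-hot true distribution, the sum collapses to −log q(correct class), which is why a confident wrong prediction is penalized so heavily.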

