
Cross-entropy

Cross-entropy is a measure of the difference between two probability distributions for a given random variable or set of events. It quantifies the average amount of information (in bits or nats) needed to identify an event drawn from the true distribution when using a coding scheme optimized for the other distribution. In machine learning, cross-entropy is commonly used as a loss function for classification tasks, particularly in logistic regression and neural networks, where the model's output is a probability value between 0 and 1. Lower cross-entropy indicates better model performance, as it means the predicted probability distribution is closer to the actual distribution.
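
For a true distribution p and a predicted distribution q over the same set of events, cross-entropy is defined as H(p, q) = −Σₓ p(x) log q(x). As a minimal sketch of how this is computed in practice (the function and variable names below are illustrative, not taken from any particular library):

import numpy as np

def cross_entropy(p, q, eps=1e-12):
    # Cross-entropy H(p, q) between a true distribution p and a
    # predicted distribution q; both are arrays that sum to 1.
    q = np.clip(q, eps, 1.0)        # clip to avoid log(0)
    return -np.sum(p * np.log(q))

p = np.array([0.0, 1.0, 0.0])       # one-hot true label (class 1 of 3)
q = np.array([0.1, 0.7, 0.2])       # model's predicted probabilities
print(cross_entropy(p, q))          # ≈ 0.357; lower means a better fit

With a one-hot true distribution, only the log-probability assigned to the correct class contributes to the sum, which is why this loss is also referred to as negative log-likelihood in that setting.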

