Entropy

Name: Entropy
Description:
Given
- a finite set E of classified examples and
- a set C = {c1, ..., cn} of possible classes.
Let |Eci| denote the number of examples e ∈ E of class ci,
and let pi := |Eci| / |E|.
Then the Entropy of E is defined as:
entropy(E) := - ∑_{i=1,...,n} pi * log2(pi)
In this context we define 0 * log2(0) = 0.
The Entropy measures the impurity of a set of examples.
It is lowest if at most one class is present, and it is highest
if all present classes occur in equal proportions.
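
A minimal sketch of this definition in Python (the function name entropy and the representation of examples as a plain list of class labels are illustrative assumptions, not part of the original):

from collections import Counter
from math import log2

def entropy(examples):
    """Entropy of a list of class labels.

    Classes with zero occurrences never appear in the Counter,
    so the convention 0 * log2(0) = 0 is respected implicitly.
    """
    counts = Counter(examples)
    total = len(examples)
    # Sum -pi * log2(pi) over all classes ci that occur in the examples.
    return sum(-(n / total) * log2(n / total) for n in counts.values())

# Lowest impurity: only one class present.
print(entropy(["a", "a", "a", "a"]))  # 0.0
# Highest impurity: all present classes in equal proportion.
print(entropy(["a", "a", "b", "b"]))  # 1.0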