
The Shannon entropy of a vector is calculated as: $$H(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x) = E[-\log p(X)]$$
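For example, a fair coin with \(p(\text{heads}) = p(\text{tails}) = 0.5\) has entropy \(H = -(0.5 \log_2 0.5 + 0.5 \log_2 0.5) = 1\) bit when a base-2 logarithm is used.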

Usage

entropy_shannon(v)

Arguments

v

A vector of values

Value

The Shannon entropy of the vector

Details

When calculating the entropy, it is assumed by convention that \(0 \cdot \log 0 = 0\) (consistent with \(\lim_{p \to 0^{+}} p \log p = 0\)), so values with zero probability contribute nothing; likewise a constant vector, for which \(1 \cdot \log 1 = 0\), has zero entropy.
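
The calculation can be reproduced by hand. The sketch below mirrors the formula above, assuming base-2 logarithms and probabilities estimated as the relative frequencies of the values in v; it is not necessarily the package's exact implementation.

# Minimal sketch of the formula above (assumes base-2 logs and
# empirical probabilities; not necessarily entropy_shannon()'s implementation)
shannon_by_hand <- function(v) {
  p <- table(v) / length(v)  # relative frequency of each distinct value
  -sum(p * log2(p))          # values with zero probability never appear in table(v)
}
shannon_by_hand(c("a", "a", "b", "b"))  # two equally likely values, so the entropy is 1 bit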

Examples

v <- sample(1:10, 50, TRUE, rep(0.1, 10))  # 50 draws, each value in 1:10 equally likely
entropy_shannon(v)
#> [1] 3.088767