Shannon’s entropy is a measure of the uncertainty of a system, or equivalently the amount of information present in it. It is often measured in bits. For a system with $n$ possible states, with probabilities described by the vector $p = (p_1, \dots, p_n)$, the entropy is given by

$$H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i.$$
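As a quick sketch, the definition translates directly into a few lines of Python (terms with $p_i = 0$ are skipped, following the convention $0 \log 0 = 0$):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution p."""
    # Skip zero-probability states: by convention, 0 * log2(0) = 0.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin is maximally uncertain over two states: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A certain outcome carries no information.
print(shannon_entropy([1.0, 0.0]))   # → 0.0
```

A uniform distribution over $n$ states gives the maximum value $\log_2 n$, which is why entropy is often read as the average number of bits needed to describe an outcome.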
Von Neumann entropy generalises Shannon entropy to quantum systems: the probability vector is replaced by a density matrix $\rho$, and the entropy is $S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho)$. When $\rho$ is diagonal, its eigenvalues form an ordinary probability distribution and $S(\rho)$ reduces to Shannon entropy.