A genuinely natural information measure

Title: A genuinely natural information measure
Publication Type: Report
Year of Publication: 2021
Authors: Winter, A
Date Published: 04/2021
Type: arXiv preprint
Other Numbers: arXiv:2103.16662

The theoretical measurement of information was famously initiated by Shannon in his mathematical theory of communication, in which he proposed a now widely used quantity, the entropy, measured in bits. Yet, in the same paper, Shannon also chose to measure the information in continuous systems in nats, which differ from bits by the use of the natural rather than the binary logarithm.
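To make the bits-versus-nats distinction concrete, here is a minimal sketch (not from the paper; the function name and example distribution are illustrative) showing that the two units differ only by a constant factor of ln 2:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy of a discrete distribution, in the given log base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy, which is ln(2) ≈ 0.6931 nats.
coin = [0.5, 0.5]
h_bits = entropy(coin, base=2)
h_nats = entropy(coin, base=math.e)

# Converting bits to nats is just multiplication by ln 2.
assert abs(h_nats - h_bits * math.log(2)) < 1e-12
```

The same rescaling applies to any entropy or mutual-information quantity, which is why the choice of logarithm base is a unit convention rather than anything intrinsic.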

We point out that there is nothing natural about the choice of logarithm base; rather, it is arbitrary. We remedy this problematic state of affairs by proposing a genuinely natural measure of information, which we dub gnats. We show that gnats have many advantages in information theory, and propose to adopt the underlying methodology throughout science, the arts, and everyday life.
