E ISSN: 2583-049X

International Journal of Advanced Multidisciplinary Research and Studies

Volume 6, Issue 1, 2026

Claude Shannon Starts the Information Age



Author(s): John H Jennings

Abstract:

When Claude Shannon wrote "A Mathematical Theory of Communication" [1], he proved that the information entropy, H, must take the form:

H = - K Σ_x p(x) log p(x)

where K is a positive constant that amounts to a choice of unit of measure. For the case of two possibilities, we have:

H = - (p log p + q log q)

where q = 1 - p. This entropy is maximized at p = 0.5, the most uncertain situation. Shannon called the information content of a data source "entropy" [2], and the unit of information the "bit." This created information theory as a new field. Shannon entropy also occurs in the so-called relative entropy of coherence [3]. Shannon was very much at the start of artificial intelligence when he invented Theseus, an artificially intelligent mechanical mouse that could learn the path through a maze and even cope with a changed maze in which the barriers had been moved [4]. Jesus Christ is going to solve the problem of entropy, which has been death [5].
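As a minimal sketch (not from Shannon's paper itself), the two-possibility entropy formula above can be computed directly, taking K = 1 and base-2 logarithms so that H is measured in bits; a quick scan over p confirms the maximum of one bit at p = 0.5:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a source with two outcomes
    of probabilities p and q = 1 - p (K = 1, log base 2)."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, since x log x -> 0 as x -> 0
    q = 1.0 - p
    return -(p * math.log2(p) + q * math.log2(q))

# The entropy peaks at p = 0.5, the most uncertain case:
values = {i / 100: binary_entropy(i / 100) for i in range(101)}
p_max = max(values, key=values.get)
print(p_max, values[p_max])  # 0.5 1.0
```

A certain outcome (p = 0 or p = 1) carries no information, while a fair coin flip carries exactly one bit, matching the text's claim that p = 0.5 is the most uncertain situation.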


Keywords: Claude Shannon, Artificial Intelligence (AI)

Pages: 1306-1307
