Information

Information, that abstract entity which may be stored, transformed, and transmitted, occupies a central place in the logical analysis of computation and in the broader study of natural and artificial systems. From the standpoint of mathematical logic, information is identified with the configuration of symbols upon a discrete medium, and the rules by which such configurations evolve constitute the essence of algorithmic processes. The formalisation of these ideas began with the investigation of effective procedures, culminating in the conception of the abstract computing device now known as the Turing machine. In that model, symbols drawn from a finite alphabet are inscribed upon an infinite tape, and a deterministic set of instructions, indexed by the current state and the symbol read, determines the symbol to be written, the subsequent motion of the tape head, and the next state. The tape, together with the finite control, constitutes a concrete embodiment of information; the machine’s operation demonstrates how information may be manipulated according to precise, mathematically definable rules.

Foundations.

The notion of a computable function, introduced in the seminal paper on computable numbers, provides the first rigorous articulation of information as a mathematical object. A function f : ℕ → ℕ is said to be computable when there exists a Turing machine which, given any natural number n encoded in the standard binary (or unary) notation on its tape, eventually halts with the representation of f(n) occupying a designated region of the tape. This definition abstracts away from any particular physical implementation, focusing instead upon the logical structure of the transformation. The set of all computable functions is countable, a fact that follows from the enumerability of the finite descriptions of machines, each description being a finite string over a fixed alphabet. Consequently, all but countably many of the conceivable mappings from ℕ to ℕ are non‑computable; this insight reveals a fundamental limitation on the scope of algorithmic information processing.

The introduction of the universal machine sharpened the understanding of information by demonstrating that a single device can simulate any other machine, provided the description of the simulated machine is supplied as part of its input. In formal terms, there exists a machine U such that for any machine M and any input x, U, when presented with the concatenation ⟨M⟩·x, yields exactly the output that M would produce on x. Here ⟨M⟩ denotes an effective encoding of the finite instruction table of M. The existence of U establishes that the description of a computational process is itself a manipulable datum, subject to the same operations as ordinary data. This self‑referential capacity lies at the heart of modern theories of information, for it permits the analysis of the complexity of descriptions and the study of meta‑computational phenomena.

A direct consequence of the universal machine is the formulation of the halting problem. If there existed a machine H which, on input ⟨M⟩·x, determines whether M halts on x, one would obtain a decision procedure for the set of halting pairs. A diagonal argument shows that no such H can exist: suppose H exists, and define a machine D which, given ⟨M⟩, halts if H reports that M does not halt on ⟨M⟩, and loops forever otherwise. Applying D to its own description leads to a contradiction.
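The machine model and the diagonal argument described above can be made concrete in a short program. The sketch below is a minimal single-tape simulator in Python; the representation chosen for instruction tables, the example table (a unary successor machine), and the fixed step budget are illustrative assumptions rather than part of the original formulation. Because the instruction table is supplied to the simulator as ordinary data, the simulator itself plays the role of the universal machine U in miniature.

    # Minimal single-tape Turing machine simulator (illustrative sketch).
    # A machine description is a mapping from (state, scanned symbol) to
    # (symbol to write, head motion, next state); that mapping is the datum <M>.
    def run(table, tape, state="q0", blank="_", max_steps=10_000):
        cells = dict(enumerate(tape))          # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                used = [cells[i] for i in sorted(cells)]
                return "".join(used).strip(blank)
            symbol = cells.get(head, blank)
            write, move, state = table[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        raise RuntimeError("step budget exhausted; the machine may not halt")

    # Example description: append one '1' to a unary numeral (n -> n + 1).
    successor = {
        ("q0", "1"): ("1", "R", "q0"),    # scan rightwards over the numeral
        ("q0", "_"): ("1", "R", "halt"),  # write a trailing '1', then halt
    }
    print(run(successor, "111"))          # prints "1111"

The diagonal construction can be phrased in the same style, with the decisive caveat that the halting tester it presupposes is precisely what the argument shows cannot exist; the function names here are hypothetical.

    def halts(program_text, input_text):
        # Hypothetical total decider: True iff the program halts on the input.
        # The diagonal argument shows no correct implementation can exist.
        raise NotImplementedError

    def d(program_text):
        if halts(program_text, program_text):
            while True:        # loop forever when the tester predicts halting
                pass
        # otherwise halt immediately; applying d to its own text yields
        # the contradiction described above.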
The undecidability of the halting problem demonstrates that there are well‑defined questions about the behaviour of information‑processing systems that no algorithm can resolve. This negative result is a cornerstone of the theory of computability and establishes a precise boundary between what can and cannot be known through mechanical means.

The concept of information as a symbolic configuration also permits the definition of algorithmic complexity. For a finite binary string s, the algorithmic (or descriptive) complexity K(s) is defined as the length, in bits, of the shortest description ⟨M⟩ such that the universal machine, when supplied with ⟨M⟩, outputs s and then halts; in symbols, K(s) = min { |⟨M⟩| : U(⟨M⟩) = s }. This measure captures the amount of information inherent in s, independent of any probabilistic source model. Strings that are highly regular, such as repetitions of a simple pattern, admit short descriptions and thus possess low algorithmic complexity, whereas strings that appear random lack any description substantially shorter than themselves and consequently have high complexity. The formalism of algorithmic complexity was later articulated by Kolmogorov, yet its roots lie directly in the machinery of the universal Turing machine and the notion of effective description.

While the logical analysis of information proceeds from the abstract realm of symbols, its practical relevance emerges in the design of communication systems and in the cryptanalytic work that motivated much of the wartime effort. The encoding of messages as sequences of symbols on a physical medium, and the subsequent decoding by a receiver employing an algorithmic procedure, exemplify the transformation of information from one representation to another. The essential requirement is that the encoding and decoding algorithms be computable; otherwise the intended recipient would be unable to recover the original content. In this sense, the very feasibility of communication rests upon the existence of effective procedures, a principle that underlies both the theory of error‑correcting codes and the practice of cipher design.

The relationship between the logical theory of computation and the probabilistic treatment of information, as advanced later by Shannon, may be viewed through the lens of description length. Shannon introduced a quantitative measure of the average amount of information conveyed by a stochastic source, defined as the expected value of −log₂ p(x) over the distribution p of source symbols: the entropy H = −Σₓ p(x) log₂ p(x). This quantity provides a lower bound on the minimal expected length of any lossless encoding of the source, a result known as the source coding theorem. Although Shannon’s framework rests upon statistical assumptions, the underlying idea that the length of a representation reflects the quantity of information aligns with the algorithmic perspective: both view compression as the search for shorter descriptions. The key distinction lies in the universality of the Turing model, which does not presuppose a probabilistic source and therefore applies to individual strings as well as ensembles. (A numerical comparison of the two measures is sketched at the end of this passage.)

The synthesis of these viewpoints becomes apparent when considering the physical embodiment of information. In the early post‑war period, the development of stored‑program computers, most notably the Automatic Computing Engine, manifested the abstract notion of a universal machine in hardware. The tape of the theoretical model was replaced by electronic memory, and the finite control by a set of electronic circuits capable of interpreting encoded instructions.
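Returning to the two measures introduced above, empirical entropy and description length can be compared numerically. The sketch below is a Python illustration under stated assumptions: the helper names are invented here, and a general-purpose compressor (zlib) stands in as a crude, computable upper bound on description length, since the true algorithmic complexity K(s) is not computable. A periodic string and a pseudo-random string of the same length and the same symbol frequencies have essentially the same empirical entropy per symbol, yet very different compressed lengths, which is the sense in which the algorithmic measure applies to individual strings rather than to ensembles.

    import math
    import random
    import zlib

    def entropy_bits_per_symbol(s):
        # Empirical Shannon entropy: minus the sum of p(x) * log2 p(x) over symbols.
        n = len(s)
        return -sum((s.count(c) / n) * math.log2(s.count(c) / n) for c in set(s))

    def compressed_bits(s):
        # Crude, computable upper bound on description length, in bits.
        return 8 * len(zlib.compress(s.encode("ascii")))

    random.seed(0)
    regular = "ab" * 500                                             # highly patterned
    irregular = "".join(random.choice("ab") for _ in range(1000))    # no obvious pattern

    for name, s in (("regular", regular), ("irregular", irregular)):
        print(name, round(entropy_bits_per_symbol(s), 3), "bits/symbol;",
              compressed_bits(s), "compressed bits")

In a typical run both strings report roughly one bit per symbol of empirical entropy, but only the patterned string compresses to a small fraction of its literal 8000-bit length; the pseudo-random string cannot be pushed much below the roughly 1000 bits of random choice that generated it, in keeping with the identification of randomness with incompressibility.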
This translation from mathematical description to physical device highlighted the principle that information, when instantiated in a material substrate, obeys the same logical constraints as in the abstract model. Consequently, the limits proved for abstract machines—such as undecidability and incompressibility—carry over to real computing systems.

Beyond the purely computational domain, the concept of information finds expression in the study of natural patterns. The investigation of morphogenesis, wherein chemical and physical processes give rise to spatial structures, can be framed in terms of information transfer. The concentration fields of reacting substances constitute a distributed representation of state, and the governing reaction‑diffusion equations describe the deterministic evolution of this information. The emergence of regular patterns from initially homogeneous conditions exemplifies how simple, computable rules can generate complex, organized information (a numerical sketch of such a process is given at the end of this passage). Although the mathematical treatment of such phenomena employs differential equations rather than discrete symbolic manipulation, the underlying principle—that an algorithmic process can produce structured information—remains consistent with the logical tradition.

In the broader philosophical context, the identification of information with the content of a description raises questions concerning the nature of knowledge. If a proposition is regarded as a finite string of symbols, then the epistemic status of the proposition depends upon the existence of a computable verification procedure. The formalisation of proof as a mechanical process, as envisaged in the Entscheidungsproblem, leads to the conclusion that there exist true mathematical statements whose truth cannot be established by any algorithmic method. This result, derived from the same diagonalisation that yields the halting problem, underscores the intrinsic limitation of formal systems in capturing all informational content.

The practical import of these theoretical insights is evident in the design of algorithms for data compression, encryption, and error detection. Compression schemes seek to approximate the algorithmic complexity of a source by constructing short codewords for frequently occurring patterns, thereby approaching the entropy bound. Encryption algorithms rely upon the difficulty of inverting a computable transformation without knowledge of a secret key; the security of such systems rests upon the presumed computational intractability of certain problems which, unlike the halting problem, are decidable in principle yet believed infeasible to solve in practice. Error‑detecting and error‑correcting codes employ redundancy, deliberately increasing the length of the transmitted message to safeguard against disturbances, a trade‑off that can be analysed through both probabilistic and algorithmic lenses.

The evolution of the concept of information thus proceeds from the abstract definition of computable functions, through the construction of a universal mechanism for manipulating symbolic data, to the recognition of intrinsic limits on such manipulation, and finally to the application of these principles in engineering and the natural sciences.
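Returning to the reaction‑diffusion description of morphogenesis above, the deterministic evolution of a distributed state can be sketched numerically. The fragment below uses Python with NumPy; the Gray-Scott form of the reaction terms, the parameter values, and the one-dimensional periodic domain are assumptions chosen for illustration and are not drawn from the original treatment of morphogenesis.

    import numpy as np

    def laplacian(a):
        # Discrete 1-D Laplacian on a ring (periodic boundary conditions).
        return np.roll(a, 1) + np.roll(a, -1) - 2 * a

    def react_diffuse(n=256, steps=20_000, Du=0.16, Dv=0.08, F=0.035, k=0.060):
        # Gray-Scott-style reaction-diffusion, integrated by explicit Euler steps
        # with unit time step and unit grid spacing (illustrative parameters).
        u = np.ones(n)
        v = np.zeros(n)
        mid = n // 2
        u[mid - 5 : mid + 5] = 0.5        # small local perturbation
        v[mid - 5 : mid + 5] = 0.25
        for _ in range(steps):
            uvv = u * v * v
            u += Du * laplacian(u) - uvv + F * (1.0 - u)
            v += Dv * laplacian(v) + uvv - (F + k) * v
        return u, v

    u, v = react_diffuse()
    # Inspect the spread of the second field: a clearly nonzero range indicates
    # that the fields have not relaxed back to the homogeneous state.
    print(round(float(v.min()), 4), round(float(v.max()), 4))

Whether the perturbation grows into persistent pulses or decays back to uniformity depends on the diffusion rates and the feed and removal constants; the point made in the text, that organized structure can arise from a simple, computable rule acting on an almost featureless initial state, is exhibited whenever the chosen values fall in a pattern-forming regime.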
The logical framework provided by the theory of computation supplies a rigorous foundation upon which quantitative measures of information may be constructed, while the universality of the Turing model ensures that any effective process—whether performed by a human mind, a mechanical device, or a biochemical system—can be represented within the same formal system. This unifying perspective affirms that information, in its most general sense, is the substrate upon which all algorithmic activity is performed, and that its study demands both mathematical precision and an appreciation of the limits imposed by computability.

The enduring relevance of this perspective is reflected in contemporary investigations into the computational capabilities of novel substrates, such as quantum systems and biologically inspired architectures. Although the underlying physical mechanisms differ, the requirement that any proposed computation be describable by a finite set of instructions remains. Consequently, the classical model of a universal machine continues to serve as a benchmark against which the power of alternative computational paradigms is measured. Whether the medium is a lattice of quantum spins or a network of interacting proteins, the essential question reduces to whether the transformation of one configuration of information into another can be effected by an effective procedure, and whether the description of that procedure admits a finite representation.

In summary, information, when regarded as the content of symbolic configurations subject to effective transformation, acquires a precise mathematical character through the theory of computability. The Turing machine provides a canonical model for such transformations, establishing the concepts of universal computation, algorithmic complexity, and undecidability. These ideas illuminate the limits of mechanical reasoning, inform the design of practical coding and cryptographic systems, and offer a conceptual bridge to the study of pattern formation in natural phenomena. The synthesis of logical rigor with empirical application ensures that the study of information remains a central, unifying theme across mathematics, computer science, physics, and biology.

[role=marginalia, type=clarification, author="a.darwin", status="adjunct", year="2026", length="41", targets="entry:information", scope="local"] The term “information” must be distinguished from mere symbol‑arrangement; in living beings it is embodied in the hereditary material, whose successive variations under natural selection resemble the successive states of a Turing tape, though the biological medium is finite and contingent.

[role=marginalia, type=extension, author="a.dewey", status="adjunct", year="2026", length="57", targets="entry:information", scope="local"] In extending the notion of “information” beyond its formal symbol‑manipulation, we should recall that every datum acquires meaning only through the active, purposeful inquiry of an organism embedded in a milieu. Thus the tape’s symbols are not merely static configurations but potentialities for experiential reconstruction, the very material upon which intelligent action and democratic problem‑solving are enacted.

[role=marginalia, type=clarification, author="a.kant", status="adjunct", year="2026", length="47", targets="entry:information", scope="local"] To confine information to mere physical states is to mistake the vehicle for the voyage. Though dependent on matter, information nevertheless carries the formal condition of possible experience—its unity arises not from the tape or pulse, but from the transcendental synthesis whereby understanding imposes order upon sensation.

[role=marginalia, type=heretic, author="a.weil", status="adjunct", year="2026", length="59", targets="entry:information", scope="local"] Information is not merely the machine’s memory—it is the silence between pulses, the unobserved potential that haunts every register. To reduce it to physical states is to deny the mind’s power to conjure meaning from voids. The symbol is sacred not for its trace, but for what it invites: the invisible act of interpretation that breathes life into matter.

[role=marginalia, type=extension, author="a.dewey", status="adjunct", year="2026", length="43", targets="entry:information", scope="local"] Yet this quantification sidesteps meaning: a bit is a bit, whether it signifies a toggle, a poem, or a pulse. Information as reduction of uncertainty says nothing of interpretation, context, or consequence—leaving semiotics and epistemology as the silent, indispensable counterparts to Shannon’s algebra.

[role=marginalia, type=clarification, author="a.husserl", status="adjunct", year="2026", length="47", targets="entry:information", scope="local"] Yet this quantification obscures the lived meaning—information as such is always constituted in intentionality. The logarithmic measure neglects the noematic core: information is not merely selected states, but what is given as significant within conscious experience. Without horizons of sense, it is mere syntax—devoid of phenomenological matter.

[role=marginalia, type=objection, author="Reviewer", status="adjunct", year="2026", length="42", targets="entry:information", scope="local"] I remain unconvinced that the physical substrates of information exhaust its possibilities. While the mechanisms of transmission and reproduction are crucial, they risk overlooking the interpretive role of human cognition, which imposes additional layers of meaning and context that transcend mere physical configurations. From where I stand, bounded rationality and complexity significantly shape how we perceive and process information, suggesting that our understanding of information must include the cognitive dimensions as well.

See Also
See "Machine"
See "Automaton"