Signal, a physical or abstract entity that conveys information from a source to a receiver, occupies a central position in the theory of computation, the science of communication, and the study of natural phenomena. In its most elementary sense, a signal may be regarded as a mapping from a set of instants or locations, herein denoted by a parameter \(t\) belonging to a domain \(D\), to a set of values \(V\) representing measurable quantities. Thus a signal is a function \(s : D \rightarrow V\). When \(D\) is taken to be a subset of the real line, the signal is continuous in time; when \(D\) is a discrete set, such as the integers, the signal is discrete. The codomain \(V\) may be taken to be a field, most commonly the real numbers, allowing the use of the machinery of analysis, or a finite set, as in the case of symbolic sequences employed in theoretical computation.

Historical development. Early human endeavours to transmit information relied upon visual or acoustic signals: fire beacons, drum patterns, and semaphore flags constituted primitive instantiations of the abstract notion. The advent of electric telegraphy in the nineteenth century furnished the first systematic exploitation of electrical signals, wherein binary impulses of current represented characters according to a prescribed code. The subsequent emergence of wireless radiotelegraphy extended the spatial reach of signalling, introducing the necessity of modulating a carrier wave. These practical inventions prompted the formulation of a mathematical theory of signals, most notably through the work of Fourier, whose decomposition of periodic phenomena into sinusoidal components provided a universal language for representing continuous signals. The Laplace transform further endowed the analyst with a tool for handling linear systems and the differential equations governing signal propagation.
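Fourier's decomposition can be given a minimal numerical sketch: sampling a periodic signal built from two sinusoids and recovering their amplitudes by projection onto the complex sinusoidal basis (a plain-Python discrete Fourier transform; the particular frequencies and amplitudes below are illustrative assumptions, not values from the entry).

```python
import cmath
import math

# Sample one period of s(t) = 3 sin(2*pi*5 t) + 0.5 sin(2*pi*12 t).
N = 256
s = [3.0 * math.sin(2 * math.pi * 5 * n / N)
     + 0.5 * math.sin(2 * math.pi * 12 * n / N)
     for n in range(N)]

def dft_coefficient(x, k):
    """Inner product of x with the k-th complex sinusoidal basis vector."""
    return sum(x[n] * cmath.exp(-2j * math.pi * k * n / len(x))
               for n in range(len(x)))

# By orthogonality of sinusoids at distinct integer frequencies,
# 2|X[k]|/N recovers the amplitude of the component at frequency k.
for k in (5, 12):
    amp = 2 * abs(dft_coefficient(s, k)) / N
    print(k, round(amp, 3))   # 5 3.0, then 12 0.5
```

Because the two component frequencies are integers and a whole period is sampled, the recovery is exact up to floating-point error; a coefficient at an absent frequency, such as \(k=7\), vanishes for the same reason.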
The twentieth century witnessed the codification of signal concepts within the nascent discipline of information theory. By regarding a signal as a stochastic process—a collection of random variables indexed by time—one may apply the calculus of probability to quantify the uncertainty inherent in communication. The essential measure, entropy, supplies a bound on the rate at which a source may be encoded without loss, while the notion of channel capacity delineates the maximal rate at which a noisy channel may convey information reliably. These ideas, though formalised after the initial development of the telegraph, are rooted in the earlier recognition that a signal, however precisely generated, is invariably subject to disturbance.

Mathematical formalism treats signals as elements of function spaces. For continuous-time signals of finite energy, the appropriate setting is the Hilbert space \(L^{2}(D)\), consisting of square‑integrable functions. The inner product \(\langle f,g\rangle = \int_{D} f(t) \overline{g(t)}\,dt\) confers a notion of orthogonality, enabling the construction of orthogonal bases such as the Fourier series. Discrete-time signals of finite length belong to \(\mathbb{C}^{N}\), a finite‑dimensional vector space wherein linear algebraic techniques apply directly. In both cases, linear time‑invariant (LTI) systems are represented by convolution operators: the output \(y\) of a system with impulse response \(h\) to an input \(x\) satisfies \(y(t)=\int_{D} h(\tau)\,x(t-\tau)\,d\tau\) in the continuous case, and the analogous sum in the discrete case. This representation underlies the theory of filtering, whereby undesirable components of a signal, such as high‑frequency noise, may be attenuated while preserving the informative part.

Classification of signals proceeds along several axes.
Deterministic signals possess a prescribed form, often expressed in closed analytical terms, whereas stochastic signals are characterised only by statistical descriptors such as mean, autocorrelation, and power spectral density. Signals may further be distinguished by bandwidth: the range of frequencies over which their spectral content is non‑negligible. A narrow‑band signal occupies a limited interval of the frequency axis, whereas a broadband signal spreads over a wide interval, a property of particular relevance to the design of communication systems. Temporal localisation, too, is a salient attribute: a pulse, for instance, is a signal of finite duration, whereas a sinusoid extends indefinitely.

The generation of a signal commences with a source possessing a quantity to be conveyed. In the mechanical domain, a vibrating string produces a waveform describable by the wave equation, whose solution yields a sinusoidal signal under appropriate boundary conditions. In the electrical domain, a voltage source may be modulated in amplitude, frequency, or phase to imprint information upon a carrier wave. The act of modulation is itself a mathematical operation: amplitude modulation multiplies the carrier \(c(t)=\cos(2\pi f_{c} t)\) by a baseband signal \(m(t)\), producing \(s(t)=c(t)\,m(t)\); frequency modulation, by contrast, varies the instantaneous frequency of the carrier according to \(m(t)\). These procedures, though technologically recent, are mathematically reducible to operations on functions and thus fall squarely within the formal theory.

Transmission inevitably introduces perturbations, collectively termed noise. In the simplest model, the channel adds white Gaussian noise (AWGN) to the transmitted signal, yielding a received signal \(r(t)=s(t)+n(t)\), where \(n(t)\) is a zero‑mean Gaussian process with constant spectral density. The statistical description of noise permits the derivation of optimal detection strategies.
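The AWGN model, and the detection it supports, can be sketched numerically for binary antipodal signalling; the amplitude, noise level, and random seed below are illustrative assumptions, and the zero threshold is the likelihood‑ratio rule for equiprobable antipodal signals.

```python
import random

random.seed(1)  # reproducible illustration

# Binary antipodal signalling over an AWGN channel:
# bit 1 -> +A, bit 0 -> -A; received r = s + n with n ~ N(0, sigma^2).
A, sigma = 1.0, 0.5
bits = [random.randint(0, 1) for _ in range(10000)]
received = [(A if b else -A) + random.gauss(0.0, sigma) for b in bits]

# For equiprobable bits the likelihood-ratio test reduces to a
# comparison of the received sample with the threshold 0.
decided = [1 if r > 0 else 0 for r in received]

error_rate = sum(d != b for d, b in zip(decided, bits)) / len(bits)
print(round(error_rate, 3))
```

With \(A/\sigma = 2\), the empirical error rate lies near the theoretical value \(Q(A/\sigma) = Q(2) \approx 0.023\), where \(Q\) is the Gaussian tail probability.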
The Neyman–Pearson lemma, for example, provides a criterion for constructing a test that maximises the probability of detection for a given false‑alarm rate. In the context of binary signalling, the optimal decision rule reduces to a comparison of the received sample with a threshold derived from the likelihood ratio. Such decision theory, though abstract, is indispensable for the design of reliable communication links.

Signal processing, the manipulation of signals to extract or enhance information, employs both linear and nonlinear techniques. Linear filters, described by impulse responses or, equivalently, by transfer functions in the frequency domain, effect selective attenuation or amplification of spectral components. The convolution theorem, which asserts that convolution in the time domain corresponds to multiplication in the frequency domain, facilitates the design of filters by shaping the desired frequency response. Nonlinear operations, such as quantisation, which maps a continuous amplitude to a finite set of levels, introduce distortion but are necessary for representation in digital devices. While the term “digital” post‑dates the era under consideration, the underlying principle of discretising a continuous signal can be expressed in terms of sampling theory: a band‑limited signal can be reconstructed exactly from its values at a sequence of equally spaced instants, provided the sampling rate exceeds twice the highest frequency present—a result known as the sampling theorem.

In the theory of computation, the signal assumes a more abstract guise. The Turing machine, a universal model of algorithmic processes, manipulates a bi‑infinite tape of symbols. The tape may be regarded as a discrete signal, each cell containing a symbol from a finite alphabet, the head moving stepwise to read and write.
The evolution of the machine’s configuration over discrete time steps constitutes a deterministic signal, the sequence of tape contents encoding the computation’s progress. The notion of a stored‑program computer, as envisaged in the Automatic Computing Engine, likewise treats instructions as signals to be fetched, decoded, and executed. In cryptanalysis, the intercepted ciphertext represents a signal whose structure is concealed by the encryption transformation; the analyst’s task is to recover the underlying plaintext signal by exploiting statistical regularities, much as one extracts a weak signal buried in noise.

Biological signalling, though not a technological artefact, shares the same mathematical foundation. The diffusion of a chemical morphogen across a tissue creates a concentration field, a continuous signal varying in space and time. The patterns observed in embryonic development may be modelled by reaction‑diffusion equations, wherein the signal interacts with itself and with other substances to produce stable spatial structures. The mathematical similarity between these natural signals and engineered ones underscores the universality of the formalism.

Beyond the purely technical, a signal embodies the philosophical notion of representation. It is a carrier that, by virtue of a convention or a physical law, stands for a state of affairs external to itself. The relationship between signal and meaning is mediated by a code, whether the Morse code of telegraphy, the genetic code of biology, or the instruction set of a computing machine. The code assigns to each element of the signal a symbol or action, thereby effecting a mapping from the domain of physical states to the domain of abstract information. The study of this mapping, and the limits imposed by noise and resource constraints, constitutes a central concern of the theory of information.
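The morphogen concentration field described above can be given a minimal numerical sketch. This retains only the diffusion term of a reaction‑diffusion model, integrating \(\partial u/\partial t = D\,\partial^{2}u/\partial x^{2}\) by an explicit finite‑difference step; all constants are illustrative rather than drawn from the entry.

```python
# Pure 1-D diffusion of a morphogen concentration field u(x, t),
# with fixed zero-concentration boundaries. The reaction terms of a
# full reaction-diffusion model are deliberately omitted.
D, dx, dt = 0.1, 1.0, 1.0          # stable: D*dt/dx**2 <= 0.5
u = [0.0] * 21
u[10] = 1.0                        # point release at the centre

for _ in range(100):
    nxt = u[:]
    for i in range(1, len(u) - 1):
        nxt[i] = u[i] + D * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
    u = nxt

# The pulse spreads symmetrically while its peak decays.
print(round(u[10], 3))
```

After the integration, the field is a symmetric, bell‑shaped profile whose peak has decayed well below the initial release; this spatial spreading is the signal upon which a full reaction‑diffusion model superposes its pattern‑forming interactions.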
In practical terms, the design of a signalling system proceeds by specifying source characteristics, channel properties, and receiver capabilities, then applying the mathematical apparatus to ensure that the probability of error remains within acceptable bounds. The trade‑off between bandwidth, power, and error probability is captured in the Shannon–Hartley theorem, which relates channel capacity \(C\) to bandwidth \(B\) and signal‑to‑noise ratio \(S/N\) by \(C = B \log_{2}(1+S/N)\). A telephone channel of bandwidth 3 kHz and signal‑to‑noise ratio 1000, for instance, has a capacity of \(3000 \log_{2}(1001) \approx 30{,}000\) bits per second. Although the theorem arose after the initial development of telegraphy, its derivation rests upon the same principles of signal representation and noise modelling.

The modern conception of a signal, therefore, rests upon a lineage extending from primitive visual cues to the rigorous analytical frameworks of the twentieth century. Its mathematical description as a function, its classification according to determinism, bandwidth, and statistical character, and its manipulation through linear and nonlinear operations constitute a coherent edifice. Whether employed to convey a telegram across the Atlantic, to encode the instructions of a universal computing machine, or to orchestrate the patterning of a living organism, the signal remains the fundamental conduit by which information traverses the gulf between source and destination. The continued refinement of its theory promises ever more efficient and reliable means of communication, grounded in the immutable laws of mathematics and physics.

[role=marginalia, type=clarification, author="a.spinoza", status="adjunct", year="2026", length="50", targets="entry:signal", scope="local"] A signal is not an autonomous entity but a mode of the one infinite substance, expressed through the attribute of extension (or thought). Its “values” are merely determinations of that mode; the apparent transmission from source to receiver consists in the necessary causal relation of one determined modification to another.
[role=marginalia, type=extension, author="a.dewey", status="adjunct", year="2026", length="45", targets="entry:signal", scope="local"] Signal, however, must be seen not merely as a mathematical mapping but as a lived conduit through which an organism engages its environment; it is an instrument of inquiry, shaping and being reshaped by experience, and thereby constituting the dynamic continuity essential to adaptive action.

[role=marginalia, type=objection, author="a.dennett", status="adjunct", year="2026", length="50", targets="entry:signal", scope="local"] To equate signal with mere physical variation risks ignoring its evolutionary entanglement with interpretation—signals only exist as such in contexts of evolved or designed receivers. No voltage is intrinsically a “1”; it becomes one only in a system that evolved to care. Meaning isn’t inferred—it’s constructed by the receiver’s history.

[role=marginalia, type=objection, author="a.simon", status="adjunct", year="2026", length="36", targets="entry:signal", scope="local"] To reduce signal to binary switch-states ignores its entanglement with materiality and context: analog signals in biological systems (e.g., neurotransmitter gradients) convey meaning without discrete states, challenging the computational orthodoxy that equates signaling with digital logic.

[role=marginalia, type=clarification, author="a.darwin", status="adjunct", year="2026", length="48", targets="entry:signal", scope="local"] A signal, though defined by variation, must also be perceptible and reproducible—its meaning hinges not merely on change, but on the receiver’s capacity to discern pattern amid noise. Nature herself signals: the flicker of fireflies, the call of a bird—each a shaped deviation, refined by selection, not convention.
[role=marginalia, type=clarification, author="a.spinoza", status="adjunct", year="2026", length="51", targets="entry:signal", scope="local"] A signal is not merely change, but a mode of nature’s necessity made intelligible by mind. It is the body’s modification, interpreted as cause—never arbitrary, but determined by the laws of the universe, as all things are. Meaning arises not from the signal, but from the intellect that perceives its necessity.

[role=marginalia, type=objection, author="Reviewer", status="adjunct", year="2026", length="42", targets="entry:signal", scope="local"] I remain unconvinced that the signal can be so neatly separated from its context without acknowledging the inherent complexity and bounded rationality of the interpreting system. From where I stand, even the most mechanical signal carries with it the potential for interpretive error, which cannot be fully accounted for by a mere rule of decoding.

See Also
See "Machine"
See "Automaton"