The Hartley (Hart): Logarithmic Information Unit in Base-10 Communication Theory

The Hartley, abbreviated Hart and sometimes called the dit (decimal digit), measures information as the base-10 logarithm of the number of equally likely messages in a system. Named after Ralph V. L. Hartley’s 1928 paper "Transmission of Information", one Hartley corresponds to log10(10) = 1, meaning that ten equiprobable symbols convey one Hartley of information. While modern digital systems prefer the bit and byte rooted in base 2, Hartleys persist in multi-level communication, in spectroscopy, and in pedagogy, where they emphasise logarithmic intuition across different bases.

Definition and Mathematical Relationships

Hartley defined the information quantity H as H = log10(N), where N represents the number of distinct equally likely messages. Because logarithms in different bases are proportional, Hartleys relate to bits via H_Hart = log10(N) = log2(N) / log2(10) ≈ 0.3010 × log2(N). Consequently, 1 Hartley ≈ 3.3219 bits. The logarithm base conversion calculator automates these conversions, ensuring consistent notation when switching between engineering textbooks and computer science references.
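
For quick numerical checks away from the calculator, the conversion is a one-liner in most languages. The Python sketch below is an illustration of the identities above, not tied to any particular site tool:

    import math

    def bits_to_hartleys(bits: float) -> float:
        """Convert an information quantity from bits to Hartleys."""
        return bits / math.log2(10)      # 1 Hart = log2(10) ≈ 3.3219 bits

    def hartleys_to_bits(hartleys: float) -> float:
        """Convert an information quantity from Hartleys to bits."""
        return hartleys * math.log2(10)

    # N = 1000 equiprobable messages carry log10(1000) = 3 Hart ≈ 9.97 bits.
    hart = math.log10(1000)
    print(hart, hartleys_to_bits(hart))  # 3.0  9.965784...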

Hartleys also relate to the nat, the natural-logarithm unit of information, through ln(N) = log10(N) × ln(10), so one Hartley equals ln(10) ≈ 2.3026 nats. Information expressed in nats, Hartleys, or bits describes the same quantity; only the base differs. When dealing with weighted probabilities p_i rather than equiprobable outcomes, Hartley’s measure generalises to the Shannon entropy in base 10: H = −Σ p_i log10(p_i), still reported in Hartleys.
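
The base-10 entropy formula translates directly into code. The following short function is a minimal sketch; the two probability vectors shown are arbitrary examples:

    import math

    def entropy_hartleys(probs):
        """Shannon entropy -sum(p * log10(p)), reported in Hartleys."""
        return -sum(p * math.log10(p) for p in probs if p > 0)

    print(entropy_hartleys([0.1] * 10))        # ten equiprobable outcomes: 1.0 Hart
    print(entropy_hartleys([0.5, 0.3, 0.2]))   # skewed distribution: ≈ 0.4472 Hart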

Many calculators in the site’s information category, including the bits required for a number tool, output results in bits. Dividing by log2(10) converts the result to Hartleys, offering a cross-check for base-agnostic analysis.

Historical Development

Hartley’s 1928 framework

Hartley proposed measuring information content by counting the number of possible symbol sequences and applying a logarithm to ensure additivity across independent selections. Using base 10 aligned with the engineering slide rules and decimal representations pervasive in telecommunications at the time. Hartley’s measure anticipated the later formalism of Claude Shannon, who generalised the concept to arbitrary probability distributions and popularised the bit as the standard unit.

Adoption and decline in mainstream practice

As binary-coded systems dominated mid-twentieth-century computing, the bit supplanted the Hartley in most applications. Nevertheless, base-10 logs persisted in instrumentation, where analogue dial readings were often linear in decades. Early telephone and telegraph engineering texts, including Hartley’s own works at Bell Labs, used Hartleys to describe channel capacity, emphasising decimal digits transmitted per symbol interval. Historical understanding helps interpret archival measurements that predate Shannon’s bit-centric approach.

Modern niches

Hartleys remain relevant in optical spectroscopy and astronomical photometry, where intensities span many decades and base-10 logarithms align with instrumentation calibrations. The decibel article discusses similar base-10 scaling for power ratios, highlighting conceptual continuity between Hartleys and other engineering logarithmic units.

Conceptual Foundations and Comparisons

Hartleys versus bits and trits

Hartleys quantify the number of decimal digits of precision delivered by a message. In contrast, bits measure binary digits and trits measure ternary digits. Because one bit equals log10(2) ≈ 0.3010 Hart and one trit equals log10(3) ≈ 0.4771 Hart, converting between units clarifies the cost of encoding decimal data on binary hardware, as sketched below. The base64 encoded size calculator exemplifies how varying alphabet sizes affect storage overhead; such calculations are easily reframed in Hartleys to emphasise decimal significance.
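
A small sketch makes the per-symbol costs concrete; the binary-coded-decimal comparison is an illustrative assumption, not a claim about any specific hardware:

    import math

    hart_per_bit  = math.log10(2)   # one binary digit  ≈ 0.3010 Hart
    hart_per_trit = math.log10(3)   # one ternary digit ≈ 0.4771 Hart

    # One decimal digit carries exactly 1 Hart, i.e. 1 / log10(2) ≈ 3.32 bits,
    # so a 4-bit binary-coded-decimal digit leaves ≈ 0.68 bits unused.
    bits_per_decimal_digit = 1 / hart_per_bit
    print(hart_per_bit, hart_per_trit, bits_per_decimal_digit)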

Logarithmic measures in instrumentation

Many sensors, such as mass spectrometers and photomultiplier tubes, present readings on logarithmic scales. When instrument manuals specify decades of dynamic range, each decade corresponds to exactly one Hartley, so the conversion communicates how many decimal digits of information the instrument resolves. Cross-referencing the data transfer rate article shows how symbol alphabets and sampling interact to produce those decades of resolution.
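
A minimal numerical sketch, assuming a hypothetical instrument with a 10^5 : 1 dynamic range:

    import math

    dynamic_range = 1e5                  # assumed 10^5 : 1 span, i.e. 5 decades
    hart = math.log10(dynamic_range)     # decades and Hartleys coincide: 5.0
    print(hart, hart * math.log2(10))    # 5.0 Hart ≈ 16.61 bits of resolution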

Entropy in different bases

In thermodynamics and statistical mechanics, entropy is typically measured in joules per kelvin (J·K⁻¹) or, after division by the Boltzmann constant, in dimensionless nats. Converting nats to Hartleys via division by ln(10) yields decimal-digit interpretations of disorder. This perspective aids interdisciplinary communication between physicists and information scientists comparing ISO 80000-13 terminology with more traditional physical chemistry notation.
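
The chain of conversions (J·K⁻¹ to nats via the Boltzmann constant, then nats to Hartleys via ln(10)) can be sketched as follows; the input value of 1 J/K is arbitrary:

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)

    def entropy_jk_to_hartleys(s_joule_per_kelvin: float) -> float:
        """J/K -> nats (divide by k_B) -> Hartleys (divide by ln 10)."""
        nats = s_joule_per_kelvin / K_B
        return nats / math.log(10)

    print(entropy_jk_to_hartleys(1.0))   # 1 J/K ≈ 3.15e22 Hart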

Worked example: decimal-coded control system

Suppose an industrial controller transmits status updates using an alphabet of 100 symbols: ten digits combined with ten lettered modifiers. The Hartley information per update equals log10(100) = 2 Hart. To determine the binary payload required for a retrofit, multiply by log2(10) ≈ 3.3219 to obtain roughly 6.6439 bits. Feeding the symbol count into the bits required for a number calculator and then dividing by log2(10) cross-checks the result. Presenting both Hartleys and bits in documentation keeps decimal-oriented technicians and binary-focused software teams aligned on capacity expectations.
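
The arithmetic of this example is easy to verify in a few lines of Python (a standalone check, not the site calculator itself):

    import math

    alphabet_size = 100                                  # ten digits × ten modifiers
    hart_per_update = math.log10(alphabet_size)          # 2 Hart
    bits_per_update = hart_per_update * math.log2(10)    # ≈ 6.6439 bits

    # Cross-check: dividing the bit figure by log2(10) recovers the Hartleys.
    assert math.isclose(bits_per_update / math.log2(10), hart_per_update)
    print(hart_per_update, bits_per_update)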

Applications

Communications engineering

Hartley’s original line-rate formula, R = 2B log10(M), expressed capacity in decimal digits per second for a bandwidth B and M distinguishable signal levels; Shannon later incorporated noise, yielding the Shannon–Hartley theorem C = B log2(1 + S/N) in bits. Engineers occasionally revert to Hartleys when designing decimal-coded control systems or when reporting metrics on log-decade graphs. Calculators such as the live streaming bandwidth tool combine throughput, compression, and symbol sets, all readily interpretable in Hartleys. Bayesian analysts extend base-10 thinking to evidence accumulation by converting likelihood ratios into bans and decibans (the ban being another name for the Hartley), reinforcing how Hartley-style logarithms connect communication theory with decision science.
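
The two formulas are compared below for an illustrative 3 kHz voice channel; the 16-level and 30 dB figures are assumptions chosen only to make the numbers concrete:

    import math

    def hartley_rate(bandwidth_hz: float, levels: int) -> float:
        """Hartley's law R = 2B·log10(M), in Hart per second."""
        return 2 * bandwidth_hz * math.log10(levels)

    def shannon_capacity_hart(bandwidth_hz: float, snr: float) -> float:
        """Shannon-Hartley capacity B·log2(1 + S/N), converted to Hart/s,
        which simplifies to B·log10(1 + S/N)."""
        return bandwidth_hz * math.log10(1 + snr)

    print(hartley_rate(3000, 16))             # 16 levels: ≈ 7225 Hart/s
    print(shannon_capacity_hart(3000, 1000))  # 30 dB SNR: ≈ 9001 Hart/s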

Spectroscopy and chemometrics

In spectroscopy, absorbance is often reported in base-10 logarithms. Expressing detection limits in Hartleys clarifies how many decimal digits of concentration change an instrument can resolve. Similarly, chemometric models using principal component analysis may quantify explained variance per decade, aligning naturally with Hartley-based entropy measures.

Education and knowledge representation

Teaching logarithms benefits from Hartleys because they map directly to decimal intuition. Students can relate one Hartley to choosing a number between 0 and 9, inclusive, reinforcing probability fundamentals. The bits-required calculator offers a hands-on exercise: compute the bits for a 6-digit PIN, then divide by log2(10) to reveal the Hartleys.
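
The PIN exercise takes three lines of Python; any tool that outputs bits will produce the same intermediate figure:

    import math

    pin_combinations = 10 ** 6           # a 6-digit PIN: 000000–999999
    bits = math.log2(pin_combinations)   # ≈ 19.93 bits
    print(bits, bits / math.log2(10))    # dividing by log2(10) gives 6.0 Hart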

Importance and Continuing Relevance

Even as binary encoding dominates digital infrastructure, the Hartley endures as a pedagogical and analytical bridge between decimal intuition and binary implementation. It underscores that information measures stem from logarithmic counting, independent of the chosen base. Maintaining fluency in Hartleys helps engineers interpret legacy documentation, calibrate logarithmic instruments, and communicate across disciplines that favour decimal notation.

Whenever reporting information quantities, specify the logarithmic base to avoid confusion. If bit-based tools are used, convert results into Hartleys for stakeholders accustomed to decimal digits. This discipline keeps the Hartley unit relevant, aligning with the consistent notation practices advocated in the ISO 80000-13 standard.