1 of 39

Communication, Information, and Digital Encoding:

Designing Signals for Semiotic Systems

Martin Irvine

Communication, Culture & Technology Program

Georgetown University

Home Page: irvine.georgetown.domains

email: irvinem@georgetown.edu

A telegraph keyboard coder and receiver/recorder:

early “interactive interface”.

2 of 39

Using semiotic theory to de-blackbox electrical engineering models for “information theory”

  • How can we design electrical signals as structured semiotic substrates for communication, information, media, and computing systems?

  • There is a direct path from Samuel Morse to the iPhone through re-implementable concepts for electrical switches, encoding states of energy as signals, and producing outputs as human-perceptible patterns.

Designed structured physical substrates for human sign and symbol systems

3 of 39

“About five years ago, on my voyage from Europe [1832], the electrical experiment of Franklin, upon a wire some four miles in length, was casually recalled to my mind in conversation with one of the passengers, in which experiment it was ascertained that the electricity traveled through the whole circuit in a time not appreciable, but apparently instantaneous. It immediately occurred to me that, if the presence of electricity could be made VISIBLE in any desired part of this circuit, it would not be difficult to construct a SYSTEM OF SIGNS by which intelligence could be instantaneously transmitted. The thought, thus conceived, took strong hold of my mind in the leisure which the voyage afforded, and I planned a system of signs, and an apparatus to carry it into effect.”

(Samuel Irenaeus Prime, Life of Samuel Morse, 1875)

Samuel Morse recalling his first conception of an electromagnetic telegraph in 1832:

From Samuel Morse to Claude Shannon

Electric Circuits for Encoding Symbolic Information (1832 - 1948)

4 of 39


Electromagnetic telegraph circuit and early diagram and code chart:

Proto-binary switched circuit code.

5 of 39

Telegraph messages:

communication as mapping abstract code units to alphabet

sender writes message in alphabetic words >

transpose alphabetic units into pulse code system >

transmit coded message over telegraph system >

receiver transposes pulse code to alphabet >

write out (copy) message in alphabetic words >

recipient reads message sent >

write reply or new message >

encode message > ... n
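The pipeline above can be sketched in code: a minimal encode/decode roundtrip, assuming a small excerpt of the International Morse code table (the helper names `encode` and `decode` are illustrative, not from any telegraph standard).

```python
# Alphabetic message -> pulse-code (dot/dash) units -> decoded back to letters.
# MORSE is a small excerpt of the International Morse code table.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-", "H": "....", "I": ".."}
DECODE = {v: k for k, v in MORSE.items()}

def encode(message):
    # Transpose alphabetic units into the pulse-code system;
    # letters are separated by a space (a timed pause on a real line).
    return " ".join(MORSE[ch] for ch in message)

def decode(signal):
    # Receiver transposes pulse code back to the alphabet.
    return "".join(DECODE[unit] for unit in signal.split(" "))

signal = encode("SOS")
print(signal)          # ... --- ...
print(decode(signal))  # SOS
```

The roundtrip makes the semiotic point concrete: nothing "travels" except structured pulses; the alphabetic message is re-instantiated at each end by a shared code table.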

The “Long tail” of Morse’s concept:

From transposing the dot-dash system to binary digital text encoding schemes (1890s-1950s), to ASCII, to Unicode today.

6 of 39

Cumulative Combinatorial Technologies for Data Abstractions

at Historical Intersections of Conditions

The “long tail” of Samuel Morse’s design concepts and patents (1834-37)

1837-1849: Morse’s patent for the electromagnetic telegraph, with co-developer Alfred Vail, had extensive financial backing from the Vail family (Alfred's cousin Theodore Vail later headed AT&T); with that backing they were able to secure the patent in 1849 against several other inventors of similar devices.

Further development of electric signal technologies by Alexander Graham Bell and Thomas Edison.

1925: Bell Labs: Western Electric Laboratories and the engineering department of AT&T merged to form Bell Laboratories as an independent division of AT&T and WE; later fully owned by AT&T.

From engineering teams at Bell Labs/AT&T:

1947: Invention of the transistor

1948: Formalization of Information theory: "A Mathematical Theory of Communication", by Claude Shannon, published in the Bell System Technical Journal.

1950s: Further development of binary data systems and computer architectures

1960s: UNIX operating system developed for time-sharing computer systems, providing the building blocks for the first packet networks and ARPANet/Internet technologies.

1970s: C Programming Language for UNIX systems

1980s: Cell phone transmission standards developed; C++ for distributed computing

1996: AT&T spins off Bell Labs in sale to Lucent Technologies

The Cascading Consequences of Building on the Principle for

Making electricity visible in desired parts of a circuit for constructing a system of signs where intelligence could be instantaneously transmitted ...

7 of 39

Instead of the term “information theory”, we should use the phrase “electrical signals transmission design theory,” since this was how Shannon and his community conceived their practice.

A page from Shannon’s MIT thesis (1938) on applying Boolean symbolic logic to sequences of switched circuits (for telecommunications networks).

Shannon’s thesis was

“one of the most important master's theses ever written...a landmark in that it helped to change digital circuit design from an art to a science."

H. H. Goldstine, The Computer from Pascal to Von Neumann

“Information” as a system of quantifiable units, and the concept of the “bit” (binary digit) both came from the telecommunications field and applied mathematics in electrical engineering.

Shannon built on 20 years of work in telecommunications and saw how Boolean logic (two-value system) mapped onto the circuit switch configurations: the beginning of the “bit” (binary digit) as mathematical-logical map (homology) with electronic states.

8 of 39

The conceptual steps leading to Claude Shannon’s mathematical model

for reliable transmission of electronic signals (= structured material sign vehicles)

The Signal Transmission Problem for electrical engineering in telecommunications, 1920s-1940s, as stated in Claude Shannon's Mathematical Theory of Communication:

Claude Shannon’s famous opening definition for information engineering: (key terms in bold)

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.

C. E. Shannon, "The Mathematical Theory of Communication" (Bell Labs, 1948)
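Shannon's "message selected from a set of possible messages" is what makes information quantifiable: the information of a source is a function of the probabilities of its possible selections, not of their meanings. A minimal sketch of his entropy measure in bits (the function name `entropy_bits` is illustrative):

```python
import math

def entropy_bits(probabilities):
    # Shannon entropy H = -sum(p * log2(p)): the average information,
    # in bits, of one selection from a set of possible messages.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages need exactly 2 bits per selection:
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A biased source carries less information per selection (~0.469 bits):
print(entropy_bits([0.9, 0.1]))
```

Note that nothing semantic enters the calculation: the system is designed "to operate for each possible selection," exactly as the quotation states.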

9 of 39

In Claude Shannon’s famous papers on “information” or “communication” theory as an engineering problem, he frames the research and theory as a design problem for a semiotic subsystem:

  • Circuit design (modeled on Boolean symbolic logic)
  • Mathematical and logic design for electrical current and radio waves (“information theory”)
  • “Information design” in binary (base-2) mathematics as the most efficient system for electrical circuits: open/closed, on/off, present/absent, true/false, 1/0

With the model of the “bit” (binary digit), Shannon introduced the symbolic method now used in all information systems and computing: encoding “information” or “data” (binary representations of the symbols that mean things) and operations and relations for “data” (the logic and programming for transformations and interpretations of the “data” representations) -- using the same binary encoding method.

It is a system design for using symbols that mean and symbols that do (perform actions) in the same mathematical-electronic structures.

10 of 39

Consider these observations as starting points to think with from the design point of view:

We didn’t “discover” electricity, we designed it!

“Information theory,” as defined in electrical engineering, is an engineering solution to a semiotic problem -- it is a model for “designer electronics”:

i.e., without a design method for reliably representing (encoding) and transmitting interpretable patterns of a symbolic system (language, writing, images, and sounds), “communication” with the medium of electronic units would be impossible.

The mathematical models used in engineering for defining efficient electronic encoding and decoding in combinations of binary-valued units (bits) are a design for a semiotic substrate (digital electronics), which can be further designed to encode representations of all human sign and symbol systems.
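This "further designing" is visible in any text encoding: a character of a human symbol system is re-tokened as binary units. A minimal sketch using UTF-8 (the helper names `to_bits` and `from_bits` are illustrative):

```python
# Any symbol system can be re-instantiated as binary units.
# Here a character of text becomes UTF-8 bytes, then a string of bits.
def to_bits(text):
    return " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits):
    return bytes(int(b, 2) for b in bits.split()).decode("utf-8")

bits = to_bits("A")
print(bits)             # 01000001
print(from_bits(bits))  # A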

11 of 39

Background: “Information” as a Design for Mapping Physical Symbol Structures to Energy:

All modern “information theory” is founded on signals processing designed and specified

for a defined “slice” of the Electromagnetic Spectrum.

12 of 39

13 of 39

Claude Shannon's initial non-semantic "communication" model

for a mathematical (statistical) description of signals processing.

OK, in bracketing off “meaning” from the engineering problem, what does Shannon’s theory assume about where messages come from (motivations in communities) and where they are going (contexts of reception and interpretation) for the signals “shaped” in the electronics designed for telecommunications?

14 of 39

First revision/expansion of the model: with feedback loop from receiver/destination

In the Shannon-Weaver model, information is a function of a cybernetic system that includes feedback for adjusting the system to achieve its goals (successful signal transmission).

Note one-way (unidirectional) path model:

following electronic circuits from point to point.

“Feedback” is modeled as non-semantic and regulative, a meta-signal.

Where is the larger pragmatic context [situated uses and modes of expression]:

(1) unlimited prior “messages” (expressions and representations already said, already understood, available in social memory)

(2) projection to unlimited new or additional “messages” that result from interpreting, discussing, debating, refuting, ironizing, or satirizing a received message?

15 of 39

Floridi’s revision of the standard model

Floridi’s Model

of Information Theory:

Trying to get semantics back in, but with an incomplete conceptual model!

“Semantic content” is not something built-in to information, and it is not a category in a tree structure of branching binary classes.

So, where “is” the meaning that we interpret when we use “information”?

16 of 39

Sign system selection ⇒

Types ⇒ Tokens ⇒ Signal mappings

Signal ⇒ Tokens ⇒ Type mappings ⇒ Sign system recognition

Social-semiotic “envelope” assumed in

any design for an information system:

The uses of sign and symbol systems capable of electrical encoding and transmission.

Information Theory (= Signals Structuring) is designed as a semiotic subsystem for re-tokening (re-instantiating) the physical substrates of symbol systems in electrical patterns.

Interpreters in dialogic meaning-making communities, who use intentional, purposive sign/symbol representations in both the “sender” and “receiver” positions. Communication is goal-directed (telic) even in Shannon’s model.

Networks of other expressions, before and after an individual transmitted message

17 of 39

Legacy of container, conduit/channel model

Ferdinand de Saussure's linguistic communication picture: moving contents from one head to another through the channel of spoken language.

Illustration in the English translation of De Saussure’s Course in General Linguistics textbook (1916/1959).

(Of course, we know that language cannot work this way!)

18 of 39

“Electrical circuit phenomena...are characterized by an invariance

with respect to a shift of origin in time.” [That is, when engineered.]

(Norbert Wiener, Cybernetics, ix).

Information Foundation: Designing (Engineering) Invariance in a System for Replicating Discrete Patterns (Signals)

Across Locations in Space and Time,

or,

Designing Electronics

Important engineering and mathematical concept:

Invariance = something that does not vary in value (does not change values, an equivalence), and interpreted at any measuring/detection point in a system as being functionally identical. The structured unit designed to hold a value is termed an invariant.

Wiener assumes that for electrical circuit phenomena to be useful as information/communication media, circuits and their energy patterns must be designed (structured) to fulfill this function: electricity does not do this by nature.

19 of 39

OK, if the technical problem is successful replication of a signal from a source to a destination, how can we design electricity and its physical media to do this (= the engineering problem)?

The engineering design problem is all about structuring patterns in the physical energy substrates, designing a controlled system for structure-preserving structures (the structures can be replicated and corrected while passing through different parts of the system).

Imposing logical structure creates regular, reliable physical patterns which can be used as subsystems for higher-level meaningful patterns (data, representations of symbol systems).

20 of 39

Electrical communications must use the properties of electricity:

varying voltages (quantities in waves) transmitted through a medium or channel (wires, radio waves) only become signals when defined as structures in finite quantities of time.

“Information” as data transmission, then, is fundamentally about designing physical unit quantities as functions of Time.

How can electrical voltage units, physical media, and sending and receiving processes be designed to receive reliably what was transmitted (when all physical systems have “noise” and imperfect variations in current values)?

What are the minimal decomposable units of energy (signal units) that can be transmitted in a medium over Time = Tn?

Redescribing Shannon: what is the optimal design for transmitting a signal representing minimal decomposable units of data from one “state space” (one state of energy in one place) to another “state space” through time in a physical channel, in order to reproduce the transmitted signal-state without loss (a highest-probability replica of the energy state transmitted)?
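The simplest design answer to noise is redundancy: add structure to the signal so corruption becomes detectable. A minimal sketch using a single parity bit (the helper names `add_parity` and `check_parity` are illustrative; real channel codes are far more elaborate):

```python
def add_parity(bits):
    # Append one redundant bit so the total number of 1s is even;
    # the receiver can then detect any single bit flip.
    return bits + [sum(bits) % 2]

def check_parity(bits):
    # True if the received word is consistent (even number of 1s).
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(check_parity(word))   # True: transmitted intact
noisy = word.copy()
noisy[2] ^= 1               # channel noise flips one bit
print(check_parity(noisy))  # False: receiver detects the corruption
```

This is the core of "structure-preserving structure": the designed pattern, not the raw electricity, is what makes loss detectable and correctable.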

Designing Electricity and Physical Media:

Telecommunications & Computing Architectures

21 of 39

22 of 39

Digital Signals = “Quantized” Units = Discrete Measurements

of Time-Energy Encodable as Binary Representations (Bits, Bytes)

Example: Digital Sound Processing (DSP)

The Pulse Code Modulation (PCM) Standard

23 of 39

Example of digital sound units used in everyday applications:

Software mediation through shifts in material substrates:

Digital voice (or any human audio signal) in a system of ADC (analogue-to-digital conversion) and DAC (digital-to-analogue conversion): interpretable signals pass through different states of energy (in transducers) that must translate back and forth to human-perceptible sounds in a sign system (e.g., spoken language, music).
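The ADC side of PCM can be sketched as sampling plus quantization: a continuous waveform is measured at fixed time intervals and each measurement is stored as a small integer. The parameters below (8 kHz sampling, 8 bits per sample, a 440 Hz test tone, the function name `adc`) are illustrative choices for the sketch, not a full PCM implementation:

```python
import math

SAMPLE_RATE = 8000   # samples per second (telephone-grade PCM)
BITS = 8             # bits per sample

def adc(t):
    # Sample a 440 Hz tone at time t and quantize the amplitude
    # to an 8-bit integer: continuous voltage -> discrete measurement.
    amplitude = math.sin(2 * math.pi * 440 * t)
    return round((amplitude + 1) / 2 * (2**BITS - 1))

samples = [adc(n / SAMPLE_RATE) for n in range(8)]
print(samples)  # eight 8-bit integers, each storable as one byte
```

Each sample is now a "quantized" unit of time-energy, encodable as bits, exactly as the slide title describes; a DAC reverses the mapping to reconstruct an audible waveform.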

24 of 39

If you took an intro computer science course, or Digital Engineering 101, you would learn the engineering diagram symbols for “Logic Gates” (switches) and “truth tables” for Boolean binary logic. This is all standardized in the circuits of computing processors.

Note: these diagrams and truth tables are human abstract symbolic systems, imposed on circuits, not properties of electricity!

Today we have core processors (CPUs) with millions of these switches that convert bit inputs to outputs over synchronized states of time (clock cycles).
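The gates and their truth tables can be expressed directly as Boolean functions on bits, and composed into arithmetic, which is the point of "computations in Boolean logic." A minimal sketch (the half-adder is the textbook example; the names here are illustrative):

```python
# The standard Boolean gates as functions on bits. Their truth tables
# are human symbolic conventions imposed on circuits, as the slide notes.
AND  = lambda a, b: a & b
OR   = lambda a, b: a | b
XOR  = lambda a, b: a ^ b
NAND = lambda a, b: 1 - (a & b)

# Print the combined truth table for all four gates:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b), NAND(a, b))

def half_adder(a, b):
    # One bit of binary addition built from two gates:
    return XOR(a, b), AND(a, b)  # (sum, carry)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining such adders (plus registers and a clock) is, in outline, how a CPU turns bit inputs into outputs.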

“Digital Processing” =

Interpretations via Computations in Boolean Logic

Digital Logic Design

25 of 39

A computer program, when “running” in RAM memory, continues an ongoing dialog with an operating system, memory locations, and processors (with all the millions of logic gates), and goes through billions of “state changes” per second, as directed by the binary clock (= tick/tock, go/stay, do/hold).

A program activates chains of different kinds of logic gates in the processors to perform operations, calculations, and transformations on our data “inputs”, and “reads” and “writes” binary data from RAM.

But it’s the digital clock that makes the whole system perform synchronized “state transitions” to give us “computations” and create the representations that we see.
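The clock-driven "state transitions" above can be sketched as a register that changes state only on a clock tick, a software analogy (not hardware description) for a synchronous circuit:

```python
# A sketch of clock-driven state transitions: each clock pulse advances
# a register through synchronized states, as in a CPU clock cycle.
class Counter:
    def __init__(self):
        self.state = 0  # the register's stored value between ticks

    def tick(self):
        # One rising clock edge -> exactly one state transition
        # (here, a 2-bit counter wrapping modulo 4).
        self.state = (self.state + 1) % 4

c = Counter()
for _ in range(5):
    c.tick()
print(c.state)  # 1: five ticks, wrapped modulo 4
```

Between ticks nothing changes; all "computation" is the disciplined sequence of such transitions, which is why the slide says we have to design Time itself.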

0 1 0 1 0 1 0 1

Clock pulse signals go through a CPU for logic switch sequences, and to RAM memory cells for timing when to switch states representing bits.

We Have to “Design” Time to Do “Computations as Symbol Processing” ?!

26 of 39

Clock unit

ALU

Millions of Binary Logic Gates in the circuits of the “Arithmetic Logic Unit” (ALU)

Programming “Code”

(“Symbols that Do”),

with bytes of Encoded Data “fetched” from memory units

(“Symbols that Mean”)

Follow the arrows!

Black arrows: process flow

Red arrows: control sequencing (“clock cycles”)

“Outputs” processed in software for perceptible

representations (symbolic forms),

or Actions

The Logical Design of a “Core Processor” (CPU)

27 of 39

Video Documentaries on Claude Shannon's Work and Influence

(We prove Shannon’s theory and mathematical models every day in everything we do in computing, digital media, and the Internet.)

Claude Shannon, Founder of Digital Revolution (AT&T Labs)

Claude Shannon, The Father of the Information Age (UC video)

  • Claude Shannon at Bell Labs (Bell Labs Documentaries)

28 of 39

Information Theory for Data and Electronics is Fundamentally Necessary.

But for Understanding the Meaning Systems that Motivate Signals, we Need the Whole Picture as Studied in Other Disciplines.

29 of 39

Filling in the Model for the Meaning Systems

that Motivate Signals and Physical Structures:

“Information” as Semiotic Subsystem

30 of 39

Explaining "Information" in the Shannon Data Transmission Model

“I think perhaps the word ‘information’ is causing more trouble . . . than it is worth, except that it is difficult to find another word that is anywhere near right. It should be kept solidly in mind that [information] is only a measure of the difficulty in transmitting the sequences produced by some information source.”

Warren Weaver, “The Mathematics of Communication,” Scientific American, July 1949, p. 12.

What's not accounted for in the data/signal transmission model and why:

Meanings, intentions, contexts, situations, agency:

The meaning environment and context that motivates messages and all symbolic expressions are not properties of signals (= material/physical sign components).

Meanings cannot be represented because they are not present as material properties in signals and physical media. This is precisely what it means to be using symbols: meaning is supplied by the symbol users in specific situations of use.

Since Shannon was modeling the pre-semantic transmission of signals in a physical medium (electric current), the whole environment of communication and motivation of meaning is bracketed off as a separate layer not involved in this process.

Why? Because meanings and intentions are held and understood by signal transmitters and receivers!

So, where are the meanings? (And how do we better define this question?)

31 of 39

We can’t get “meanings” back into the information (data signal) model because meanings are not “in” anything, most of all not as properties or features of physical/material structures.

Semiotics 101:

Meanings, values, intentions, or purposes are not “in” perceptible expressions or representations, and not “in” individual minds to be conveyed as “content” to other minds.

Meanings are enacted or performed by meaning-making agents, humans with symbolic cognition in communities with shared sign and symbol systems.

Correlations to meanings must be performed by correlators who enact a triadic process of mapping perceptible structures (representations) to their meaning systems by means of interpreting-correlations developed dynamically over time.

What we recognize, interpret, and understand as meaning is not in the symbolic-physical system; it is the system. Meanings are what signs and symbols as shared representations of symbolic cognition allow us to do.

It makes no sense to talk about “meanings” as something “outside” what we express and represent in sign systems. “Meaning” is not missing in information theory. Information theory presupposes and depends on motivated signals, or there would not be any information to theorize!

32 of 39

Signals, information, and data are best understood as (1) designs for structure-preserving structures in the physical components of electronic systems, (2) for the purpose of producing unlimited tokens (physical instances) of interpretable sign and symbol patterns in outputs of the designed system.

Working with Peirce’s model of sign and symbol processes:

All symbolic structures have a necessary material substrate for providing perceptible features and interpretable patterns.

“Meanings” are not properties of sign-vehicles, but begin with three primary relations.

Meanings are not “contents,” and cannot be explained as “contents” of the signal-components of signs and symbols.

33 of 39

Information theory defines and specifies how we can create patterns in electrical units (digital and analogue) at these levels.

Another view of the whole process:

34 of 39

Consequences of conceptual metaphors in a theory:

transmission, channels, conduits, boxes

Boxed in with boxes and pipes.

Step One: Get out of the boxes!

Communication, information, and media theory has been dominated by conceptual spatial metaphors of containers/contents (boxes) and pipes (conduit and transit connectors).

Communication modeled on a problem of transferring signals, signs, data, or information in space, independent of the agency of senders and receivers, has no way of accounting for the production of meaning (semiosis) in contexts and situations.

Understanding, interpreting, and expressing meanings is not a process of moving contents from one boxed location to another.

The signal transmission model omits everything involved in meaning cognition and the multiple levels of meanings used in a community of semiotic agents.

35 of 39

In the world of information science, meaning is often tied to the notion of representation. A principle that underlies the whole concept of computation is that one state can be represented by another state. The states need not be in the same system. To give two examples: the words you type on a keyboard can be represented by voltages and current flows inside an electronic computer; music that is performed by a human artist can be represented by a pattern of silver and black dots on a DVD. When a particular representation affects you, it has meaning to you. (41)

A remarkable feature of algorithms is that their logical structure transcends the physical system they are encoded in. This means that the outcome of an algorithm is independent of the device used to perform it… This is because an algorithm can be represented in different physical media, even in different languages. The transcendence comes about because symbol sequences can be copied, rearranged, and translated into other symbol patterns without losing the meaning or logical structure that characterizes the message. (44)

Information, Computation, Meaning, and Symbolic States

From: John E. Mayfield, The Engine of Complexity: Evolution as Computation.

New York: Columbia University Press, 2013.

36 of 39

How we can encode “symbols that mean and symbols that do”

as electronic, binary units:

Digital systems use signals that have two distinct values and circuit elements that have two stable states. There is a direct analogy among binary signals, binary circuit elements, and binary digits. A binary number of n digits, for example, may be represented by n binary circuit elements, each having an output signal equivalent to 0 or 1. Digital systems represent and manipulate not only binary numbers, but also many other discrete elements of information. Any discrete element of information that is distinct among a group of quantities can be represented with a binary code (i.e., a pattern of 0’s and 1’s). The codes must be in binary because, in today’s technology, only circuits that represent and manipulate patterns of 0’s and 1’s can be manufactured economically for use in computers. However, it must be realized that binary codes merely change the symbols, not the meaning of the elements of information that they represent. If we inspect the bits of a computer at random, we will find that most of the time they represent some type of coded information rather than binary numbers.

Digital information is always physical and material, and thus requires physical space and electrical energy:

The binary information in a digital computer must have a physical existence in some medium for storing individual bits. A binary cell is a device that possesses two stable states and is capable of storing one bit (0 or 1) of information. The input to the cell receives excitation signals that set it to one of the two states. The output of the cell is a physical quantity that distinguishes between the two states. The information stored in a cell is 1 when the cell is in one stable state and 0 when the cell is in the other stable state.

(From: M. Morris Mano and Michael D. Ciletti, Digital Design: With an Introduction to the Verilog HDL, 5th edition (Upper Saddle River, NJ: Prentice Hall, 2012).)
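Mano and Ciletti's point that "any discrete element of information ... can be represented with a binary code" can be sketched directly: n distinct elements need a code of ceil(log2 n) bits, and the code changes only the symbols, not what they represent. The function name `make_code` and the example symbol set are illustrative:

```python
import math

def make_code(symbols):
    # Assign each distinct element a fixed-length binary code word
    # of ceil(log2(n)) bits (minimum 1 bit for a single element).
    width = max(1, math.ceil(math.log2(len(symbols))))
    return {s: format(i, f"0{width}b") for i, s in enumerate(symbols)}

code = make_code(["red", "green", "blue"])
print(code)  # {'red': '00', 'green': '01', 'blue': '10'}
```

Inspecting such bits "at random," as the quotation says, one finds coded information of this kind far more often than binary numbers proper.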

37 of 39

Information Theory + Semiotics = The Fuller Picture of Communication Systems

Semiotic Bridge-Building in Computer Science:

George Dyson, Turing’s Cathedral (2012):

The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same. (Preface, ix)

A digital universe—whether 5 kilobytes or the entire Internet—consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information— structure and sequence— according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next. (3)

Peter Denning (describing the view in Paolo Rocchi, Logic of Analog and Digital Machines):

Information consists of (1) a sign, which is a physical manifestation or inscription, (2) a relationship, which is the association between the sign and what it stands for, and (3) an observer, who learns the relationship from a community and holds on to it. The observer as a member of a social community gives uniformity of interpretation and continuity over time.

38 of 39

Information Design as Semiotic System at Work:

The Internet

39 of 39

Information-Transmission Model at Work: Internet Protocol Suite: TCP/IP