1 of 25

The relevance of communication theory for theories of representation

Stephen Francis Mann

stephenfmann@gmail.com

stephenfmann.com


bit.ly/sfm20240326

2 of 25

[Excerpt from Cover & Thomas (2006:2)]

3 of 25

Overview

  1. The consensus view
    • Shannon warned that his theory ignores meaning
    • Information is everywhere; representation is rare
  2. The teleosemantic view
  3. Future directions


5 of 25

[Figure: the message THE CAT SAT, composed from the alphabet A–Z, is to be transmitted using only the binary symbols 0 and 1]

6 of 25

[Figure: a fixed-length code maps each character to a 5-bit signal (A → 00001, B → 00010, C → 00011, D → 00100, E → 00101, …); the message THE CAT SAT becomes the signal 10100 01000 00101 11011 00011 00001 10100 11011 10011 00001 10100 and is decoded back to THE CAT SAT]
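To make the fixed-length scheme concrete, here is a minimal Python sketch (not taken from the talk). The A–E codewords match the table on the slide; extending the same rule, the letter's alphabet position written as five binary digits, reproduces the signal shown, though the codeword for the space (11011) is an inference rather than something stated on the slide.

```python
# A minimal sketch of the fixed-length code on the slide: each character's
# alphabet position, written as five binary digits.
# A -> 00001, B -> 00010, ..., Z -> 11010; the space codeword 11011 is an
# assumption inferred from the signal shown on the slide.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

ENCODE = {ch: format(i + 1, "05b") for i, ch in enumerate(ALPHABET)}
DECODE = {code: ch for ch, code in ENCODE.items()}

def encode(message: str) -> str:
    return "".join(ENCODE[ch] for ch in message)

def decode(signal: str) -> str:
    # Every codeword is exactly 5 bits long, so decoding is just chunking.
    return "".join(DECODE[signal[i:i + 5]] for i in range(0, len(signal), 5))

signal = encode("THE CAT SAT")
print(len(signal))     # 55 bits: 11 characters x 5 bits each
print(decode(signal))  # THE CAT SAT
```

Every character costs exactly 5 bits, so the 11-character message always takes 55 bits, however common or rare its letters are.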

7 of 25

[Figure: a variable-length code (A → 1010, B → 1110100, C → 00011, D → 11000, E → 001, …) encodes THE CAT SAT as the shorter signal 1111 0101 001 0000 00011 1010 1111 0000 0100 1010 1111, which is decoded back to THE CAT SAT]
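A companion sketch for the variable-length code. Only the A–E codewords are given on the slide; the codewords for T, H, S and the space below are reconstructed from the encoded signal and should be treated as assumptions. Because no codeword is a prefix of another, the receiver can decode the bit stream unambiguously without separators.

```python
# A minimal sketch of a variable-length, prefix-free code. A-E follow the
# slide's table; the codewords for T, H, S and the space are reconstructed
# from the encoded signal on the slide and are assumptions.
CODE = {
    "A": "1010", "B": "1110100", "C": "00011", "D": "11000", "E": "001",
    "T": "1111", "H": "0101", "S": "0100", " ": "0000",  # reconstructed
}
INVERSE = {code: ch for ch, code in CODE.items()}

def encode(message: str) -> str:
    return "".join(CODE[ch] for ch in message)

def decode(signal: str) -> str:
    # No codeword is a prefix of another, so codewords can be peeled off
    # the front of the bit stream one at a time without any separators.
    out, buffer = [], ""
    for bit in signal:
        buffer += bit
        if buffer in INVERSE:
            out.append(INVERSE[buffer])
            buffer = ""
    return "".join(out)

signal = encode("THE CAT SAT")
print(len(signal))     # 44 bits, versus 55 for the fixed-length code
print(decode(signal))  # THE CAT SAT
```

The same message now takes 44 bits instead of 55, because the letters that occur often in it get short codewords while rare letters get long ones.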

8 of 25

[Figure: the message/signal diagram with a noisy channel: the transmitted signal 1111 0101 001 0000 00011 1010 1111 0000 0100 1010 1111 arrives with some bits flipped (1111 0101 011 0000 00001 1010 0111 0000 0111 1010 1011); labelled quantities: entropy, surprisal, transmission rate]
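The quantities named on the slide have standard definitions; for a source that emits symbol x with probability p(x) (Cover & Thomas 2006):

```latex
% Surprisal (information content) of a single outcome:
\[ h(x) = -\log_2 p(x) \]

% Entropy: the expected surprisal of the source:
\[ H(X) = -\sum_{x} p(x)\,\log_2 p(x) \]

% Source coding: no uniquely decodable code can achieve an expected codeword
% length below H(X) bits per symbol, so variable-length codes like the one
% above can beat the 5-bit fixed-length code, but only down to the entropy
% of the source. Over a noisy channel, the achievable transmission rate with
% arbitrarily low error is bounded by the channel capacity.
```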

9 of 25

[Figure: the message/signal diagram, annotated (*) with Shannon (1948) and Shannon & Weaver (1949)]

10 of 25

[Figure: the message/signal diagram, annotated (*) with Shannon (1948), Shannon & Weaver (1949), and Bar-Hillel & Carnap (1953)]

11 of 25

[Figure: the message/signal diagram, annotated (*) with Shannon (1948), Shannon & Weaver (1949), Bar-Hillel & Carnap (1953), and the philosophical literature: Dretske (1981), Dennett (1983), Neander (2017), etc.]

12 of 25

Overview

  • The consensus view
    • Shannon warned that his theory ignores meaning
    • Information is everywhere; representation is rare
  • The teleosemantic view
  • Future directions

13 of 25

[Excerpt from Cover & Thomas (2006:2)]

14 of 25

P1: Mutual information cannot tell us about semantic content

P2: ?

C: Communication theory cannot tell us about semantic content

[Diagram: "Shannon information" is sometimes identified with mutual information and sometimes with all the tools and concepts of communication theory]
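For reference, the standard definition of the mutual information between two variables, for example messages X and signals Y (Cover & Thomas 2006):

```latex
\[
  I(X;Y) \;=\; \sum_{x,\,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)}
         \;=\; H(X) - H(X \mid Y)
\]
% I(X;Y) quantifies how much knowing one variable reduces uncertainty about
% the other; it is a property of the joint distribution over types, not of
% any individual signal token.
```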

15 of 25

Overview

  • The consensus view
    • Shannon warned that his theory ignores meaning
    • Information is everywhere; representation is rare
  • The teleosemantic view
  • Future directions

16 of 25

[Figure: the literatures drawn on in this section: Millikan (1984, 2004), Lewis (1969), Skyrms (2010), Shannon (1948), Shannon (1959), Artiga (2016), Martínez (2019)]
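As a pointer to what the sender-receiver literature cited here (Lewis 1969; Skyrms 2010) is about, here is a minimal, illustrative sketch of a two-state signaling game with reinforcement learning; the payoffs and learning rule are arbitrary choices, not taken from any of the cited works.

```python
import random

# Toy two-state signaling game in the style of Lewis (1969) / Skyrms (2010).
# Sender and receiver learn by simple reinforcement; all parameter choices
# here are arbitrary and purely illustrative.
STATES = SIGNALS = ACTS = [0, 1]

# Urn weights: sender[state][signal], receiver[signal][act]
sender = [[1.0, 1.0] for _ in STATES]
receiver = [[1.0, 1.0] for _ in SIGNALS]

def draw(weights):
    return random.choices(range(len(weights)), weights=weights)[0]

for _ in range(10_000):
    state = random.choice(STATES)       # nature picks a state
    signal = draw(sender[state])        # sender picks a signal given the state
    act = draw(receiver[signal])        # receiver picks an act given the signal
    if act == state:                    # success: the act matches the state
        sender[state][signal] += 1.0    # reinforce the choices that paid off
        receiver[signal][act] += 1.0

print("sender urns:  ", sender)
print("receiver urns:", receiver)
```

After enough rounds the two agents typically settle on a signaling system in which each signal reliably corresponds to one state, which is the kind of sender-receiver correspondence the teleosemantic discussion builds on.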

17 of 25

Overview

  • The consensus view
    • Shannon warned that his theory ignores meaning
    • Information is everywhere; representation is rare
  • The teleosemantic view
  • Future directions

18 of 25

Recap

  • Communication theorists attribute representational content to signals.
  • Communication theory has more tools than measures of correlation.
  • Teleosemantics justifies representational talk because sender-receiver models are a special case of the teleosemantic model.

Future directions

  1. Use interventionism to show that increased success really does require appealing to the correspondence between signals and signifieds.
  2. Precisify the relationship between mutual information (correspondence between types) and semantic content (correspondence between tokens).
  3. Use more complicated models to capture more complex representations.

19 of 25

References I

Adriaans, P. (2019). Information. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2019). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/spr2019/entries/information/

Artiga, M. (2016). Teleosemantic modeling of cognitive representations. Biology & Philosophy, 31(4), 483–505. https://doi.org/10.1007/s10539-016-9525-3

Baker, B. (2021). Natural information, factivity and nomicity. Biology & Philosophy, 36(2), 26. https://doi.org/10.1007/s10539-021-09784-4

Bar-Hillel, Y., & Carnap, R. (1953). Semantic Information. The British Journal for the Philosophy of Science, 4(14), 147–157. http://www.jstor.org/stable/685989

Cover, T. M., & Thomas, J. A. (2006). Elements of Information Theory (2nd ed.). John Wiley & Sons.

Dennett, D. C. (1983). Intentional systems in cognitive ethology: The ‘Panglossian paradigm’ defended. Behavioral and Brain Sciences, 6(3), 343–355. https://doi.org/10.1017/S0140525X00016393

Dennett, D. C. (2017). From Bacteria to Bach and Back: The Evolution of Minds. Penguin UK.

Dretske, F. I. (1981). Knowledge and the Flow of Information. MIT Press.

Floridi, L. (2019). Semantic Conceptions of Information. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Winter 2019). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/win2019/entries/information-semantic/

Godfrey-Smith, P., & Sterelny, K. (2016). Biological Information. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Summer 2016). http://plato.stanford.edu/archives/sum2016/entries/information-biological/

Isaac, A. M. C. (2018). The Semantics Latent in Shannon Information. The British Journal for the Philosophy of Science. https://doi.org/10.1093/bjps/axx029


20 of 25

References II

Kirchhoff, M. D., & van Es, T. (2021). A universal ethology challenge to the free energy principle: Species of inference and good regulators. Biology & Philosophy, 36(2), 8. https://doi.org/10.1007/s10539-021-09780-8

Lean, O. M. (2014). Getting the most out of Shannon information. Biology & Philosophy, 29(3), 395–413. https://doi.org/10.1007/s10539-013-9410-2

Lewis, D. (1969). Convention: A Philosophical Study. Blackwell.

Lombardi, O., Holik, F., & Vanni, L. (2015). What is Shannon information? Synthese, 193(7), 1983–2012. https://doi.org/10.1007/s11229-015-0824-z

MacKay, D. J. (2003). Information Theory, Inference and Learning Algorithms. Cambridge University Press. http://www.inference.org.uk/mackay/itila/book.html

Martínez, M. (2019). Deception as cooperation. Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences, 77, 101184. https://doi.org/10.1016/j.shpsc.2019.101184

Millikan, R. G. (1984). Language, Thought, and Other Biological Categories. MIT Press.

Millikan, R. G. (2004). Varieties of Meaning. MIT Press.

Neander, K. (2017). A Mark of the Mental: In Defense of Informational Teleosemantics. MIT Press.

Owren, M. J., Rendall, D., & Ryan, M. J. (2010). Redefining animal signaling: Influence versus information in communication. Biology & Philosophy, 25(5), 755–780. https://doi.org/10.1007/s10539-010-9224-4

Piccinini, G., & Scarantino, A. (2011). Information processing, computation, and cognition. Journal of Biological Physics, 37(1), 1–38. https://doi.org/10.1007/s10867-010-9195-3

Rathkopf, C. (2017). Neural Information and the Problem of Objectivity. Biology & Philosophy, 32(3), 321–336. https://doi.org/10.1007/s10539-017-9561-7


21 of 25

References III

Shannon, C. E. (1948). A Mathematical Theory of Communication (Part 1). Bell System Technical Journal, 27(3), 379–423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x

Shannon, C. E. (1959). Coding theorems for a discrete source with a fidelity criterion. In N. J. A. Sloane & A. D. Wyner (Eds.), Collected Papers (pp. 325–350). Wiley-IEEE Press.

Shannon, C. E., & Weaver, W. (1949). The Mathematical Theory of Communication. University of Illinois Press.

Shea, N. (2018). Representation in Cognitive Science. Oxford University Press.

Skyrms, B. (2010). Signals: Evolution, Learning, and Information. Oxford University Press.

Sprevak, M. (2020). Two kinds of information processing in cognition. Review of Philosophy and Psychology, 11, 591–611. https://doi.org/10.1007/s13164-019-00438-9

Timpson, C. G. (2006). The Grammar of Teleportation. The British Journal for the Philosophy of Science, 57(3), 587–621. https://doi.org/10.1093/bjps/axl016


22 of 25

[Excerpt from Shannon (1948:379)]


24 of 25

[Figure: the communication diagram, annotated "Shannon originally measured mutual information here" and "Philosophers usually measure it here"]

25 of 25

Definition                                     Citation
Surprisal                                      Adriaans 2019:54; MacKay 2003:32
Entropy                                        Adriaans 2019:5 (implied); Lean 2014:396; Timpson 2006:614; Baker 2021:3 (implied)
Relative entropy                               Kirchhoff & van Es 2021:21
Mutual information                             Dennett 2017:§6; Shea 2018:78 n.5; Owren et al 2010:759 passim; Isaac 2018:3
What is measured by one or more measures       Sprevak 2020:593; Piccinini & Scarantino 2011:19
Concept/linguistic sense of “information”      Godfrey-Smith & Sterelny 2016; Floridi 2019; Rathkopf 2017:328; Lombardi et al. 2015

“Shannon information” is vaguely defined