1 of 8

Transmission Neural Networks:

From Virus Spread Models to Neural Networks

Xuechen Liu

2022.09.16

2 of 8

Background: Virus spread dynamics

The probability of node i being infected at time k+1 satisfies

Defining the negative-log-negative probability (the Shannon information of the non-infection event)

By direct substitution, we obtain the infection estimator
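The equations on this slide did not survive extraction. A plausible reconstruction, assuming the standard discrete-time networked SIS form without recovery (my notation; the slides' exact symbols may differ):

```latex
% Probability of node i being infected at time k+1:
1 - p_i(k+1) = \bigl(1 - p_i(k)\bigr)\prod_{j=1}^{n}\bigl(1 - a_{ij}\,p_j(k)\bigr)

% Negative-log-negative probability (Shannon information of "node i not infected"):
v_i(k) \triangleq -\log\bigl(1 - p_i(k)\bigr)

% Substituting p_j = 1 - e^{-v_j} gives the infection estimator:
v_i(k+1) = v_i(k) + \sum_{j=1}^{n} -\log\Bigl(1 - a_{ij}\bigl(1 - e^{-v_j(k)}\bigr)\Bigr)
```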

3 of 8

Transmission Neural Networks (TransNN)

We would like to parameterize the links with infection probabilities

By defining the input and output states

And (again, by substitution) we find
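The resulting equation is missing from the export; a hedged reconstruction in my own notation, with $w_{ij}$ the per-link infection probabilities and $v$ the log-transformed node states:

```latex
v_i^{\mathrm{out}} = \sum_{j=1}^{n} -\log\Bigl(1 - w_{ij}\bigl(1 - e^{-v_j^{\mathrm{in}}}\bigr)\Bigr)
```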

4 of 8

Transmission Neural Networks (TransNN)

So the initial state can be redefined as

Where we define the TLogSigmoid activation function from the derivation

It can be further parameterized to

The activation function is tied to the “links/transmissions”, not only to the “nodes”. This has biological inspirations (synaptic transmission)
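The TLogSigmoid definition itself is lost from this export. A minimal sketch of what the derivation suggests it looks like, written as $-\log(1 - w(1 - e^{-v}))$; this is my reconstruction, not verified against the original slides:

```python
import numpy as np

def tlog_sigmoid(w, v):
    """TLogSigmoid-style activation (reconstruction, not the paper's exact form).

    Unlike a standard activation, it depends on the link weight w (an
    infection probability in [0, 1)) as well as the node state v >= 0.
    """
    p = 1.0 - np.exp(-v)           # recover infection probability from the log-state
    return -np.log(1.0 - w * p)    # information transmitted across the link
```

Note that the output is zero whenever the link weight is zero or the input state is zero, matching the intuition that no infection is transmitted over a dead link or from an uninfected node.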

5 of 8

Equivalent Representations via Neural Nets

We consider virus spread models with multiple particle transmissions

By defining the I/O states in a similar way

We can describe the dynamics of NN as

And control the weights in a way similar to synaptic networks
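The layer dynamics described above can be sketched as a single forward pass. This follows my reconstruction of the activation (summing link-wise $-\log(1 - w_{ij}(1 - e^{-v_j}))$ terms); the layer sizes and weight values are illustrative, not from the slides:

```python
import numpy as np

def transnn_layer(W, v):
    """One TransNN-style layer (sketch under assumed notation).

    W: (n_out, n_in) matrix of per-link infection probabilities in [0, 1).
    v: (n_in,) vector of input node states (>= 0, log-transformed probabilities).
    Each output state accumulates the information transmitted over its
    incoming links.
    """
    p = 1.0 - np.exp(-v)                      # input infection probabilities
    return -np.log(1.0 - W * p).sum(axis=1)   # sum link-wise activation terms

rng = np.random.default_rng(0)
W = rng.uniform(0.0, 0.5, size=(3, 4))  # illustrative link probabilities
v = rng.uniform(0.0, 2.0, size=4)       # illustrative input states
out = transnn_layer(W, v)               # (3,) vector of output states
```

Stacking such layers gives the deep variant discussed on the next slide; the weights stay interpretable as transmission probabilities, which is the synaptic analogy.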

6 of 8

TransNN as DNN

We then regard the TransNN as a “DNN”

The objectives then become
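The objective shown on the slide is missing; presumably it is a standard supervised loss over the TransNN weights, something like (my sketch, not the slide's exact formula):

```latex
\min_{W}\;\frac{1}{N}\sum_{m=1}^{N}\ell\bigl(\hat{y}^{(m)}(W),\,y^{(m)}\bigr)
```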

TransNN and its variants (multi-path transmission models) satisfy the assumptions of a universal function approximator for the continuous function space C(R)

7 of 8

For me - More like a Communication System?

  • Shannon’s communication system for information transmission shares some common properties with these synaptic-like neural models
    • (or in fact, with synaptic networks themselves)
  • Differences:
    • The noise source is “external knowledge”.
    • The neural transmitters’ “weights” are probabilities, in a “noiseless” environment
      • So should we add noise? I would need neuroscientists to help me here
    • Communication systems may not encode/model the transmissions
    • For continuous functions, the communication system may not be a universal function approximator, due to the limited “capacity” of the information transmitted. Or at least, this claim needs mathematical support

8 of 8

Main Takeaways

  • TransNN is derived from virus spread models and has biological connections to them
  • It is equivalent to a DNN and serves as a universal function approximator
  • It provides an analogy for interpreting neural network models

  • The original authors mention future applications in modular design of TransNN
  • More practical applications beyond FashionMNIST could be explored, e.g. virus spread prediction