1 of 41

Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks

Francesca Mastrogiuseppe and Srdjan Ostojic

Presented by Jiating Zhu

MA 861

2021.12.09

2 of 41

Recurrent network models

Random:
  • fully random recurrent connectivity
  • displays internally generated irregular activity that closely resembles spontaneous cortical patterns recorded in vivo

Structured:
  • highly structured connectivity
  • every neuron belongs to a distinct cluster and is selective to only one feature of the task

Actual cortical connectivity appears to be neither fully random nor fully structured

3 of 41

The results of previous studies suggest that

a minimal, low-rank structure added on top of random recurrent connectivity

may provide a general and unifying framework for implementing computations in recurrent networks.

4 of 41

A unified conceptual picture of how connectivity determines dynamics and computations is missing

5 of 41

The connectivity is a sum of a random part and a minimal, low-dimensional structure.

In such networks, the dynamics are low dimensional and can be directly inferred from the connectivity using a geometrical approach.

6 of 41

Recurrent network models

In “Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data”:

7 of 41

Random connectivity + structured connectivity

J = g χ + P

J: connectivity matrix
P: structured, controlled matrix
g χ: uncontrolled, random matrix

8 of 41

Random connectivity matrix

χ is a Gaussian all-to-all random matrix, where every element is drawn from a centered normal distribution with variance 1/N.

The random strength g scales the strength of the random connections in the network: bigger g means a stronger random component (and, eventually, chaotic dynamics).
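A minimal numpy sketch of this random part (N and g are illustrative values):

import numpy as np

N, g = 1000, 0.8                 # illustrative network size and random strength
rng = np.random.default_rng(0)

# Every element drawn from a centered normal distribution with variance 1/N
chi = rng.normal(loc=0.0, scale=1.0 / np.sqrt(N), size=(N, N))
J_random = g * chi               # g scales the strength of the random connections

# By the circular law, the eigenvalues of g*chi fill a disk of radius ~ g
print(np.abs(np.linalg.eigvals(J_random)).max())   # ≈ g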

9 of 41

Structured connectivity

Rank one

10 of 41

Simple connectivity structure (rank one)

P_ij = m_i n_j / N: every row vector r_i = (m_i / N) n is proportional to the same vector n, so P has rank one.
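A one-line construction of this rank-one structure (m and n drawn at random here purely for illustration):

import numpy as np

N = 1000
rng = np.random.default_rng(0)
m = rng.normal(0.0, 1.0, N)           # right-connectivity vector
n = rng.normal(0.0, 1.0, N)           # left-connectivity vector
P = np.outer(m, n) / N                # P_ij = m_i n_j / N
# Every row r_i = (m_i / N) n is proportional to n, so the rank is one
print(np.linalg.matrix_rank(P))       # -> 1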

11 of 41

Overlap between connectivity vectors

The connectivity vectors m and n can overlap along a shared direction u.

[Figure: m, n, and their overlap along the direction of u]

12 of 41

Determinant Expansion by Minors

13 of 41

The strength of the connectivity structure

The only non-zero eigenvalue of P is given by the scalar product λ = m·n / N, which quantifies the strength of the connectivity structure.
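A quick numerical check of this claim (same illustrative P as above):

import numpy as np

N = 1000
rng = np.random.default_rng(0)
m, n = rng.normal(0.0, 1.0, N), rng.normal(0.0, 1.0, N)
P = np.outer(m, n) / N

lam = np.dot(m, n) / N                       # predicted eigenvalue: m·n / N
eigs = np.linalg.eigvals(P)
lam_num = eigs[np.argmax(np.abs(eigs))].real # the single non-zero eigenvalue
print(lam_num, lam)                          # the two values match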

14 of 41

Random network with unit-rank structure

J_ij = g χ_ij + m_i n_j / N (a rank-one structure on top of the random part)

15 of 41

Spontaneous vs External inputs

  • spontaneous activity (no external input)
  • responses to external inputs

16 of 41

Mean-field analysis: Assumptions

  • The DMF theory relies on the hypothesis that a disordered component in the coupling structure, here represented by χ, efficiently decorrelates single-neuron activity when the network is sufficiently large.
  • The low-rank term and the random term are statistically uncorrelated.
  • The structured connectivity is weak in the large-N limit, i.e., it scales as 1/N, while the random connectivity components scale as 1/√N.

17 of 41

Mean-field analysis: Spontaneous


18 of 41

Mean-field analysis: Effective noise

Under the hypothesis that, in large networks, neural activity decorrelates (more specifically, that each unit's activity is independent of its outgoing weights), the recurrent input to each neuron can be treated as an effective Gaussian noise with self-consistently determined statistics.

19 of 41

Mean-field analysis: Activity structure strength

κ = n·[φ] / N quantifies the overlap between the mean population activity vector [φ] and the left-connectivity vector n.

20 of 41

Mean-field analysis: Noise correlation function

For i ≠ j, activity decorrelates: the sum of N cross terms vanishes in the large-N limit because of the 1/N² scaling.

21 of 41

Mean-field analysis: Stationary solutions

Mean and variance: μ_i = κ m_i, and all neurons have the same variance Δ0, determined self-consistently by the random connectivity.
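A minimal sketch of solving these stationary equations by fixed-point iteration. The choice φ = tanh, the overlap between m and n, and the damping scheme are illustrative assumptions, not the paper's exact parametrization:

import numpy as np

rng = np.random.default_rng(0)
N, g = 2000, 0.8                        # illustrative size and random strength
m = rng.normal(0.0, 1.0, N)
n = 2.0 * m + rng.normal(0.0, 1.0, N)   # assumed overlap m·n/N ≈ 2

# Gauss-Hermite nodes/weights for the Gaussian integral ∫Dz
z, w = np.polynomial.hermite_e.hermegauss(101)
w = w / w.sum()

kappa, delta0 = 1.0, 1.0
for _ in range(200):                    # damped fixed-point iteration
    x = kappa * m[:, None] + np.sqrt(delta0) * z[None, :]   # x_i = μ_i + √Δ0 z
    phi = np.tanh(x)
    kappa_new = np.mean(n * (phi @ w))           # κ = n·[φ] / N
    delta_new = g**2 * np.mean((phi**2) @ w)     # Δ0 = g²⟨φ²⟩, same for all neurons
    kappa = 0.7 * kappa + 0.3 * kappa_new
    delta0 = 0.7 * delta0 + 0.3 * delta_new

print("κ =", kappa, " Δ0 =", delta0)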

22 of 41

Random vs. structured connectivity

In Assignment 2

23 of 41

One-dimensional spontaneous activity

μ = κ m

[Figure: single-trial equilibrium activity x_trial plotted against m]

The activity of the network at equilibrium is organized in one dimension, along the vector m.
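A simulation sketch checking this (parameters and the overlap between m and n are illustrative choices):

import numpy as np

rng = np.random.default_rng(1)
N, g, dt, T = 1500, 0.8, 0.1, 200.0
m = rng.normal(0.0, 1.0, N)
n = 2.0 * m + rng.normal(0.0, 1.0, N)              # assumed overlap with m
J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)) + np.outer(m, n) / N

x = rng.normal(0.0, 1.0, N)                        # random initial condition
for _ in range(int(T / dt)):                       # Euler: ẋ = -x + J φ(x)
    x += dt * (-x + J @ np.tanh(x))

kappa = np.dot(n, np.tanh(x)) / N                  # κ = n·[φ] / N
corr = np.dot(m, x) / (np.linalg.norm(m) * np.linalg.norm(x))
print("κ =", kappa, " correlation(x, m) =", corr)  # equilibrium x ≈ μ = κ m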

24 of 41

Variance against Random strength

25 of 41

Two equilibrium states

When g is small, there are two equilibrium states.

[Figure: equilibrium states against the random strength g]

26 of 41

Equilibrium states against random strength

When g is bigger, |μ_i| becomes smaller.

[Figure: [φ], κ, and |μ_i| as functions of the random strength g]

27 of 41

Stationary vs Chaotic

28 of 41

Mean-field analysis: External inputs

[Figure: the vectors u and v; the input I projected on u, x1, x2, and h]

u, x1, x2, and h are orthogonal to each other.

29 of 41

Mean-field analysis: Effective coupling input

30 of 41

Mean-field analysis: Response to external inputs

31 of 41

Response to an external input: Two-dimensional activity

[Figure: the mean activity μ = κ m + I lies in the plane spanned by the input I and the vector m]

32 of 41

Response to an external input: Mutually orthogonal case

I, m, and n mutually orthogonal

33 of 41

Response to an external input: Non-zero overlap with the left-connectivity vector

m and n are mutually orthogonal, but I has a non-zero overlap with n: take an input pattern which overlaps with n along x2, keep the orthogonal component fixed, and vary the component of the input along n.

[Figure: κ as a function of the input's overlap with n]

34 of 41

The external input suppresses one stable state

The input pattern correlates with n but is orthogonal to the structure-overlap direction; m and n have a non-zero overlap.

[Figure: κ, compared with the solutions of the mean-field equations in the case of spontaneous dynamics]

35 of 41

Simple Go-Nogo Discrimination Task

Choosing m = w (the readout vector) and n = I^A (the Go input pattern) provides the simplest unit-rank connectivity that implements the desired computation.

g = 0.1
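A minimal simulation sketch of this network (g = 0.1 as on the slide; vector normalizations and the trial length are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(2)
N, g, dt, T = 1500, 0.1, 0.1, 100.0
I_A = rng.normal(0.0, 1.0, N)          # Go input pattern
I_B = rng.normal(0.0, 1.0, N)          # Nogo pattern, nearly orthogonal to I_A
w = rng.normal(0.0, 1.0, N)            # readout vector
J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)) + np.outer(w, I_A) / N   # m = w, n = I_A

def readout(I):
    x = np.zeros(N)
    for _ in range(int(T / dt)):       # Euler: ẋ = -x + J φ(x) + I
        x += dt * (-x + J @ np.tanh(x) + I)
    return np.dot(w, np.tanh(x)) / N   # output z = w·[φ] / N

print("Go  :", readout(I_A))           # large output
print("Nogo:", readout(I_B))           # output ≈ 0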

36 of 41

Bigger g (my simulation)

g = 2.1

[Figure: Go and Nogo trials]

37 of 41

Noisy Detection Task

The network is given a noisy input c(t) along a fixed, random pattern of inputs I. The task consists of producing an output if the average input c is larger than a threshold θ.

g = 0.8
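A hedged simulation sketch of this task, assuming the same unit-rank recipe as the Go-Nogo network (m = w, n = I); the noise level sigma and trial length are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(4)
N, g, dt, T = 1500, 0.8, 0.1, 100.0
I = rng.normal(0.0, 1.0, N)                     # fixed random input pattern
w = rng.normal(0.0, 1.0, N)                     # readout vector
J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)) + np.outer(w, I) / N

def trial(c_mean, sigma=0.5):
    x = np.zeros(N)
    for _ in range(int(T / dt)):
        c = c_mean + sigma * rng.normal()       # noisy scalar input c(t)
        x += dt * (-x + J @ np.tanh(x) + c * I)
    return np.dot(w, np.tanh(x)) / N            # readout, to compare with θ

print("below threshold:", trial(0.2))           # weak mean input -> small readout
print("above threshold:", trial(1.5))           # strong mean input -> large readout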

38 of 41

Context-Dependent Go-Nogo Discrimination Task

g = 0.8

39 of 41

Low-rank structured connectivity

Rank r << N: the structure is a sum of unit-rank terms,

P_ij = Σ_{k=1..r} m_i^(k) n_j^(k) / N

The different m vectors are linearly independent, and similarly for the n vectors.
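A minimal sketch of building such a rank-r structure (r = 2 and Gaussian vectors are illustrative choices):

import numpy as np

rng = np.random.default_rng(3)
N, r = 1500, 2                           # rank r << N
M = rng.normal(0.0, 1.0, (N, r))         # columns: linearly independent m vectors
Nv = rng.normal(0.0, 1.0, (N, r))        # columns: linearly independent n vectors
P = M @ Nv.T / N                         # sum of r unit-rank terms m^(k) n^(k)ᵀ / N
print(np.linalg.matrix_rank(P))          # -> r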

40 of 41

A Context-Dependent Evidence Integration Task

g = 2.1 # Random strength: selected to be large, so that the network is in a chaotic regime

41 of 41

Thanks!