Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks
Francesca Mastrogiuseppe and Srdjan Ostojic
Presented by Jiating Zhu
MA 861
2021.12.09
Recurrent network models
random
structure
Actual cortical connectivity appears to be neither fully random nor fully structured
The results of previous studies suggest that
a minimal, low-rank structure added on top of random recurrent connectivity
may provide a general and unifying framework for implementing computations in recurrent networks.
A unified conceptual picture of how connectivity determines dynamics and computations is missing
In such networks, the dynamics are low dimensional and can be directly inferred from connectivity using a geometrical approach.
The connectivity is a sum of a random part and a minimal, low-dimensional structure.
Recurrent network models
In "Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data":
Random connectivity + Structured connectivity
connectivity matrix
structured, controlled matrix
uncontrolled, random matrix
Random connectivity matrix
is a Gaussian all-to-all random matrix, where every element is drawn from a centered
normal distribution with variance 1/N.
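A minimal sketch of sampling this matrix (assuming NumPy; the network size N and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000   # network size

# Every element is drawn from a centered normal distribution with variance 1/N
chi = rng.normal(loc=0.0, scale=1.0 / np.sqrt(N), size=(N, N))

print(chi.shape, chi.var())
```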
The random strength g scales the strength of the random connections in the network; the bigger g is, the stronger the random component
Structured connectivity
Rank one
Simple connectivity structure (rank one)
r_i is the ith row vector
Overlap between connectivity vectors
The connectivity vectors m and n overlap along the direction of u
Determinant Expansion by Minors
The strength of the connectivity structure
The only non-zero eigenvalue of P is given by the scalar product λ = nᵀm / N
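A quick numerical check of this eigenvalue claim (a sketch assuming NumPy; N, the seed, and the vectors are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
m = rng.normal(size=N)
n = rng.normal(size=N)

# Unit-rank structure: P = m n^T / N
P = np.outer(m, n) / N

# Its only non-zero eigenvalue is the scalar product n.m / N,
# with eigenvector m:  P m = (n.m / N) m
lam = np.dot(n, m) / N
eigvals = np.linalg.eigvals(P)
largest = eigvals[np.argmax(np.abs(eigvals))]
print(largest, lam)
```

All other eigenvalues are numerically zero, since P has rank one.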
Random network with unit-rank structure
Rank-one
Spontaneous vs External inputs
Spontaneous
External inputs
Mean-field analysis: Assumptions
Mean-field analysis: Spontaneous
Spontaneous
Mean-field analysis: Effective noise
Under the hypothesis that in large networks neural activity decorrelates (more specifically, that activity is independent of its outgoing weights), the recurrent input to each neuron can be treated as an effective Gaussian noise
Mean-field analysis: Activity structure strength
The activity structure strength κ quantifies the overlap between the mean population activity vector and the left-connectivity vector n
Mean-field analysis: Noise correlation function
When i ≠ j, activity decorrelates:
the N cross terms vanish in the large-N limit because of the 1/N² scaling
Mean-field analysis: Stationary solutions
Mean and variance:
All neurons have the same variance
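A sketch of how the self-consistent variance could be solved numerically on the unstructured branch (zero structure overlap), with φ = tanh; the function name, the symbol Delta0, the fixed-point iteration, and the quadrature order are my choices, not from the paper:

```python
import numpy as np

def delta0(g, iters=500):
    """Fixed-point iteration for the self-consistent variance
    Delta0 = g^2 * E_z[ tanh(sqrt(Delta0) * z)^2 ],  z ~ N(0, 1),
    i.e. the unstructured (zero-overlap) branch of the mean-field equations."""
    # Gauss-Hermite nodes/weights for expectations over a standard normal
    x, w = np.polynomial.hermite_e.hermegauss(101)
    w = w / w.sum()
    d = 1.0
    for _ in range(iters):
        d = g**2 * np.sum(w * np.tanh(np.sqrt(d) * x) ** 2)
    return d

print(delta0(0.5))   # below the transition at g = 1: variance decays to zero
print(delta0(2.0))   # above the transition: a non-zero variance solution
```

The transition at g = 1 shows up as the point where the trivial solution Delta0 = 0 loses stability.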
Random vs Structure
In Assignment 2
One-dimensional spontaneous activity
[Figure: per-trial equilibrium activity plotted against m]
x = κ m
The activity of the network at equilibrium is organized in one dimension along the vector m
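A minimal simulation sketch of this one-dimensional organization (setting g = 0 and using correlated m, n are my simplifying choices; with overlap n·m/N > 1 the network settles into a non-trivial state along m):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000
dt, T = 0.1, 50.0

m = rng.normal(size=N)
n = 2.0 * m + rng.normal(size=N)     # overlap n.m/N ~ 2 > 1: non-trivial state
P = np.outer(m, n) / N               # unit-rank connectivity (g = 0 here)

x = 0.1 * m                          # small perturbation off the trivial state
for _ in range(int(T / dt)):         # Euler integration of dx/dt = -x + P tanh(x)
    x = x + dt * (-x + P @ np.tanh(x))

# Equilibrium activity is one-dimensional along m: x ~ kappa * m
corr = np.dot(x, m) / (np.linalg.norm(x) * np.linalg.norm(m))
print(corr)
```

With g > 0 the same alignment holds for the mean activity, with additional neuron-to-neuron variance from the random part.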
Variance against Random strength
Two equilibrium states
When g is small, there are two equilibrium states
Equilibrium states against Random strength
When g is bigger, the equilibrium activity values become smaller
[Figure: equilibrium states against random strength g]
Stationary vs Chaotic
Mean-field analysis: External inputs
I projected on u, x1, x2, and h
u, x1, x2, and h are orthogonal to each other
Mean-field analysis: Effective coupling input
Mean-field analysis: Response to external Inputs
External inputs
Response to an External Input: Two-Dimensional Activity
[Figure: equilibrium activity in the plane spanned by I and m; x = κ m + I]
Response to an external Input: Mutually orthogonal case
I, m, and n mutually orthogonal
Response to an external Input: Non-zero overlap with left connectivity vector
Take an input pattern that overlaps with n along x2; keep everything else fixed and vary the component of the input along n by increasing this overlap
m and n mutually orthogonal, but I has a non-zero overlap with n
The external input suppresses one stable state
Input pattern correlates with n but is orthogonal to the structure overlap direction
m and n have non-zero overlap
These reduce to the mean-field equations in the case of spontaneous dynamics.
Simple Go-Nogo Discrimination Task
Choosing m = w and n = I^A therefore provides the simplest unit-rank connectivity that implements the desired computation.
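A sketch of this unit-rank Go-Nogo network (assuming NumPy; the input amplitude, seeds, and Euler integration scheme are my choices). The readout w doubles as the right-connectivity vector m, and the Go input pattern I^A doubles as the left-connectivity vector n, so only the Go input is amplified into the readout:

```python
import numpy as np

rng = np.random.default_rng(3)
N, g = 1000, 0.1
dt, T = 0.1, 30.0

I_A = rng.normal(size=N)     # Go input pattern
I_B = rng.normal(size=N)     # Nogo input pattern (uncorrelated with I_A)
w = rng.normal(size=N)       # readout vector

# Unit-rank structure implementing the task: m = w, n = I_A
J = np.outer(w, I_A) / N + g * rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))

def readout(I_ext, c=2.0):
    """Steady-state readout z = w . tanh(x) / N for input c * I_ext."""
    x = np.zeros(N)
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x) + c * I_ext)
    return np.dot(w, np.tanh(x)) / N

z_go, z_nogo = readout(I_A), readout(I_B)
print(z_go, z_nogo)   # the Go input drives a much larger readout
```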
g = 0.1
Bigger g (my simulation)
g = 2.1
Go
Nogo
Noisy Detection Task
The network is given a noisy input c(t) along a fixed, random pattern of inputs I. The task consists in producing an output if the average input c is larger than a fixed threshold.
g = 0.8
Context-Dependent Go-Nogo Discrimination Task
g = 0.8
Low-rank structured connectivity
Rank r ≪ N
A sum of unit-rank terms
Different m vectors are linearly independent, and similarly for n vectors.
A Context-Dependent Evidence Integration Task
g = 2.1 # Random strength: selected to be large, so that the network is in a chaotic regime
Thanks!