Incorporating dynamics and graph priors in Bayesian networks
Sep 17th 2024
BMI/CS 775 Computational Network Biology, Fall 2024
Sushmita Roy
Plan for this section
Plan for today
The Sparse Candidate algorithm for structure learning in Bayesian networks
Some comments about choosing candidates
Plan for today
Readings
Gene expression dynamics in time-series experiments
[Figure: gene-by-time expression heatmaps from time-series experiments in zebrafish development and human spinal cord differentiation. Chasman et al., 2017, Cell Systems; White et al., 2017, eLife.]
Key questions about modeling network dynamics
Dynamic Bayesian networks
Dynamic Bayesian Networks (DBNs)
A DBN for p variables and T time points
[Figure: a DBN for p variables unrolled over T time points. Variables X1, X2, …, Xp appear at each time t = 1, 2, 3, …, T; dependencies are specified at the first time point and from each time slice to the next.]
Stationary assumption in a DBN
The stationarity assumption states that the dependency structure and the parameters do not change with t. Due to this assumption, we only need to specify the dependencies between two consecutive sets of variables (times t and t+1).
[Figure: the same transition structure, from X1t, …, Xpt to X1t+1, …, Xpt+1, is applied between every pair of consecutive time slices t = 1, 2, …, T.]
Later we will see an example of an approach where this assumption is relaxed
Computing the joint probability distribution in a DBN
The joint distribution factorizes over variables and time points: P(X^1, …, X^T) = ∏_i P(X_i^1 | Pa(X_i^1)) ∏_{t=2..T} ∏_{i=1..p} P(X_i^t | Pa(X_i^t)), where the parents Pa(X_i^t) of each X_i^t are defined by the graph G.
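This factorization can be sketched in code. The following is a minimal illustration (not from any of the papers discussed) for binary variables with tabular CPDs, assuming all parents live in the previous time slice; `parents`, `cpds`, and `init` are hypothetical data structures chosen for the example.

```python
import numpy as np

# Minimal sketch: log joint probability of a binary time series under a
# stationary first-order DBN. All names here are illustrative.
#   parents[i]: indices of X_i's parents in the *previous* time slice
#   cpds[i]:    maps a tuple of parent values at t-1 to P(X_i^t = 1)
#   init[i]:    P(X_i^1 = 1) at the first time point (no within-slice
#               edges, purely to keep the example small)
def dbn_log_joint(X, parents, cpds, init):
    """X: (T, p) array of 0/1 values; returns log P(X^1, ..., X^T)."""
    T, p = X.shape
    logp = 0.0
    for i in range(p):                      # first slice
        logp += np.log(init[i] if X[0, i] == 1 else 1 - init[i])
    for t in range(1, T):                   # stationarity: the same CPDs
        for i in range(p):                  # are reused at every t
            p1 = cpds[i][tuple(X[t - 1, parents[i]])]
            logp += np.log(p1 if X[t, i] == 1 else 1 - p1)
    return logp
```

For example, a single self-regulating variable with P(X^1 = 1) = 0.5, P(X^t = 1 | X^{t-1} = 0) = 0.2, and P(X^t = 1 | X^{t-1} = 1) = 0.9 assigns the trajectory 0, 1, 1 probability 0.5 × 0.2 × 0.9 = 0.09.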
Learning problems in DBNs
Plan for today
Why prior-based structure learning?
Bayes rule: P(G | D) = P(D | G) P(G) / P(D), where P(G | D) is the posterior, P(G) is the prior, P(D | G) is the data likelihood, and P(D) is the marginal likelihood.
Adapted from Prof. Mark Craven’s slides for BMI/CS 576
Prior-based structure learning
Bayesian formulation of network inference
Optimize the posterior distribution of the graph given the data: P(G | D) ∝ P(D | G) P(G), i.e., the product of the data likelihood and the model prior.
Algorithm
[Figure: a small example graph over X1, X2, X5, Y1, Y2.]
Auxiliary data sources to serve as priors in regulatory networks
ChIP-seq peaks
Image credit: Alireza Fotuhi Siahpirani
How to define priors on graphs?
Energy function of a graph
The energy E(G) = Σ_ij |B_ij − G_ij| measures the disagreement between a graph G (adjacency entries G_ij ∈ {0, 1}) and a prior network B (confidence entries B_ij ∈ [0, 1]); E(G) = 0 when G matches the prior exactly.
Werhli, Adriano V., and Dirk Husmeier, 2007; Mukherjee and Speed, 2008
Using the energy to define a prior distribution of a graph: P(G | β) = exp(−β E(G)) / Z(β), where β ≥ 0 controls the prior strength and Z(β) is the partition function.
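A small sketch of this construction, with the partition function Z(β) computed by brute-force enumeration over all directed graphs on a handful of nodes. Function names are mine; real implementations avoid enumeration (e.g., via MCMC over graphs).

```python
import numpy as np
from itertools import product

def energy(G, B):
    """E(G) = sum_ij |B_ij - G_ij|: disagreement between the adjacency
    matrix G (entries 0/1) and the prior network B (entries in [0, 1])."""
    return np.abs(B - G).sum()

def all_digraphs(p):
    """All directed graphs on p nodes (no self-loops); tiny p only."""
    idx = [(i, j) for i in range(p) for j in range(p) if i != j]
    out = []
    for bits in product([0, 1], repeat=len(idx)):
        G = np.zeros((p, p), dtype=int)
        for (i, j), b in zip(idx, bits):
            G[i, j] = b
        out.append(G)
    return out

def graph_log_prior(G, B, beta, graph_set):
    """log P(G | beta) = -beta * E(G) - log Z(beta), with Z(beta)
    obtained by summing exp(-beta * E(H)) over a concrete graph set."""
    logZ = np.log(sum(np.exp(-beta * energy(H, B)) for H in graph_set))
    return -beta * energy(G, B) - logZ
```

With B assigning confidence 0.9 to the edge 1→2 and 0.1 to 2→1, the graph containing only 1→2 has energy 0.2 and thus a higher prior than the empty graph (energy 1.0).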
Incorporating multiple sources of prior networks
Prior distribution incorporating multiple prior networks
The partition function for a prior over a DBN
Z(β) = Σ_G exp(−β E(G)). For a DBN, the energy decomposes over the parent set of the ith variable, E(G) = Σ_i E_i(Pa_i), and because transition edges impose no acyclicity constraint, the partition function factorizes: Z(β) = ∏_i Σ_{Pa_i} exp(−β E_i(Pa_i)).
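The factorization can be checked numerically. The sketch below uses my own notation: binary parent-set indicators, with per-variable energy E_i(Pa_i) = Σ_j |B_ji − 1[j ∈ Pa_i]| and B_ji the prior confidence for the transition edge j → i. Because this energy is itself a sum over candidate parents, the inner sum reduces further to a product of per-edge terms, ∏_j (e^{−β B_ji} + e^{−β(1−B_ji)}).

```python
import numpy as np
from itertools import product

def per_parent_energy(pa_bits, b_col):
    """E_i for one parent configuration: sum_j |B_ji - 1[j in Pa_i]|."""
    return sum(abs(b - g) for b, g in zip(b_col, pa_bits))

def dbn_log_partition(B, beta):
    """log Z(beta) = sum_i log sum_{Pa_i} exp(-beta * E_i(Pa_i)).
    B[j, i] in [0, 1] is the prior confidence for the edge j -> i from
    the previous slice to the current one; since such transition edges
    cannot create cycles, each variable's parent set sums independently."""
    p = B.shape[0]
    logZ = 0.0
    for i in range(p):
        Zi = sum(np.exp(-beta * per_parent_energy(bits, B[:, i]))
                 for bits in product([0, 1], repeat=p))
        logZ += np.log(Zi)
    return logZ
```

The explicit enumeration over parent configurations mirrors the per-variable sum in the derivation; the per-edge product form gives the same value at far lower cost.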
Learning problems in DBNs with priors
Plan for today
Bayesian Inference of Signaling Network Topology in a Cancer Cell Line (Hill et al., 2012)
Applying DBNs to infer signaling network topology
Hill et al., Bioinformatics 2012
Application of DBNs to signaling networks
Integrating prior signaling network into the DBN
The posterior combines the data likelihood with a graph prior P(G) ∝ exp(λ f(G)), where λ is the prior strength and f(G) scores graph features by their agreement with the known signaling network (original prior description from Mukherjee & Speed, 2008, PNAS).
Defining the prior
Calculating posterior probabilities of edges
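A common way to compute edge confidences (a sketch of Bayesian model averaging, not necessarily Hill et al.'s exact procedure) is P(edge | D) = Σ_G 1[edge ∈ G] P(G | D). Here the graph set and its unnormalized log posterior scores are taken as given; in practice they come from exact enumeration over restricted parent sets or from sampling.

```python
import numpy as np

def edge_posteriors(graphs, log_scores):
    """Model-averaged edge probabilities.
    graphs:     list of (p, p) 0/1 adjacency matrices
    log_scores: unnormalized log P(D | G) + log P(G), one per graph
    Returns a (p, p) matrix whose (i, j) entry is P(edge i->j | D)."""
    ls = np.array(log_scores)
    w = np.exp(ls - ls.max())   # subtract the max for numerical stability
    w /= w.sum()                # normalized posterior weights P(G | D)
    return sum(wk * Gk for wk, Gk in zip(w, graphs))
```

For instance, if a graph containing the edge 1→2 carries three times the posterior mass of the empty graph, the averaged confidence of that edge is 0.75.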
Inferred signaling network using a DBN
Results are not sensitive to prior values
[Figure: the inferred signaling network, shown as a DBN and as a collapsed network.]
Using the DBN to make predictions
Experimental validation of links
Add MAPK inhibitor and measure MAPK and STAT3
Success is measured by the change in the target's level as a function of the inhibitor level. MAPK is significantly inhibited (P = 5×10^-4), and STAT3 is also inhibited (P = 3.3×10^-4).
Plan for today
Motivation
Non-stationary Dynamic Bayesian Networks (nsDBNs)
Non-stationary Dynamic Bayesian Networks
Adapted from Robinson & Hartemink, 2010
[Figure: transition times t1 and t2 divide the time series into windows; the edge set over X1, X2, X3, X4 changes between windows, yielding graphs G1, G2, and G3.]
Non-stationary Dynamic Bayesian Networks (continued)
A prior over the m graphs can be used to incorporate prior knowledge of how the graphs transition across time windows.
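As a sketch of one such prior (illustrative, not Robinson & Hartemink's exact formulation, which also involves the number and placement of transition times), penalize the number of edges added or removed between consecutive windows:

```python
import numpy as np

def sequence_log_prior_unnorm(graphs, lam):
    """Unnormalized log prior over an epoch-indexed graph sequence
    G_1, ..., G_m: each edge change between consecutive windows costs
    lam, so slowly evolving networks are favored a priori.
    graphs: list of (p, p) 0/1 adjacency matrices."""
    changes = sum(np.abs(g2 - g1).sum()
                  for g1, g2 in zip(graphs, graphs[1:]))
    return -lam * changes
```

A sequence that keeps the same graph in every window incurs no penalty; each edge gained or lost between windows subtracts lam from the log prior.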
Take-away points
Take-away points (continued)
References