DoctorAI and Medical Embeddings: Developing a Medical Sense
Connor Favreau
Data Science Intern at Providence Health and Services
The Takeaways
Neural Networks
Input (scalar, vector, matrix)
Hidden (nonlinear activation function, weights, biases)
Calculated Output (scalar, vector, matrix)
Actual Output (scalar, vector, matrix)
Cost Function
Backpropagation
Training
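The pipeline above (input, hidden layer with a nonlinear activation, calculated vs. actual output, cost function, backpropagation, training) can be sketched as a tiny network. This is a minimal sketch: the XOR toy data, layer sizes, and learning rate are illustrative assumptions, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy inputs X and actual outputs y (XOR truth table).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Hidden-layer and output-layer weights and biases.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

lr = 1.0
costs = []
for step in range(5000):
    # Forward pass: hidden layer (nonlinear activation), then calculated output.
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Cost function: mean squared error between calculated and actual output.
    costs.append(np.mean((y_hat - y) ** 2))

    # Backpropagation: chain rule from the cost back through each layer.
    d_out = 2 * (y_hat - y) / len(X) * y_hat * (1 - y_hat)
    d_hid = (d_out @ W2.T) * h * (1 - h)

    # Training: gradient-descent updates of weights and biases.
    W2 -= lr * (h.T @ d_out);  b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_hid);  b1 -= lr * d_hid.sum(0)
```

Each training step runs the same loop the slides outline: forward pass, cost, backpropagated gradients, weight update.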
Recurrent Neural Networks
Britz D., 2015
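A minimal sketch of the recurrence itself: the hidden state is fed back in at every time step, so the network carries memory across a sequence. Dimensions and the random weights are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

input_dim, hidden_dim = 3, 5  # illustrative sizes
W_xh = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    """One recurrent step: the new hidden state mixes the current
    input with the previous hidden state through a nonlinearity."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Unroll over a short sequence; h carries information between steps.
h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(4, input_dim)):
    h = rnn_step(x_t, h)
```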
Recurrent Neural Network Spin-offs
LSTMs (Long Short-Term Memory)
GRUs (Gated Recurrent Units)
Adding Embeddings (word2vec/skipgram) Into RNNs
Example: "The cat chased a bird flying through the window."
The surrounding context words are used to predict the target: "The cat chased a bird __?__."
The Word2Vec Skip-Gram Algorithm
Example: "The cat chased a bird flying through the window."
In skip-gram, each target word is used to predict its surrounding context words.
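A minimal sketch of how skip-gram builds its training pairs from the example sentence; the window size of 2 and the helper name are assumptions for illustration.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (target, context) pairs: each word predicts its neighbors
    within `window` positions, as in the skip-gram objective."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

sentence = "The cat chased a bird flying through the window".lower().split()
pairs = skipgram_pairs(sentence, window=2)
# "bird" becomes the target for contexts "chased", "a", "flying", "through"
```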
What are Embeddings?
Mikolov et al., 2013b
Mikolov et al., 2013c
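Embeddings are dense vectors whose geometry captures relationships between words, the best-known example being Mikolov et al.'s king - man + woman ≈ queen analogy. The 4-dimensional vectors below are made up purely for illustration; real word2vec embeddings typically have hundreds of dimensions.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: how aligned two embedding vectors are."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy embeddings (fabricated for illustration only).
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.1, 0.9, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
}

# Vector arithmetic: king - man + woman should land nearest queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max(emb, key=lambda w: cosine(emb[w], target))
```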
Applications
Recurrent Neural Networks and Embeddings in the Medical Field
Medical Data Usable for RNNs
DoctorAI: Medical Predictions from a GRU Framework
Choi E. et al., 2015
GRU Networks
Update Gate: how much of the previous hidden state to update with new information
Reset Gate: how much weight to give new inputs versus the previous hidden state
Schraudolph N., 2014
Hsu C., 2017
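The two gates can be sketched as a single GRU step. This is a minimal sketch: the dimensions and random weights are placeholders, biases are omitted for brevity, and the blend h = z * h_prev + (1 - z) * h_tilde is one common gating convention.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
input_dim, hidden_dim = 3, 4  # illustrative sizes

# One input-weight / recurrent-weight pair per gate, plus the candidate.
W_z = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
U_z = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_r = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
U_r = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_h = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
U_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))

def gru_step(x_t, h_prev):
    z = sigmoid(x_t @ W_z + h_prev @ U_z)  # update gate: how much to update
    r = sigmoid(x_t @ W_r + h_prev @ U_r)  # reset gate: how much past to keep
    # Candidate state built from the new input and the reset-scaled past.
    h_tilde = np.tanh(x_t @ W_h + (r * h_prev) @ U_h)
    # Blend the previous state and the candidate, as the gates dictate.
    return z * h_prev + (1.0 - z) * h_tilde

h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):
    h = gru_step(x_t, h)
```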
From GRU to Predictions
Guo B., 2016
For Training
Cross-entropy between predicted and actual codes, summed over each time step
Squared difference between predicted and actual visit duration
Goal: Minimize this cost function
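The two terms above can be sketched as one cost function. This is an illustration only: the function name is chosen here, and the exact normalization and weighting of the terms in the DoctorAI paper may differ.

```python
import numpy as np

def doctorai_cost(code_probs, code_targets, dur_pred, dur_true, eps=1e-9):
    """Sketch of the objective: cross-entropy between predicted code
    probabilities and actual codes (summed over time steps), plus the
    squared difference between predicted and actual visit duration."""
    ce = -np.sum(code_targets * np.log(code_probs + eps)
                 + (1 - code_targets) * np.log(1 - code_probs + eps))
    sq = np.sum((dur_pred - dur_true) ** 2)
    return ce + sq
```

Training then minimizes this cost with gradient descent, exactly as in the generic network case.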
Medical Embeddings… Not Just Words in Sentences
Collection of a patient’s codes over a given period of time
Choi Y. et al., 2016
Med2Vec
Choi E. et al., 2016
Medical Embeddings from Text
Finlayson et al., 2014
Medical Embeddings from Text
Choi Y. et al., 2016
DoctorAI Results
Choi E. et al., 2016
The Takeaways
References/Good Links