Deep Learning with Elixir: Axon, Nx and Livebook
An exploration of Elixir’s emerging Data Science and Deep Learning ecosystem
Who am I?
Robert Bates
Principal Engineer II @ Veloxiti AI Works
Former Research Scientist II @ Design & Intelligence Lab, Georgia Tech
arpieb most everywhere online
Been tinkering with Elixir since 2015 with interests in ETL, ML, KBAI, and NLP
Participate in the EEF Machine Learning WG: https://erlef.org/wg/machine-learning
What we’re going to explore
Machine Learning from 30k Feet
Elixir from 10k Feet
Axon, Nx and Livebook at Ground Level
Machine Learning from 30k Feet
We’re not going deep here, just basic definitions and a quick survey of the ML landscape
What is Machine Learning?
Machine learning (ML) is the study of computer algorithms that can improve automatically through experience and by the use of data.
It is seen as a part of artificial intelligence.
Machine learning algorithms build a model based on sample data, known as "training data", in order to make predictions or decisions without being explicitly programmed to do so.
https://en.wikipedia.org/wiki/Machine_learning
Strictly put, deep learning (DL) is any neural network with more than two hidden layers.
Attempt to categorize neural nets by The Asimov Institute in 2016: https://www.asimovinstitute.org/neural-network-zoo/
Common ML/DL frameworks
Most are written in
This doesn’t count the SaaS/PaaS/MLaaS offerings out there like
Or big-data, cluster-based solutions like
Common UX for ML frameworks
Jupyter and derived UIs are the prevalent interface used to analyze data quickly and prototype ML models.
The “notebook” concept has become commonplace, even outside ML and data science.
Elixir from 10k Feet
We’re not going to learn Elixir in this session, but want to lay the groundwork for later code examples.
If you know it already, yay!
If not, I hope it catches your interest and you check it out!
Elixir is a functional language
I mean functional as in programming paradigm, not as in useful (which it is!).
Nearly everything in Elixir is a term under the hood.
Terms are immutable; once created they cannot be changed, and are garbage-collected (per process) once nothing references them.
Elixir embraces the Actor model and makes heavy use of message passing to provide concurrency and distribution as first-class concepts.
The BEAM (Erlang/Elixir VM) is a true virtual machine - applications don’t spin up OS threads, they spin up lightweight BEAM processes, since the VM owns scheduling. And the BEAM can easily support thousands of those, running across multiple cores, out of the box.
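A tiny sketch of that process model - spawning a process and exchanging messages (the `:ping`/`:pong` protocol here is purely illustrative):

```elixir
# Spawn a lightweight BEAM process that waits for one message
# and replies to whoever sent it.
pid =
  spawn(fn ->
    receive do
      {:ping, from} -> send(from, :pong)
    end
  end)

# Message passing is asynchronous; `receive` blocks until a reply arrives.
send(pid, {:ping, self()})

receive do
  :pong -> IO.puts("got pong")
end
```

Spawning one of these costs a few kilobytes of memory, which is why running thousands of them concurrently is routine on the BEAM.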
Elixir has deep roots
Elixir 1.0 was released in 2014 by José Valim (of Ruby on Rails fame).
Built on top of the Erlang VM, it leverages a battle-hardened core that is older than most contemporary programming languages (10 years older than Java), yet is under active development and makes frequent releases to keep the Erlang ecosystem highly competitive.
Elixir borrows “programmer happiness” concepts from Ruby, often making it more approachable to new developers than Erlang - without forgetting its roots or compromising the BEAM.
Basic code organization
In Elixir you build modules, which provide a namespaced wrapper for structs and functions.
There are no constructors per se; you assign values to terms and then pass those into functions to operate on them.
There is a simple concept of public vs private functions - public functions are callable from any other module, private functions only within their module.
The “=” sign is a match operator, not an assignment operator in the traditional sense… Fun to wrap your head around, powerful once you grok it!
defmodule Calculator do
  defstruct [:x, :y]

  def add(%Calculator{x: a, y: b}) do
    a + b
  end
end

params = %Calculator{x: 1, y: 2}
Calculator.add(params)

# What do you think happens here?
4 = Calculator.add(params)
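For reference, here is the match operator on its own, away from the Calculator example (values chosen purely for illustration):

```elixir
# Unbound variables on the left get bound by the match...
{:ok, value} = {:ok, 42}
# value is now 42

# ...but a literal on the left must match the right exactly.
3 = 1 + 2
# 4 = 1 + 2  # this would raise a MatchError, just like the slide above
```

That mismatch behavior is what makes `=` so useful for destructuring and asserting on shapes of data in one step.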
More advanced features
MACROS!
Open Telecom Platform (OTP)
Distribution
Integration with other languages
Goodness inherited from the Erlang Run-Time System (ERTS)
Axon, Nx and Livebook at Ground Level
Now let’s see what kind of machine learning we can do with Elixir!
Nx: Numerical Elixir
Released in Spring 2021
Addresses the need for high-performance numerical computation in Elixir applications
Provides for high-performance computational graphs operating on tensors to be compiled to CPU, GPU or TPU via the companion EXLA backend project
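A small taste of the Nx API, as a sketch (assumes Nx is added as a dependency; backend/compiler configuration is omitted):

```elixir
# Mix.install([:nx])  # in a script or livebook

t = Nx.tensor([[1, 2], [3, 4]])

Nx.add(t, 1)  # element-wise addition, broadcast over the tensor
Nx.sum(t)     # reduces to a scalar tensor

# Numerical definitions (defn) build computational graphs that can be
# compiled by a backend such as EXLA:
defmodule MyMath do
  import Nx.Defn

  defn softplus(x) do
    Nx.log(Nx.add(1, Nx.exp(x)))
  end
end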
Open architecture supports alternate compute backends; some being explored are libtorch and ONNX
Axon
Released in Spring 2021
Developed to provide a native deep learning framework for Elixir
Leverages the features provided by Elixir and the BEAM to sidestep common issues in other languages and platforms (concurrency and distribution - in a word, scaling)
Can also leverage the same backends as Nx for targeting CPUs, GPUs and TPUs
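As a hedged sketch, defining a model in Axon looks like this (layer sizes are illustrative, and API details may vary between Axon versions):

```elixir
# A small feed-forward classifier built with Axon's pipeline-style API.
model =
  Axon.input("features", shape: {nil, 784})
  |> Axon.dense(128, activation: :relu)
  |> Axon.dense(10, activation: :softmax)

# Training is driven by Axon's loop abstraction, roughly:
#
# model
# |> Axon.Loop.trainer(:categorical_cross_entropy, :adam)
# |> Axon.Loop.run(training_data, %{}, epochs: 5)
```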
Livebook
Released in Spring 2021
Provides a native notebook experience for Elixir users
Written in Elixir, it is able to leverage all the features in the ecosystem like Phoenix, LiveView, OTP, concurrency, and distribution without making compromises
With the VegaLite and Kino packages, new levels of visualization and interactivity are possible compared to other notebook systems
Livebook “notebooks” are appropriately referred to as livebooks
Scidata
The Scidata package provides straightforward APIs to retrieve common data science datasets
Currently supports
The team is always open to pull requests to add more datasets!
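For example, fetching MNIST is essentially a one-liner (assumes Scidata is installed; the tuple shapes shown follow its documented return format):

```elixir
# Downloads and caches the dataset, returning raw binaries
# plus type and shape metadata for images and labels.
{images, labels} = Scidata.MNIST.download()

{images_binary, images_type, images_shape} = images
{labels_binary, labels_type, labels_shape} = labels
```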
VegaLite
Provides integration of Vega-Lite visualizations into Livebook
Full support for all Vega-Lite features
Rich bindings to Elixir data structures
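A minimal scatter-plot sketch using the Elixir bindings (the data values here are illustrative):

```elixir
alias VegaLite, as: Vl

Vl.new(width: 400, height: 300)
|> Vl.data_from_values([%{"x" => 1, "y" => 2}, %{"x" => 2, "y" => 4}])
|> Vl.mark(:point)
|> Vl.encode_field(:x, "x", type: :quantitative)
|> Vl.encode_field(:y, "y", type: :quantitative)
```

Rendered in a Livebook cell, this produces an interactive Vega-Lite chart.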
Kino
Provides a flexible architecture for interactive widgets in Livebook
Current bindings allow visualizations provided by VegaLite to become dynamic
Leverages the concurrency mechanisms available in Elixir to seamlessly run interactive components in the background
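Sketch of streaming data into a live chart via Kino (assumes Kino and VegaLite; the field names and values are illustrative):

```elixir
alias VegaLite, as: Vl

# Wrap a Vega-Lite spec in a Kino widget so it can receive updates.
chart =
  Vl.new(width: 400, height: 300)
  |> Vl.mark(:line)
  |> Vl.encode_field(:x, "t", type: :quantitative)
  |> Vl.encode_field(:y, "value", type: :quantitative)
  |> Kino.VegaLite.new()

# Points pushed to the widget appear in the rendered chart immediately.
for t <- 1..10 do
  Kino.VegaLite.push(chart, %{t: t, value: :rand.uniform()})
end
```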
Explorer
Relatively new project in the Nx ecosystem
Provides dataframe interface for analyzing and manipulating large datasets (think pandas but waaaay faster)
Provides a flexible architecture to support different data loader backends; currently leveraging polars.
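A small sketch of the dataframe API (column names and values are illustrative, and Explorer's API is still evolving):

```elixir
alias Explorer.DataFrame, as: DF

# Build a dataframe from columns of equal length.
df = DF.new(name: ["ada", "grace", "alan"], score: [90, 95, 85])

DF.names(df)    # column names
DF.head(df, 2)  # first two rows
```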
Ready to see this stuff in action?
Thanks!
(Can’t have an AI talk without a gratuitous robot pic, amiright?)
Code can be found on GitHub
arpieb/deep_learning_elixir