Scalable Hyper-parameter Optimization using RAPIDS and AWS
AKSHIT ARORA
SRISHTI YADAV
SPEAKER: AKSHIT ARORA
\Uhk_Shith Aurora\
2
SPEAKER: SRISHTI YADAV
srishti.dev
3
WHY THIS TALK?
4
WE ALL LOVE TO SPEED UP THINGS
BUT ARE WE 🏃‍♀️🏃‍♂️ THINGS FASTER?
5
COMPANIES LOVE TO STORE AND USE DATA
BUT ARE THEY REALLY CASHING IN ON IT?
6
WE KNOW CLOUDS ARE A THING
BUT WE DON'T KNOW IF IT'S WITHIN REACH 🤣
7
HOW ARE WE HELPING?
8
⚡🏃‍♀️🏃‍♂️ : Papermill
💰💰💰 : GPU
☁️🖥️ ☁️🖥️ : AWS
9
IT CAN BE AN OCEAN OF DETAILS
BUT BEFORE WE DIVE IN
10
Hyper-Parameter Optimization (HPO)
For 5 parameters, each with 4 desired values, we will have 4⁵ = 1,024 possible combinations
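To make the combinatorics concrete, here is a tiny illustration using only the Python standard library; the parameter names and values are made up for the example:

```python
# Illustrative only: enumerating a hyper-parameter grid with itertools.
# The parameter names and values below are placeholders, not the talk's actual search space.
from itertools import product

param_grid = {
    "max_depth": [4, 8, 12, 16],
    "n_estimators": [100, 200, 400, 800],
    "max_features": [0.25, 0.5, 0.75, 1.0],
    "learning_rate": [0.01, 0.05, 0.1, 0.3],
    "subsample": [0.6, 0.7, 0.8, 0.9],
}

combinations = list(product(*param_grid.values()))
print(len(combinations))  # 4 ** 5 = 1024 candidate configurations
```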
11
WHAT ARE WE GOING TO DO?
12
CPU-Based Scalability
Learn to parameterize notebooks with a set of parameters
You don't want to do all of them manually! 🤦
Automate parallel computation of parameterized notebooks (see the sketch after this list).
Since we use the CPU here, you can play with it at home too, at no cost.
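A minimal sketch of that automation, assuming Papermill is installed and a notebook named train.ipynb with a cell tagged "parameters" exists (the notebook name and parameter names are placeholders): each parameter set becomes one notebook execution, fanned out across CPU processes.

```python
# Sketch: run one parameterized notebook per parameter set, in parallel on CPUs.
# "train.ipynb" and the hyper-parameter names are placeholders for this example.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

import papermill as pm

def run_one(params):
    out = "output_depth{max_depth}_est{n_estimators}.ipynb".format(**params)
    # Papermill injects `params` into the notebook's cell tagged "parameters"
    pm.execute_notebook("train.ipynb", out, parameters=params)
    return out

if __name__ == "__main__":
    grid = [
        {"max_depth": d, "n_estimators": n}
        for d, n in product([4, 8, 12, 16], [100, 200, 400, 800])
    ]
    with ProcessPoolExecutor(max_workers=4) as pool:
        for finished in pool.map(run_one, grid):
            print("finished:", finished)
```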
13
Papermill lets you parameterize and execute Jupyter notebooks programmatically.
This is helpful for running HPO tasks where the same model needs to be trained again and again with different sets of parameters.
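On the notebook side, all that is needed is a cell tagged "parameters" holding defaults; Papermill injects the values passed at execution time in a new cell right after it. A single run then looks like this (file and parameter names are placeholders):

```python
# Inside the (hypothetical) train.ipynb, a cell tagged "parameters" holds defaults:
#     max_depth = 8
#     n_estimators = 100
# Papermill overrides them when the notebook is executed:
import papermill as pm

pm.execute_notebook(
    "train.ipynb",                  # placeholder input notebook
    "train_depth12_est200.ipynb",   # executed copy, with outputs saved
    parameters={"max_depth": 12, "n_estimators": 200},
)
```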
14
DEMO TIME
15
GPU-Based Scalability
Learn to speed up the computation using a GPU
Make efficient use of the GPU using RAPIDS (see the sketch after this list)
Scale up the computation and get better metric visualization using AWS
Learn to do hyper-parameter optimization to find the best parameters from a given set
You don't want to iterate over all of them manually, trust us!
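A hedged sketch of the RAPIDS step: the same train-and-score routine, but with cuDF/cuML instead of pandas/scikit-learn so the data and the model stay on the GPU. The file name, column name, and hyper-parameter values are assumptions made for illustration.

```python
# Sketch: train and score one HPO candidate on the GPU with RAPIDS (cuDF + cuML).
# "airline.csv", the "label" column, and the parameter values are placeholders.
import cudf
from cuml.ensemble import RandomForestClassifier
from cuml.metrics import accuracy_score
from cuml.model_selection import train_test_split

def train_and_score(max_depth, n_estimators, data_path="airline.csv"):
    df = cudf.read_csv(data_path)                          # data loads straight onto the GPU
    X = df.drop(columns=["label"])
    y = df["label"].astype("int32")
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

    model = RandomForestClassifier(max_depth=max_depth, n_estimators=n_estimators)
    model.fit(X_train, y_train)
    return accuracy_score(y_test, model.predict(X_test))   # metric for this candidate

print(train_and_score(max_depth=12, n_estimators=200))
```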
16
17
(Docs: https://rapids.ai/hpo)
18
(Docs: https://rapids.ai/hpo)
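The rapids.ai/hpo docs walk through running this search at scale on cloud instances; one common route is AWS SageMaker's built-in hyper-parameter tuner wrapped around a RAPIDS training container. A rough sketch follows; the image URI, IAM role, S3 path, instance type, and metric definition are all placeholders, not values from the talk.

```python
# Sketch: scale the search out on AWS with SageMaker's hyper-parameter tuner.
# Image URI, role, S3 paths, instance type, and metric regex are placeholders.
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.tuner import ContinuousParameter, HyperparameterTuner, IntegerParameter

session = sagemaker.Session()
estimator = Estimator(
    image_uri="<account>.dkr.ecr.<region>.amazonaws.com/rapids-hpo:latest",  # placeholder
    role="<SageMakerExecutionRole>",                                         # placeholder
    instance_count=1,
    instance_type="ml.p3.2xlarge",   # single-GPU instance
    sagemaker_session=session,
)

tuner = HyperparameterTuner(
    estimator,
    objective_metric_name="accuracy",
    metric_definitions=[{"Name": "accuracy", "Regex": "accuracy: ([0-9\\.]+)"}],
    hyperparameter_ranges={
        "max_depth": IntegerParameter(4, 16),
        "n_estimators": IntegerParameter(100, 800),
        "max_features": ContinuousParameter(0.25, 1.0),
    },
    max_jobs=32,           # total candidates to evaluate
    max_parallel_jobs=4,   # candidates running at once
)
tuner.fit({"train": "s3://<bucket>/airline/"})   # placeholder S3 dataset path
```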
19
DEMO TIME
20
Thank you!
Please find our code, slides, and other resources on our GitHub page:
21