cs236781-projects-winter2020
Proposed Project Topics

Each entry below lists the project category/name, a short description, a paper link, and the assigned student groups (up to three groups of two students).
Sequence models

- Low-rank attention
  Understanding the importance / verifying the existence of low-rank structure of the embeddings in self-attention networks.
  Paper: https://arxiv.org/abs/1905.13655
  Group 1: Matan Weksler, Ido Glanz
Inverse problems

- Meta-learning deep image prior
  Learn an optimal input and an optimal loss function in the deep image prior.
  Paper: https://arxiv.org/abs/1711.10925
  Group 1: Ori Sztyglic, Amir Belder
  Group 2: Aviv Caspi, Roi Tzur-Hilleli

- ODE solvers for inverse problems
  Improve on the Neumann Networks paper by using the adjoint sensitivity method from Neural ODEs to build better inverse-problem solvers.
  Paper: https://arxiv.org/abs/1901.03707
Differentiable-X

The desire for end-to-end learning has prompted multiple groups to suggest differentiable alternatives to classical non-differentiable optimization problems. These, in turn, can be used as part of a larger network and supposedly improve results. Such problems include sorting, k-means clustering, eigen/SVD decomposition (i.e., matrix pseudo-inverse, closest rank-k approximation, closest orthonormal matrix), orthogonalization, and a variety of combinatorial optimization problems. The goal of the following projects is to understand these methods and how they overcome the non-differentiability of the original problem, and then to (i) use them for new applications and/or (ii) test their stability and try to break them. (An implementation is available for all the papers.) More information via appointment.
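As one concrete illustration of how such relaxations work, here is a minimal pure-Python sketch of soft ranking via pairwise sigmoids. This is a generic relaxation for intuition only, not the exact construction used in any of the papers below; the names and the temperature parameter are ours.

```python
import math

def soft_ranks(xs, tau=0.01):
    """Differentiable surrogate for the (0-based) rank of each element.

    Each pairwise "is x_i greater than x_j" comparison is relaxed from a
    step function to a sigmoid with temperature tau; as tau -> 0 the soft
    ranks approach the hard ranks, while for tau > 0 every output is a
    smooth function of the inputs and so admits gradients.
    """
    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    return [
        sum(sigmoid((xi - xj) / tau) for j, xj in enumerate(xs) if j != i)
        for i, xi in enumerate(xs)
    ]

# With a small temperature the soft ranks sit near the hard ranks:
print([round(r, 3) for r in soft_ranks([0.3, -1.0, 2.5])])  # → [1.0, 0.0, 2.0]
```

Raising `tau` trades fidelity to the hard ranking for smoother, better-behaved gradients — the same trade-off each of the papers below resolves in its own way.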
- Backpropagation-friendly eigendecomposition
  Differentiable EVD.
  Code: https://github.com/cvlab-epfl/Power-Iteration-SVD

- End-to-end learning and optimization on graphs
  Differentiable k-means.
  Paper: https://arxiv.org/abs/1905.13732
  Group 1: Mohamed Ayoub, Ady Agbarih

- Optimizing rank-based metrics using blackbox differentiation
  Differentiable sorting.
  Paper: https://arxiv.org/abs/1912.03500

- Differentiation of Blackbox Combinatorial Solvers
  Differentiable traveling-salesman / shortest-path / graph-matching.
  Paper: https://arxiv.org/abs/1912.02175
  Group 1: Assaf Yaffe, Yaniv Goft

- Neural Ordinary Differential Equations
  Differentiable ODE solvers.
  Code: https://github.com/rtqichen/torchdiffeq

- Differentiable digital signal processing
  Differentiable DSP: explore and understand the paper and the current library, enrich the library with additional operations, or build cool applications using it.
  Code: https://github.com/magenta/ddsp
  Group 1: Yakir Helets, Gal Fleissig
  Group 2: Amit Zukier, Tal Skverer
Domain adaptation, semi-supervised learning

- DeepInversion of random subnetworks
  DeepInversion allows reconstructing instances of the training set from a trained network. We want to apply it to well-performing randomly initialized networks.
  Papers: https://arxiv.org/abs/1911.13299
  https://arxiv.org/abs/1912.08795
  Group 1: Arnon Hevlin, Daniel Nachmias
  Group 2: Rafi Cohen, Adam Botach
Compression of DNNs

- MLE quantization
  This work aims to develop a novel post-training quantization method by finding the maximum-likelihood estimate (MLE) of the quantized weights.
  Paper: https://www.dropbox.com/s/2hzrhgm49xyxr5m/MLE_Quantization.pdf?dl=0
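For orientation, here is a hedged sketch of the round-to-nearest uniform quantizer that post-training methods typically take as their baseline; the MLE formulation itself is in the linked PDF, and the helper below (names ours) is purely illustrative.

```python
def quantize_uniform(weights, n_bits=4):
    """Baseline post-training round-to-nearest uniform quantization.

    Maps each float weight to one of 2**n_bits levels spanning
    [min(weights), max(weights)], then back to float ("fake quantization",
    so the quantization error can be inspected directly).
    Returns the dequantized weights and the step size (scale).
    """
    lo, hi = min(weights), max(weights)
    n_levels = 2 ** n_bits
    scale = (hi - lo) / (n_levels - 1) or 1.0  # guard: all-equal weights
    codes = [round((w - lo) / scale) for w in weights]  # integer codes
    deq = [lo + c * scale for c in codes]               # back to float
    return deq, scale

deq, scale = quantize_uniform([-1.0, -0.1, 0.2, 1.0], n_bits=2)
```

A method like the one above replaces the fixed round-to-nearest assignment with an estimate that accounts for the weight distribution; comparing against this baseline at equal `n_bits` is the natural evaluation.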
Adversarial attacks

- MixUp + adversarial training
  Recently, a number of augmentations that heavily alter the labels were proposed. We want to study how they affect adversarial training.
  Papers: http://proceedings.mlr.press/v97/verma19a.html
  https://arxiv.org/abs/1912.02781
  https://arxiv.org/abs/1710.09412
  Group 1: Gal Korcia, Yonatan Gat
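The core MixUp rule (from https://arxiv.org/abs/1710.09412, the third link) fits in a few lines. This toy list-based version — variable names are ours; real implementations mix batched tensors — shows concretely how the labels get altered:

```python
import random

def mixup(x1, y1, x2, y2, alpha=1.0):
    """MixUp: blend two examples and their labels with the same weight.

    x1, x2 are feature vectors (lists of floats); y1, y2 are one-hot
    label lists. The mixing weight lam is drawn from Beta(alpha, alpha),
    so the mixed label is a convex combination and no longer one-hot.
    """
    lam = random.betavariate(alpha, alpha)
    x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return x, y, lam

x, y, lam = mixup([0.0, 1.0], [1, 0], [1.0, 0.0], [0, 1])
# y sums to 1 but has mass on both classes — the "heavily altered label".
```

It is exactly this soft label that interacts non-trivially with adversarial training, since the attack's loss is now taken against a mixture rather than a single class.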
- Perceptually aligned gradients
  When does this phenomenon reproduce, and why? Does it depend on different attacks and norm limitations? We wish to study these questions; especially interesting are EPGD attacks, as described in the second link.
  Papers: https://arxiv.org/pdf/1910.08640.pdf
  https://arxiv.org/pdf/1911.07198.pdf
- Weighted smoothing
  Currently, the weights that produce the best results are 1 for the top-ranking class and 0 for the rest. There are many other ways to set the weights, e.g., based on class priors, or via methods derived from certified-robustness results. We want to find weights that produce better results than simple voting, and to understand the dynamics of the system.
  Paper: https://arxiv.org/pdf/1911.07198.pdf
  Group 1: Lina Maudlej, Boris Van Sosin
  Group 2: Muhammed Kiwan, Abdallah Yassin
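The weighting scheme described above can be made concrete with a small sketch. Here `class_weights` is a hypothetical per-rank weight vector of our own devising; simple voting is recovered as `[1, 0, ..., 0]`:

```python
def weighted_smooth_predict(probs_per_sample, class_weights):
    """Aggregate predictions over noise-perturbed copies of one input.

    probs_per_sample: list of per-class probability vectors, one vector
    per noisy copy of the input.
    class_weights: weight assigned to the class at each rank position;
    simple (majority) voting corresponds to weights [1, 0, ..., 0].
    Returns the index of the class with the highest accumulated score.
    """
    n_classes = len(probs_per_sample[0])
    scores = [0.0] * n_classes
    for probs in probs_per_sample:
        # Rank classes by predicted probability, highest first.
        ranked = sorted(range(n_classes), key=lambda c: -probs[c])
        for rank, c in enumerate(ranked):
            scores[c] += class_weights[rank]
    return max(range(n_classes), key=lambda c: scores[c])
```

The project question is then: which `class_weights` (possibly data-dependent) beats the `[1, 0, ..., 0]` baseline, and why.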
- Choosing an EPGD-trained adversarial model
  If we train a smooth classifier on EPGD attacks and choose the model with the best results on EPGD (on the test set), we can get a model with up to ~65% accuracy under PGD attacks. However, since the model was selected by its EPGD results on the test set, this is a form of overfitting and the reported accuracy is unreliable. We therefore want to choose the model in a way that is not overly dependent on the test set, and to correctly estimate its accuracy under PGD and EPGD; we do not want to set aside a validation set for that purpose, as this would lower the results.
  Paper: https://arxiv.org/pdf/1911.07198.pdf
- Training a model for smoothing
  We want to train a CPNI model adversarially while accounting from the beginning for the fact that it will be used for smoothing; we can consider a combination of PGD and EPGD attacks. We want to choose the model that achieves the best results on EPGD on the test set.
  Paper: https://arxiv.org/pdf/1911.07198.pdf
  Group 1: Raïssa Nataf, Itay Eilat
- Gauss model for smoothing: MACER
  The Gauss model was described in the paper as one of the methods to train a model for smoothing. It currently does not show improvement; however, it seems to change the optimal noise of the base CPNI model. Perhaps by better optimizing the noise injected during training we can align the optimal noise of the base CPNI model with that of the Gaussian-trained model, and thereby achieve better results. Combining Gaussian training with EPGD training could be one way to do so.
  Paper: https://arxiv.org/pdf/1911.07198.pdf
  Group 1: David Bensaid, Antonio Abu Nassar
- Soft smoothing
  Soft smoothing does not produce better results than the voting method; the reason could be that the output probabilities are not correctly calibrated. We want to better approximate the network's true output probabilities and use that to improve the smoothing method. There are many works that discuss how to better calibrate output probabilities.
  Papers: https://arxiv.org/pdf/1911.07198.pdf
  https://arxiv.org/pdf/2001.02378.pdf
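One standard way to better approximate output probabilities — an illustrative choice on our part, not a method prescribed by the linked papers — is temperature scaling:

```python
import math

def softmax_with_temperature(logits, T=1.0):
    """Temperature scaling: a common post-hoc calibration method.

    Dividing the logits by a temperature T > 1 softens over-confident
    probabilities without changing the argmax; in practice T is fitted
    on held-out data by minimizing the negative log-likelihood.
    """
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# T > 1 shrinks the top probability while preserving the predicted class.
```

Because soft smoothing averages these probability vectors across noisy copies, miscalibrated (over-confident) outputs dominate the average; calibrating first is one plausible fix.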
- Graph adversarial attacks
  In this project we aim to develop attack/defense methods for GCNNs.
  Paper: https://openreview.net/forum?id=B1eXygBFPH
  Group 1: Jeries Abu Khadra, Reda Igbaria
- Denoiser adversarial attack
  In this project we aim to develop attack/defense methods for regression tasks.
  More information via appointment.
DL for plant genomics

- Deep learning for plants
  Develop machine learning for dynamic plant traits. We are developing AI tools for plant phenotyping, bioinformatics, and genetic associations. To date, we have constructed a visualization setup to track plants' dynamic traits and refined NN tools for image processing. Now, we aim to build NN models that identify similar and differentiating visual attributes and employ them to recognize genetic similarities in tomato collections.
  Reference articles:
  https://www.plant-phenotyping.org/CVPPP2019
  https://plan.core-apps.com/pag_2020/event/91d41f080d91bca973636b572e63f796
  More information via appointment.
Analysis of physiological signals

We focus on the application of deep learning to the analysis of physiological signals, specifically electrocardiograms (ECG), their beat-interval signals (RR intervals), and photoplethysmogram (PPG) signals obtainable from wearable smart watches. More information via appointment.
- Conditional generative models for RR intervals
  Develop a generative model that can generate realistic sequences of RR intervals, conditioned on various factors such as age, sex, and pathologies.
- Estimating RR intervals from PPG using ECG alignment
  Develop a deep-learning-based approach to estimate RR intervals directly from PPG data, by training with both PPG and ECG.
  Group 1: Boaz Lavon, Rotem Kuehnberg
- Attention-based arrhythmia detection from RR intervals
  Detect set-level rhythm types in RR-interval sequences.
  Group 1: Maya Levital, Or Avnat
- Representation learning with deep clustering of single beats from ECG
  Apply deep-clustering approaches to create meaningful representations of ECG beat morphology.
  Group 1: Ron Ziv
Reimplementation projects

These projects require you to reimplement an existing paper and study/reproduce the results provided by the authors. You may use existing code, if any exists, as a reference only. You are required to propose some change/improvement to the method and perform exhaustive verification and analysis of your results.
- YOLACT: Real-time Instance Segmentation
  https://arxiv.org/abs/1904.02689
  Group 1: Nathan Bellalou, Annael Abehssera
  Group 2: Dafna Regev, David Ben Zacharia

- SWALP: Stochastic Weight Averaging in Low-Precision Training
  https://arxiv.org/abs/1904.11943
  Group 1: Nader Merai, Emil Khshiboun

- Adaptive Loss-aware Quantization for Multi-bit Networks
  https://arxiv.org/abs/1912.08883

- WaveGlow: A Flow-based Generative Network for Speech Synthesis
  https://arxiv.org/abs/1811.00002
  Group 1: Sahar Proter, Eran Flashin
  Group 2: Idan Weizman, Dvir Kiner
  Group 3: Barak Gahtan

- Variance Networks: When Expectation Does Not Meet Your Expectations
  https://arxiv.org/abs/1803.03764

- BatchEnsemble: an Alternative Approach to Efficient Ensemble and Lifelong Learning
  https://openreview.net/forum?id=Sklf1yrYDr

- Stacked Capsule Autoencoders
  https://arxiv.org/abs/1906.06818

- Adversarial Training and Provable Defenses: Bridging the Gap
  https://openreview.net/forum?id=SJxSDxrKDr

- Object as Distribution
  https://arxiv.org/abs/1907.12929

- Meta-Learning with Implicit Gradients
  https://arxiv.org/abs/1909.04630

- Learning to Segment via Cut-and-Paste
  https://arxiv.org/abs/1803.06414
  Group 1: Manor Zvi, Tom Avrech
  Group 2: Ido Nutov, Avishay Cohen
  Group 3: Noam Kahan, Omer Atzmon

- Momentum Contrast for Unsupervised Visual Representation Learning
  https://arxiv.org/abs/1911.05722
  Group 1: Amit Weil, Avia Asael

- Learning to Detect and Retrieve Objects from Unlabeled Videos
  Group 1: Yaniv Ziselman