Clustering Text Using Attention
Lovedeep Singh
Agenda
Attention Mechanism
Clustering Techniques
NLP pipeline
Methodology
Results
Discussion
Summary
Clustering Text Using Attention
2
06/07/2021
Attention Mechanism
Clustering Techniques
NLP pipeline
Methodology
Hierarchical Attention Network
Results
Avg. Ev. = (homogeneity + completeness + V-measure + ARI + AMI + silhouette) / 6
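Assuming the abbreviations stand for scikit-learn's standard clustering metrics (with "var" read as V-measure), the averaged score could be computed as in this sketch; `avg_eval` is an illustrative name, not from the paper:

```python
import numpy as np
from sklearn.metrics import (
    homogeneity_score, completeness_score, v_measure_score,
    adjusted_rand_score, adjusted_mutual_info_score, silhouette_score,
)

def avg_eval(X, labels_true, labels_pred):
    """Mean of six clustering metrics: (homo+comp+var+ari+ami+silh)/6."""
    scores = [
        homogeneity_score(labels_true, labels_pred),
        completeness_score(labels_true, labels_pred),
        v_measure_score(labels_true, labels_pred),        # "var" read as V-measure
        adjusted_rand_score(labels_true, labels_pred),
        adjusted_mutual_info_score(labels_true, labels_pred),
        silhouette_score(X, labels_pred),                 # needs features, not true labels
    ]
    return float(np.mean(scores))

# Toy check: two well-separated blobs, partition recovered exactly
# (label ids swapped, which the label-invariant metrics ignore).
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
score = avg_eval(X, [0, 0, 1, 1], [1, 1, 0, 0])
```

The five label-based metrics reach 1.0 for a perfect partition; only the silhouette term, which depends on the feature geometry, can pull the average below 1 in that case.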
Discussion
Summary
It is evident that clustering with an attention mechanism indeed helps the overall performance of the clustering algorithm, and performance improves as the fraction of data used for attention training increases. We used Hierarchical Attention Networks in our experiments, but there are other ways to incorporate attention into the pre-clustering pipeline; self-attention and the attention used in Transformers are possible alternatives. This paper tries to shed light on these less explored possibilities in the clustering field.
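As a toy illustration of the pre-clustering idea (this is not the paper's Hierarchical Attention Network: a fixed random query stands in for learned attention, and all names here are hypothetical), attention-pooled document embeddings can be handed to an ordinary clustering algorithm such as k-means:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def attention_pool(word_vecs, query):
    """Softmax-attention pooling of word vectors into one document vector."""
    scores = word_vecs @ query               # one score per word
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # attention distribution over words
    return weights @ word_vecs               # attention-weighted average

# Toy corpus: documents as bags of word vectors around two topic centers.
dim = 8
query = rng.normal(size=dim)                 # stand-in for a learned attention query
topic_a = rng.normal(size=dim)
topic_b = rng.normal(size=dim) + 5.0
docs = [rng.normal(size=(10, dim)) + (topic_a if i < 5 else topic_b)
        for i in range(10)]

embeddings = np.stack([attention_pool(d, query) for d in docs])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)
```

In the paper's pipeline the pooling weights come from a trained HAN rather than a random query, which is exactly where the fraction of data used for attention training enters.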
Thank you
Complete code, dataset details, and results are available at https://github.com/singh-l/CTUA for further experiments and analysis.