Research Notes:
Gravitational Waves
Machine Learning Analysis
Specific Focus: Detecting/Estimating Eccentricity
With Dr Elaha Khalouei & Prof Hyung Mok Lee
Center for Gravitational-Wave Universe, SNU
Inspiral Events
Current templated searches for gravitational waves (GWs) emitted by compact binary coalescences (CBCs) assume that the binaries have circularized by the time they enter the sensitivity band of the LIGO-Virgo-KAGRA (LVK) network.
However, certain formation channels predict that in future observing runs (O4 and beyond), a fraction of detectable binaries could enter the sensitivity band with a measurable eccentricity e.
Formation mechanisms for eccentric inspiral events
Samsing, MacLeod & Ramirez-Ruiz, ApJ 784 (2014)
In the near future, a significant number of inspiral events with non-negligible eccentricity are expected to be detected by LIGO-Virgo.
Machine Learning: Eccentricity
We are developing a fast method for determining whether a GW event has measurable eccentricity.
We generate events with realistic noise for various source parameters (masses, distances, eccentricities).
We use the Python-based PyCBC package (compact binary coalescence, CBC).
Waveforms are produced with approximate numerical schemes (waveform approximants); a sketch of the event generation follows below.
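A minimal sketch of the event-generation step, assuming PyCBC; the approximant choice ("EccentricTD"), the PSD, and all parameter values here are illustrative assumptions, not necessarily the exact configuration used in these notes.

import numpy as np
from scipy.signal import spectrogram
from pycbc.waveform import get_td_waveform
from pycbc.psd import aLIGOZeroDetHighPower
from pycbc.noise import noise_from_psd

# Time-domain waveform for an equal-mass binary with residual eccentricity.
hp, _ = get_td_waveform(approximant="EccentricTD",   # eccentric TD approximant
                        mass1=10.0, mass2=10.0,      # solar masses
                        eccentricity=0.1,            # defined at f_lower
                        distance=100.0,              # Mpc
                        f_lower=20.0,                # Hz
                        delta_t=1.0 / 4096)          # 4096 Hz sampling

# Gaussian noise coloured by the aLIGO design PSD ("realistic noise").
delta_f = 1.0 / 16
psd = aLIGOZeroDetHighPower(int(2048 / delta_f) + 1, delta_f, 20.0)
noise = noise_from_psd(len(hp), hp.delta_t, psd, seed=0)

# Inject the signal into the noise (plain numpy to sidestep epoch bookkeeping).
strain = noise.numpy() + hp.numpy()

# Spectrogram of the noisy strain; S has shape (freq, time) and would be
# resized to 256x256 before entering the network (an assumption).
f, t, S = spectrogram(strain, fs=4096, nperseg=256)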
Machine Learning: Eccentricity
[Figure: simulated spectrograms for e = 0.0 vs e = 0.4]
Spectrogram: Distance to Source
The GW strain amplitude scales inversely with the luminosity distance (h ∝ 1/D).
[Figure: spectrograms for M1 = M2 = 10 M☉ at D = 100, 500, 1000, and 2000 Mpc]
The more distant the source, the weaker the signal, and the more difficult it is to detect.
Spectrogram: Eccentricity
[Figure: spectrograms for e = 0.0 vs e = 0.1; M1 = M2 = 10 M☉, D = 100 Mpc]
Spectrogram: Eccentricity
[Figure: spectrograms for e = 0.0 vs e = 0.4; M1 = M2 = 10 M☉, D = 100 Mpc]
CNN
We work with 2D spectrograms (256 × 256), i.e. (time × frequency).
Generate 20,000 events with:
- Masses in Uniform(10, 80) M☉
- Distances in Uniform(100, 2000) Mpc
- Eccentricities in Uniform(0.0, 0.4)
Split into 90%/10% training and testing
Define a 4-layer CNN + ANN head as a regression model for the event eccentricity (a sketch follows below).
Other event parameters are essentially marginalised over / ignored.
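A minimal sketch of such a model, assuming Keras/TensorFlow; the filter counts, kernel sizes, and dense-layer widths are illustrative assumptions rather than the exact architecture.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(256, 256, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Four convolutional blocks acting on the (time x freq) spectrogram.
        layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
        # Dense (ANN) head regressing a single eccentricity value in [0, 0.4];
        # all other source parameters are implicitly marginalised over.
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

model = build_model()
# model.fit(x_train, y_train, epochs=30)   # x_train/y_train: the 90% split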
CNN: Preliminary result
We don't like these departures from the true values at high and low eccentricity.
So I will try a custom loss function that gives extra weight to the mean-squared error (MSE) at high and low eccentricity, as sketched below.
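A sketch of one possible such loss, assuming Keras; the specific weighting profile (linear growth away from mid-range eccentricity) is an illustrative assumption.

import tensorflow as tf

def edge_weighted_mse(y_true, y_pred):
    # Plain MSE, up-weighted where the true eccentricity sits near the edges
    # of the [0, 0.4] range: weight 1 at ecc = 0.2, rising to weight 3 at
    # ecc = 0.0 or 0.4.
    w = 1.0 + 10.0 * tf.abs(y_true - 0.2)
    return tf.reduce_mean(w * tf.square(y_true - y_pred))

# model.compile(optimizer="adam", loss=edge_weighted_mse)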
CNN: Preliminary result
Test the trained network
[Figure: distributions of network-predicted eccentricity for test events with true ecc = 0, 0.1, 0.2]
Can we reduce the widths of these distributions?
-> lower final error bar (see the evaluation sketch below)
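A sketch of this test, assuming the trained model and held-out arrays x_test / y_test from the 90%/10% split (the names are placeholders); the widths (standard deviations) of the per-class prediction distributions set the final error bar.

import numpy as np

preds = model.predict(x_test).ravel()          # predicted eccentricities
for ecc in (0.0, 0.1, 0.2):
    sel = np.isclose(y_test, ecc, atol=0.01)   # test events near this true ecc
    print(f"true ecc = {ecc}: mean = {preds[sel].mean():.3f}, "
          f"width (std) = {preds[sel].std():.3f}")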
Transfer Learning (ImageNet)
Transfer learning: start from a network pre-trained on ImageNet, then fine-tune it on our spectrograms (see the sketch below).
Model = Xception (20M parameters)
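A minimal sketch of the transfer-learning and fine-tuning setup, assuming Keras and its ImageNet-pretrained Xception; the channel handling, freezing schedule, and learning rates are illustrative assumptions.

import tensorflow as tf
from tensorflow.keras import layers, models

# ImageNet-pretrained Xception backbone (~20M parameters), no classifier top.
base = tf.keras.applications.Xception(weights="imagenet", include_top=False,
                                      input_shape=(256, 256, 3))
base.trainable = False                          # stage 1: train the head only

inputs = layers.Input(shape=(256, 256, 1))
x = layers.Concatenate()([inputs, inputs, inputs])   # grey spectrogram -> RGB
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(1)(x)                    # eccentricity regression head
model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
# model.fit(x_train, y_train, epochs=10)

# Stage 2: fine-tuning, unfreezing the backbone at a small learning rate.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5), loss="mse")
# model.fit(x_train, y_train, epochs=10)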