
Does spacetime have memories? Searching for gravitational-wave memory in the third LIGO-Virgo-KAGRA gravitational-wave transient catalogue


Published 7 May 2024 © 2024 The Author(s). Published by IOP Publishing Ltd
Focus on Gravitational-Wave Memory Effects: From Theory to Observation. Citation: Shun Yin Cheung et al 2024 Class. Quantum Grav. 41 115010. DOI: 10.1088/1361-6382/ad3ffe


Abstract

Gravitational-wave memory is a non-linear effect predicted by general relativity that remains undetected. We apply a Bayesian analysis framework to search for gravitational-wave memory using binary black hole mergers in LIGO-Virgo-KAGRA's third gravitational-wave transient catalogue. We obtain a Bayes factor of $\ln \text{BF} = 0.01$ in favour of the no-memory hypothesis, which implies that we are unable to measure memory with currently available data. This is consistent with previous work suggesting that a catalogue of $\mathcal{O}(2000)$ binary black hole mergers is needed to detect memory. We look for new physics by allowing the memory amplitude to deviate from the prediction of general relativity by a multiplicative factor A. We obtain an upper limit of A < 23 (95% credibility).


Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 license. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

1. Introduction

The landmark detection of gravitational waves from the merger of a binary black hole in 2015 by the LIGO Scientific and Virgo Collaborations has provided new methods to test general relativity and fundamental physics [1, 40]. However, a particularly interesting phenomenon predicted by general relativity remains unconfirmed: gravitational-wave memory. Linear memory was first predicted by Zel'dovich and Polnarev and is produced by unbound systems such as hyperbolic orbits, supernovae and triple black hole interactions [10, 47]. In 1991, Christodoulou identified a significant non-linear memory component in bound systems, such as compact binary mergers [11]. Non-linear memory arises from the gravitational waves themselves, resulting in an accumulation of memory that physically manifests as a permanent displacement between test masses following the passage of gravitational waves [42].

The displacement memory signal has not yet been directly observed because the amplitude of the memory is only around ${\lesssim}5\%$ of the oscillatory waveform amplitude for a GW150914-like event [23]. Due to the low amplitude of memory, a direct detection of memory from a single event is improbable with current gravitational-wave detectors, unless observatories detect a surprisingly close (≈20 Mpc) binary black hole event [16, 22, 23]. Therefore, previous work focuses on detecting memory in the entire population of gravitational-wave events, rather than a single event [23]. Searches for memory have been carried out with data from the first [18] and second [17] gravitational-wave transient catalogue (GWTC). No evidence of memory was found and reference [17] showed that definitive evidence of memory is likely to require an ensemble of $\mathcal{O}(2000)$ gravitational-wave events [16, 17]. Proposed future gravitational-wave detectors such as Cosmic Explorer [28], the Einstein Telescope [27] and LISA [6] may be able to directly detect memory from a single event [16, 21].

Theoretical work has shown that gravitational-wave memory is connected to the Bondi–Metzner–Sachs (BMS) symmetry group and Weinberg's soft graviton theorem in quantum field theory [34, 45]. Each of these three seemingly unrelated concepts represents a corner of the so-called 'infrared triangle' [33]. These connections may serve as a possible bridge between general relativity and quantum field theory, and can be used to test spacetime symmetries [15]. These connections to asymptotic symmetries and soft theorems may also provide insight into the black hole information paradox [19].

Recent work seeks to test if the memory amplitude is consistent with predictions from general relativity [15, 31]. The premise of these studies is that new physics could produce deviations from general relativity that may lead to a different memory amplitude [20]. Other work explores how the inclusion of memory may help to improve the accuracy of gravitational-wave parameter estimation [14, 24, 46]. Still other publications have discussed the possibility of identifying subsolar-mass compact binary mergers [12] and using memory to distinguish between neutron star-black hole binary and binary black hole mergers [43].

In this paper, we perform a search for gravitational-wave displacement memory using 88 events from LIGO–Virgo–KAGRA's third gravitational-wave transient catalogue (GWTC-3). Since we are well short of the ${\approx}2000$ events that are expected to be required to detect memory, we view this paper as part of an ongoing effort to refine the memory detection pipeline and identify potential problems early. In order to search for new physics, we constrain the memory scale factor, which is A = 1 in general relativity. We show that this search is complicated by low-frequency, non-Gaussian noise. We discuss possible remediation strategies.

The remainder of this paper is structured as follows. In section 2, we describe our search for gravitational-wave memory using data from GWTC-3. In section 3, we describe our search for physics beyond general relativity and our constraints on the memory scale factor A, which is expected to be A = 1 for general relativity. In section 4, we summarise our results and discuss future research.

2. Search for gravitational-wave memory with GWTC-3

We follow the method laid out in references [17, 18], calculating a memory versus no-memory Bayes factor for each event and then adding the log Bayes factors. However, in this work we use a different waveform. Reference [17] used two waveform approximants: IMRPhenomXHM [13] to cover extreme mass ratios and NRSur7dq4 [44] to capture spin-precession effects. In this work, we use a single waveform, IMRPhenomXPHM [26], which covers extreme mass ratios, includes spin-precession effects, and includes several of the most dominant higher-order modes [37].

The memory component of the gravitational-wave waveform is calculated from the oscillatory component of the waveform using the gwmemory Python package [37]. Our analysis compares two models: a no-memory hypothesis, in which our waveform contains only the oscillatory component, and a memory hypothesis, in which our waveform contains both the oscillatory component and the memory. We calculate a memory Bayes factor, which is the ratio of the Bayesian evidence values computed for these two hypotheses:

$\text{BF}_{\text{mem}} = \dfrac{\mathcal{Z}_{\text{osc+mem}}}{\mathcal{Z}_{\text{osc}}}. \qquad (1)$
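
To build intuition for how the memory component is constructed from the oscillatory waveform, the sketch below illustrates the leading-order behaviour of the non-linear memory, which grows as the time integral of the squared time derivative of the strain. The chirp-like toy waveform, the omitted angular factors and constants, and all numerical values are illustrative only; the actual analysis uses gwmemory with IMRPhenomXPHM oscillatory modes.

```python
import numpy as np

# Toy "chirp" standing in for the oscillatory strain.  In the real analysis
# the oscillatory modes come from IMRPhenomXPHM and the memory is computed
# with the gwmemory package; everything below is purely illustrative.
times = np.linspace(-0.5, 0.1, 4096)                  # seconds around merger
envelope = np.exp(-(times / 0.1) ** 2)                # amplitude peaks at t = 0
frequency = 30.0 * (1.0 - times / 0.15) ** -0.4       # crude frequency sweep
h_osc = envelope * np.cos(2.0 * np.pi * frequency * times)

# Leading-order scaling of the non-linear memory (angular factors and
# constants dropped): h_mem(t) grows as the time integral of |dh/dt|^2,
# so it rises monotonically and saturates after the merger.
dt = times[1] - times[0]
h_dot = np.gradient(h_osc, times)
h_mem = np.cumsum(h_dot ** 2) * dt
h_mem /= h_mem[-1]                                    # normalise to the final value

idx_merger = np.searchsorted(times, 0.0)
print(f"fraction of memory accumulated by merger: {h_mem[idx_merger]:.2f}")
```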

We use data from the Advanced LIGO H1 observatory in Hanford, WA, the Advanced LIGO L1 observatory in Livingston, LA [38] and the Virgo observatory in Italy [5].

To calculate the Bayes factors, we use parameter estimation results obtained using the bilby [7, 29] implementation of dynesty [32]. Where possible, we use results available on the Gravitational-Wave Open Science Centre [48]. However, bilby results are not available for the events GW190725, GW190728, GW190917 and GW190924, and so we generate new results from scratch. We omit the binary neutron star mergers GW170817 and GW190425. The IMRPhenomXPHM waveform model does not take into account neutron star physics and, at any rate, low-mass binaries produce relatively little memory, making these events expendable for this analysis.

After performing parameter estimation with the oscillatory waveform, we employ importance sampling on each event in order to reweight the n samples with the oscillatory+memory likelihood [25]:

$\text{BF}_{\text{mem}} \approx \dfrac{1}{n}\sum_{k=1}^{n} w_k. \qquad (2)$

Here, the weights $w_k$ are likelihood ratios comparing our two hypotheses,

$w_k = \dfrac{\mathcal{L}(d_i \,|\, \theta_k, \text{osc+mem})}{\mathcal{L}(d_i \,|\, \theta_k, \text{osc})}, \qquad (3)$

where di is the data for event i and θk are the parameters associated with posterior sample k. We employ a minimum frequency of 20 Hz. The total Bayes factor for GWTC-3 is simply

$\ln \text{BF}^{\text{tot}}_{\text{mem}} = \sum_{i} \ln \text{BF}_{\text{mem},i}. \qquad (4)$

We consider $\ln \text{BF}^{\text{tot}}_{\text{mem}} \geqslant 8$ to be a detection of memory [23].
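
As a concrete illustration of equations (2)–(4), the following sketch reweights oscillatory-only posterior samples to obtain per-event log Bayes factors and sums them over the catalogue. The log-likelihood arrays and numerical values are placeholders: in the actual pipeline they are evaluated with bilby for the oscillatory and oscillatory-plus-memory waveforms at each posterior sample.

```python
import numpy as np

def ln_bf_memory(ln_like_osc_mem, ln_like_osc):
    """Per-event ln Bayes factor from equations (2)-(3).

    Both arguments are arrays of log-likelihoods evaluated at the same
    oscillatory-only posterior samples; in the real pipeline they come from
    bilby likelihood evaluations with and without the memory term.
    """
    ln_weights = ln_like_osc_mem - ln_like_osc              # ln w_k
    # ln of the mean weight: logsumexp(ln w_k) - ln n.
    return np.logaddexp.reduce(ln_weights) - np.log(len(ln_weights))

# Toy demonstration with placeholder log-likelihoods (not real GWTC-3 values).
rng = np.random.default_rng(0)
ln_bfs = []
for _ in range(3):                                           # three fake events
    ln_like_osc = np.zeros(5000)
    ln_like_osc_mem = ln_like_osc + rng.normal(0.0, 0.1, 5000)
    ln_bfs.append(ln_bf_memory(ln_like_osc_mem, ln_like_osc))

# Equation (4): the catalogue-level statistic is the sum of per-event ln BFs.
ln_bf_total = sum(ln_bfs)
print(f"ln BF_tot = {ln_bf_total:+.3f}  (detection threshold: 8)")
```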

In figure 1, we plot the cumulative Bayes factor as a function of the chronological event number. Some events are more informative than others, causing comparatively large jumps. The final value is $\ln \text{BF}^{\text{tot}}_{\text{mem}} = {-0.01}$, which is too small to favour one hypothesis over the other. This is expected, as ${\cal O}(2000)$ events are needed before we expect to have the statistical power to distinguish between these two hypotheses [17]. The uncertainty from the reweighting method is ${\lt}0.01$, much less than the detection threshold of 8; see the bottom panel of figure 1 in [18].

Figure 1. Cumulative natural log Bayes factor $\ln\text{BF}^{\text{tot}}_{\text{mem}}$ as a function of the number of binary black hole mergers. Large positive values indicate support for the existence of memory while large negative values indicate support for the no-memory hypothesis. The current data are inadequate to differentiate between these two hypotheses.

The Bayes factors for individual events are displayed in figure A1 of the appendix, which includes a comparison with previous results and an explanation for differences between this work and reference [17]. We omit event GW190424 from our analysis, as it was not deemed a significant event for inclusion in GWTC-2.1 [41], despite being present in previous searches for memory [17].

3. Search for new physics with non-standard memory

In order to search for new physics, and following references [15, 31], we allow the amplitude of the memory to vary by a multiplicative factor A so that the gravitational-wave signal is

$h = h_{\text{osc}} + A\, h_{\text{mem}}. \qquad (5)$

By construction, general relativity predicts A = 1. However, in this framework, we speculate that new physics—perhaps related to BMS symmetry [9, 30]—leads to a waveform with A ≠ 1. We assume that A is the same for each event and calculate the posterior for A given the events in GWTC-3:

$p(A \,|\, \{d_i\}) \propto \pi(A) \prod_{i} \mathcal{L}(d_i \,|\, A). \qquad (6)$

We take the prior $\pi(A)$ to be uniform on the interval (0, 400).
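
A minimal sketch of how this posterior can be assembled is given below: the per-event marginal likelihood of A is estimated on a grid by reweighting oscillatory-only posterior samples, the per-event curves are multiplied together under the flat prior, and the 95% upper limit is read off the cumulative distribution. The ln_weight_fn callable and the toy numbers are placeholders, not the pipeline's actual likelihood evaluations.

```python
import numpy as np

a_grid = np.linspace(0.0, 400.0, 801)          # support of the uniform prior
da = a_grid[1] - a_grid[0]

def ln_like_of_A(ln_weight_fn):
    """ln L(d_i | A) on the grid, estimated by reweighting posterior samples.

    ln_weight_fn(A) must return ln[L(d_i | theta_k, osc + A*mem) /
    L(d_i | theta_k, osc)] for every posterior sample k; it stands in for the
    bilby likelihood evaluations used in the real analysis.
    """
    out = np.empty_like(a_grid)
    for j, a in enumerate(a_grid):
        ln_w = ln_weight_fn(a)
        out[j] = np.logaddexp.reduce(ln_w) - np.log(ln_w.size)
    return out

def combine_events(ln_like_rows, credibility=0.95):
    """Equation (6): combine per-event likelihoods under the flat prior,
    normalise on the grid, and read off the one-sided upper limit on A."""
    ln_post = np.sum(ln_like_rows, axis=0)
    post = np.exp(ln_post - ln_post.max())
    post /= post.sum() * da
    upper_limit = a_grid[np.searchsorted(np.cumsum(post) * da, credibility)]
    return post, upper_limit

# Toy usage with two fake events whose samples mildly prefer small A.
rng = np.random.default_rng(1)
fake_events = [lambda a, s=rng.normal(0.0, 0.05, 2000): -0.5 * (a / 100.0) ** 2 + s
               for _ in range(2)]
posterior, upper = combine_events([ln_like_of_A(f) for f in fake_events])
print(f"toy 95% upper limit on A: {upper:.0f}")
```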

The posterior distribution for A (calculated with all of the events in GWTC-3) is shown in figure 2. It is consistent with the general relativity prediction of A = 1. We set a 95% upper limit of A = 23. Again, we employ a minimum frequency of 20 Hz.

Figure 2. The posterior for A given the 88 events in our analysis (blue). The posterior is consistent with the value A = 1 (dashed line) predicted by general relativity. The 95% credible interval (dark blue) yields an upper limit of A = 23. To better constrain A, we remove the three events most affected by non-stationary noise (GW170104, GW170818, GW200128) and recompute the posterior (green), which yields an upper limit of A = 15.

In the course of carrying out this analysis, we noticed that, for some events, the posterior favours large values of A. The blue distribution in figure 3 shows this behaviour for one such event (GW170818), which favours A ≈ 300 over A = 1 with a likelihood ratio of ${\approx}20$. While it is expected that some events will produce posteriors peaked away from A = 1 due to noise fluctuations, we do not observe such large fluctuations when we repeat the analysis using simulated Gaussian noise.

Figure 3. The posterior of A for GW170818. The posterior calculated from 20 Hz (50 Hz) high-passed data is shown in blue (green). The 50 Hz high-passed posterior shows stronger support for A = 0 and A = 1 and favours smaller values of A.

We confirm that this behaviour is not due to real A = 300 memory by analysing 'off-source' data where no oscillatory gravitational-wave signal has been detected. We analyse the off-source data with a memory-only model with no oscillatory component and calculate the posterior for $A$. We observe a similar pattern, with ${\approx}3/88$ fake events exhibiting large fluctuations away from A = 1. We conclude that unmodelled non-Gaussian noise in the LIGO–Virgo data is affecting our posterior for A (see the footnote). In hindsight, this is perhaps not surprising, as non-stationary noise is known to lurk at low frequencies where memory is most pronounced [3, 39].

In figure 4 we provide a visualisation to show how non-Gaussian noise can yield high likelihood values for large values of A. Each panel is a time series of whitened strain. The blue curve is Livingston (L1) data for GW170818. The expected memory waveform is shown in green. Since we include only frequencies within the LIGO–Virgo observing band (above a minimum frequency of 20 Hz), the memory does not induce a DC offset, but instead produces a short-duration wave packet; see also [23]. The expected oscillatory + memory waveform is shown in orange.
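
The wave-packet behaviour can be reproduced with a simple filtering exercise: high-passing a smooth step, a stand-in for the permanent memory offset, at 20 Hz removes the DC component and leaves a short transient. This is only an illustration of the band-limiting effect; it is not the whitening procedure used in the analysis, and all numbers are arbitrary.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 4096.0                                    # sample rate in Hz
t = np.arange(-2.0, 2.0, 1.0 / fs)

# Smooth step as a proxy for the permanent memory offset (rise time ~10 ms).
memory_step = 0.5 * (1.0 + np.tanh(t / 0.01))

# High-pass at 20 Hz, mimicking the f_min = 20 Hz analysis band.
sos = butter(4, 20.0, btype="highpass", fs=fs, output="sos")
band_limited = sosfiltfilt(sos, memory_step)

# The DC offset is removed; what survives is a short-duration wave packet
# centred on the step, which is what the detectors actually respond to.
print(f"offset long after the step: {band_limited[-1]:.1e}")
print(f"peak of the band-limited packet: {np.abs(band_limited).max():.2f}")
```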

Figure 4. A plot of the full (osc+mem) and memory (mem) waveforms with the whitened Livingston (L1) data for GW170818. As A increases, the full waveform fits the whitened data increasingly well.

The top panel shows the expected A = 1 waveform predicted by general relativity. For this particular event, the memory is negligible. However, the oscillatory + memory waveform is not well matched to the data at the moment of peak strain. This discrepancy can be plausibly explained as a noise fluctuation, since the deviation between orange and blue is not unusual compared to the fluctuations in the noise at late times, after the gravitational-wave signal has passed. The second and third panels show the same plot with A = 100 and A = 300. By increasing the memory amplitude, the waveform better fits the noise fluctuation. Since the memory signal is so short in duration, this does not spoil the fit with the earlier inspiral phase. Viewed in the time domain, it is apparent that the short memory impulses resemble broadband, low-frequency non-Gaussian noise.

Next, we carry out an investigation to identify the frequency band where this non-stationary noise is most pronounced. We create distributions of strain in different frequency bins from 100 random segments of data and fit each distribution with a Gaussian function. We expect all frequency bins to contain some non-Gaussian noise. However, we find the 20 Hz frequency bin to be especially non-Gaussian, with a reduced chi-squared value of $\bar{\chi}^2 = 2\times10^5$, whereas the 50 Hz frequency bin ($\bar{\chi}^2 = 1.6$) is far more consistent with Gaussian noise. Testing regularly spaced frequency bins, we conclude that the non-Gaussian noise is most pronounced in the 20–50 Hz band. This motivates us to see how the results change when we increase the minimum frequency from 20 Hz to 50 Hz.
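
The Gaussianity check can be sketched as follows: histogram the strain values collected in a single frequency bin across many segments, fit a Gaussian, and compute the reduced chi-squared of the fit. The data-collection step is omitted here; we simply verify that the statistic is close to 1 for simulated Gaussian input, as found for the 50 Hz bin.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, mean, sigma):
    return amplitude * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def reduced_chi_squared(samples, n_bins=50):
    """Histogram the strain values from one frequency bin, fit a Gaussian,
    and return the reduced chi-squared of the fit (Poisson errors on counts)."""
    counts, edges = np.histogram(samples, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    popt, _ = curve_fit(gaussian, centres, counts,
                        p0=[counts.max(), samples.mean(), samples.std()])
    residuals = counts - gaussian(centres, *popt)
    errors = np.sqrt(np.maximum(counts, 1.0))
    return np.sum((residuals / errors) ** 2) / (n_bins - len(popt))

# In the paper the samples come from 100 random data segments in a given
# frequency bin; here we use simulated Gaussian input as a sanity check.
rng = np.random.default_rng(2)
stat = reduced_chi_squared(rng.normal(size=10_000))
print(f"reduced chi-squared for Gaussian input: {stat:.2f}")
```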

In figure 3 we compare the posterior for A calculated using $f_\text{min} = $ 20 Hz (blue) and $f_\text{min} = $ 50 Hz (green). The posterior calculated with $f_\text{min} = $ 50 Hz is consistent with A = 1. This is consistent with our hypothesis that the A posterior is biased by non-Gaussian noise in the 20–50 Hz band. We recalculate the posterior distribution by removing the three events that appear to suffer most from non-stationary noise (GW170104, GW170818 and GW200128) and obtain the green curve in figure 2. The resulting upper limit on A is reduced to A = 15. Although removing the low-frequency data removes the non-Gaussian noise, it also removes part of the memory signal, reducing the optimal signal-to-noise ratio by 10–66%. This reduces our ability to detect memory and to constrain A.

We consider various solutions for dealing with the non-Gaussian noise at low frequencies. Instead of discarding the 20–50 Hz band, one could model the non-Gaussian noise by developing a more sophisticated likelihood function. In this approach, the likelihood down-weights data affected by non-Gaussian noise, since it is less trustworthy. The disadvantage of this approach is that the down-weighting still reduces the sensitivity of the search, though not as much as removing the low-frequency data entirely. The best solution is to reduce non-stationary noise through detector commissioning. Investigating this possibility is a goal for future research.
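
One possible realisation of the down-weighting idea, shown schematically below, is to apply a frequency-dependent weight to a Whittle-style Gaussian log-likelihood, which is equivalent to inflating the noise power spectral density where the data are least trustworthy. This is an illustrative sketch of the concept, not a likelihood adopted in this paper, and normalisation constants and FFT conventions are simplified.

```python
import numpy as np

def down_weighted_ln_likelihood(data_fd, model_fd, psd, freqs, duration,
                                band=(20.0, 50.0), weight=0.5):
    """Whittle-style Gaussian log-likelihood with one band down-weighted.

    weight = 1 recovers the standard likelihood, weight = 0 discards the band
    entirely (equivalent to raising f_min), and intermediate values reduce
    the influence of the least trustworthy frequencies.  Normalisation terms
    are omitted; this is a schematic of the idea only.
    """
    residual = data_fd - model_fd
    per_bin = -2.0 * np.abs(residual) ** 2 / (psd * duration)
    weights = np.ones_like(freqs)
    weights[(freqs >= band[0]) & (freqs < band[1])] = weight
    return float(np.sum(weights * per_bin))
```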

4. Conclusion

With LIGO-Virgo-KAGRA's fourth observing run already underway and the fifth observing run planned to start in 2027, the number of gravitational-wave events will greatly increase. We expect to reach the ${\approx}2000$ events needed to detect memory during the fifth observing run. Non-Gaussian noise between 10–50 Hz needs to be better understood; otherwise, the number of events required to detect memory may be larger. We suggest mitigating the non-Gaussian noise either through detector commissioning or by developing a model for non-Gaussian noise.

Acknowledgments

We would like to thank Moritz Huebner and Colm Talbot for helpful discussions. This research is supported by Australian Research Council (ARC) Centre of Excellence for Gravitational-Wave Discovery CE170100004 and CE230100016, Discovery Projects DP220101610 and DP230103088, and LIEF LE210100002. The authors are grateful for computational resources provided by the LIGO Laboratory and supported by National Science Foundation Grants PHY-0757058 and PHY-0823459, and the OzSTAR Australian national facility at Swinburne University of Technology. This material is based upon work supported by NSF's LIGO Laboratory which is a major facility fully funded by the National Science Foundation.

Data availability statement

The data that support the findings of this study are openly available at the following URL/DOI: https://gwosc.org/eventapi/html/GWTC/.

Appendix: Bayes factors for individual events

We compare our $\ln \text{BF}_{\text{mem}}$ values for GWTC-1 [2] and GWTC-2 [4] with the previous search for memory in [17], as shown in figure A1. For most events, our Bayes factors computed using IMRPhenomXPHM (blue dots in figure A1) are very close to those of [17], computed using IMRPhenomXHM and NRSur7dq4, and hence the two analyses agree to within waveform systematics.

Figure A1. The individual $\ln \text{BF}_{\text{mem}}$ for all 88 binary black hole mergers. Our results computed using IMRPhenomXPHM are shown as blue dots, while the results from [17] computed using IMRPhenomXHM and NRSur7dq4 are shown as pluses and crosses, respectively.

However, there are a few individual events where the Bayes factors are noticeably different. The differences can be attributed to several factors. The largest is the set of changes made to the gwmemory package between the time of [17] and now. The most significant change was switching to an analytic calculation of the mode amplitudes using Clebsch–Gordan coefficients, which aligns the memory with the SXS memory prediction. A second factor may be systematic differences between the waveform approximants. To isolate the waveform dependence, we run parameter estimation on GW190924, which has the greatest difference in Bayes factor, using the same waveform as [17], IMRPhenomXHM. The difference between our Bayes factors for IMRPhenomXHM and IMRPhenomXPHM is ${\approx}0.01$.

Footnotes

  • We consider two other hypotheses that might explain why the posterior prefers large values of A for some events. First, we do not take into account uncertainty in our estimation of the noise power spectral density [8, 35]. Second, we do not take into account correlations between frequency bins that arise from so-called finite-duration effects [36]. Thus, our likelihood is slightly misspecified due to approximations we make about the noise. However, we rule out these explanations because we do not see posteriors that favour large values of A when we analyse Gaussian noise with the same slightly misspecified likelihood.
