#101 Black Holes Collisions & Gravitational Waves, with LIGO Experts Christopher Berry & John Veitch

Mar 07, 2024 | 1 hr 10 min | Season 1, Ep. 101

Episode description

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!


In this episode, we dive deep into gravitational wave astronomy with Christopher Berry and John Veitch, two senior lecturers at the University of Glasgow and experts from the LIGO-Virgo collaboration. They explain the significance of detecting gravitational waves, which are essential for understanding black hole and neutron star collisions. This research not only sheds light on these distant events but also helps us grasp the fundamental workings of the universe.

Our discussion focuses on the integral role of Bayesian statistics, detailing how they use nested sampling for extracting crucial information from the subtle signals of gravitational waves. This approach is vital for parameter estimation and understanding the distribution of cosmic sources through population inferences.

Concluding the episode, Christopher and John highlight the latest advancements in black hole astrophysics and tests of general relativity, and touch upon the exciting prospects and challenges of the upcoming space-based LISA mission.

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:

 ⁃    Gravitational wave analysis involves using Bayesian statistics for parameter estimation and population...

Transcript

In this episode, we dive deep into gravitational wave astronomy with Christopher Berry and John Veitch, two senior lecturers at the University of Glasgow and experts from the LIGO-Virgo collaboration. They explain the significance of detecting gravitational waves, which are essential for understanding black hole and neutron star collisions. This research not only sheds light on these distant events, but also helps us grasp the fundamental workings of the universe.

Our discussion focuses on the integral role of Bayesian statistics, detailing how they use nested sampling for extracting crucial information from the subtle signals of gravitational waves. This approach is vital for parameter estimation and understanding the distribution of cosmic sources through population inferences.

Concluding the episode, Christopher and John highlight the latest advancements in black hole astrophysics and tests of general relativity, and touch upon the exciting prospects and challenges of the upcoming space-based LISA mission. So strap in for episode 101 of Learning Bayesian Statistics, recorded February 14, 2024. Hello my dear Bayesians! Today, I want to thank Julio for joining the Good Bayesian tier of the show's Patreon.

Julio, your support is invaluable and literally makes this show possible. I really hope that you will enjoy the exclusive sticker coming your way very soon. Make sure to post a picture in the Slack channel. And now, on to the show. Christopher Berry, John Veitch, welcome to Learning Bayesian Statistics. Thank you very much for having us. Yes, thank you a lot for taking the time, even more time than listeners suspect, but we're not gonna expand on that.

But yeah, I'm super happy to have you on the show, and we're gonna talk about a lot of things: physics, of course astrophysics, black holes and so on. But first, how would you both define the work you're doing nowadays, and how did you end up working on this? I can go first. I guess I'm slightly older than Christopher. I started doing gravitational waves when I was a physics student at Glasgow.

I got involved with the LIGO, actually the GEO experiment first of all, which is the gravitational wave detector in Germany. Its bigger brothers are the LIGO detectors that we're going to talk about more today. And ever since then, I mean, I thought the project was fantastic, and I just wanted to get involved in the discoveries of gravitational waves and what they can tell us about black holes and so on.

I got involved back in my PhD. My PhD was largely about gravitational waves we could maybe detect in the future with an upcoming space-based mission called LISA, due for launch in the 2030s. I remember my advisor telling me, I hope you're OK that there's not going to be any real data. And I was like, yes, that's great, I just want to play around with the theory stuff. And then I guess fate conspired against me. After my PhD, I moved to the University of Birmingham.

That's where I first started working with John. We were at the University of Birmingham together. And I got involved in LIGO-Virgo data analysis. And we happened to make our first detection just a couple of years after I joined, in 2015. And we've been very busy since then analyzing all the signals, figuring out the astrophysics of them. So each individual source, and then putting them together to understand the population underneath.

So now we're both at the University of Glasgow, working on analyzing these gravitational wave signals and understanding what they can teach us about the universe. Yeah. So as listeners can already tell, I guess, fascinating topics, lots of things to talk about and dive into. But maybe to give us a preview of things we're going to talk about a bit more: you guys are also using some Bayesian stats in these analyses, am I right?

Yeah, so I think we look at, I guess, two levels of Bayesian stats. The first is what we refer to as parameter estimation. So given a single signal, trying to figure out what are the properties of the source. The signals we most often see are, say, two black holes spiraling in together. So we look at the patterns of gravitational waves that it emits. And from this, we can match templates and then infer: these are the masses of the two black holes.

This is the orientation of the binary, the distance to the binary, and parameters like that. So we use Bayesian stats and sampling algorithms like nested sampling to map out a posterior probability distribution. And then, I guess, at the second level of this, we do what we call a population inference, so a hierarchical inference where, given an ensemble of different detections, and correcting for our selection effects (we can detect some signals more easily than others), we ask:

What is the underlying astrophysical distribution? So what is the distribution of masses of black holes out there in the universe? Yeah, so fascinating things. And John, do you want to maybe add something to that? Just as a teaser, we're going to dive a bit later in the episode into what you guys actually do, so, as Christopher just said, nested sampling, population inferences, but anything you want to add? A teaser for later.

I would add something about the background of how it works within the LIGO Scientific Collaboration. So when I started doing my PhD, my advisor, Graham Woan, taught me about Bayesian statistics, Bayesian inference, and I had never learned it as an undergraduate at all. It just blew my mind: here we have this mathematical theory of learning, why aren't we using this everywhere? And in those days, it really wasn't being used very much in LIGO,

because a lot of the people that started the collaboration were coming from a physics perspective and they were very frequentist. They were counting and cutting all of their events to try and measure the significance of a discovery, or whatever. So the Bayesian way was kind of novel back then. But since then, as Christopher said, it's been applied all over the place to all kinds of different problems. So it's been quite exciting to watch that change over the years.

I remember when we had our first detection and we were writing up our results. I think at that time still a lot of the collaboration was very frequentist. So we were writing in our papers, we have a posterior probability distribution for the masses, and there were people going, hey, what's that? What's a posterior? We've never come across this before. Can you explain it to us? And now it is very much accepted. And yeah, with every new detection, everyone asks: where are the masses?

I want to see this probability distribution. What do you think drove that evolution and that change? I think Bayesian statistics is very popular in other parts of astronomy. So in a sense, it was kind of inevitable that it would make its way over to gravitational wave astronomy; it was only a matter of time. But I think the problems that we were trying to solve, particularly the parameter estimation type of analysis, did lend themselves to a Bayesian analysis, because you have a unique event.

We only see a very small number of gravitational waves. We see them all the time now, but it's still dozens, not millions. So we have to make the most of every single one. And the signal-to-noise ratio is rather low, so squeezing the information out of weak signals is also very important if you want to do science. Yeah, that makes sense.

And so it was mainly driven by Bayesian stats answering a need in what you wanted to do, basically, which is something I often see in other fields. Psychology, from what I've seen in the last few years, for instance, psychometrics, things like that, have seen a big rise in Bayesian statistics because it has been able to answer the questions that researchers had and could not answer with the tools they had before. So basically, a very practically oriented view of things.

And then afterwards, let's say, the more epistemological, philosophical side of things enters to also justify that. But most of the time it's a very practically driven mindset, which is great, right? Because in the end, what you care about is just: is that the right tool to answer the questions I have right now? And yeah, for what it's worth.

Yeah, go ahead, John. Pragmatism is what wins out at the end of the day, but during my PhD, I was trying to look not at the kind of black hole binaries that we'll talk about later, but at monochromatic waves. So you imagine doing a Fourier transform of some data and you have a single spike, and it has a bit of modulation on it. But really there's no information about that spike in any area of the parameter space outside of the spike.

So I learned about Bayesian statistics and tried to do MCMC on this problem, which was kind of like the most pathological problem that you could try to do MCMC on: it basically reduces itself to doing an exhaustive search of the parameter space. So I was kind of convinced by the epistemology originally, rather than by the pragmatism. Yeah, that makes sense.

And as you were saying, Bayesian stats is also popular in other parts of physics. That's definitely true, in the sense that, for instance, among the core developers of PyMC, of Stan, you have a lot of physicists, often coming from statistical physics. And historically, even the MCMC algorithms that we use have been developed mainly by physicists or for physics purposes. So there is really this integration here, almost historically.

And that made me think that, if listeners are interested, there is an interesting package called exoplanet. That's basically a toolkit for probabilistic modeling of time series data in astronomy, but with a focus on observations of exoplanets. So that's different from what you guys do, but it's using PyMC as a backend. So that's why I know it. And it's mainly developed by Dan Foreman-Mackey, if I remember correctly.

I'll put that in the show notes for people who are interested, because that is definitely something to check out if you are doing those kinds of models. And that made me think that I didn't even thank our matchmaker, because today is February 14th, but actually this episode was made possible thanks to a matchmaker, a Cupid of Bayesian statistics, if you want: Johnny Highland. Thanks a lot for putting me in contact with Christopher and John.

Johnny is a faithful listener, and I am very grateful for that and for putting me in contact with today's guests. And so you mentioned already, Christopher, that you two work in the LIGO-Virgo collaboration. Maybe... Yeah, tell us a bit more about that collaboration, what that is about, and what the goal is, so that listeners have a clear background. And then we'll dive into the details. So yes, LIGO-Virgo-KAGRA is a collaboration of collaborations.

So each of LIGO, Virgo and KAGRA operates their own gravitational wave detectors. These are remarkable experimental achievements. We're talking devices that can measure distortions in space and time. In effect, what we do is we time how long it takes a laser to bounce up and down between some mirrors in one direction compared to another. We're looking for a change of less than one part in 10 to the 21.

So it's equivalent to measuring the distance between the Earth and the Sun to within the diameter of a hydrogen atom, or the distance from here to Alpha Centauri to within the width of a human hair. Over many decades, experimentalists have developed the techniques to design and build these detectors. And we're now in the very fortunate situation that we have multiple of these detectors operating across the world.
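To make that scale concrete, here is a quick back-of-the-envelope check of those analogies (a sketch with round-number astronomical distances; only the 4 km arm length is LIGO's actual scale):

```python
# A strain h ~ 1e-21 means a length change of h * L over a baseline L.
h = 1e-21                 # dimensionless strain sensitivity

L_arm = 4e3               # LIGO arm length, metres
print(h * L_arm)          # ~4e-18 m, far smaller than a proton

L_earth_sun = 1.5e11      # Earth-Sun distance, metres
print(h * L_earth_sun)    # ~1.5e-10 m, about a hydrogen atom's diameter

L_alpha_cen = 4.0e16      # distance to Alpha Centauri, metres
print(h * L_alpha_cen)    # ~4e-5 m, roughly the width of a human hair
```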

So we have two LIGO detectors in the US, one in Livingston, Louisiana, and one in Hanford, Washington. And we've got Virgo in Italy, just outside Pisa, KAGRA underground in Japan, and, in the coming decade, another LIGO detector to be built in India. And each of these observatories is looking for gravitational wave signals.

The ideal source for gravitational waves would be a binary of two black holes or two neutron stars: very dense objects coming together, merging very quickly, very strong gravity, very dynamical objects. And we can detect these gravitational waves and, with those, do astronomy. So instead of using a telescope to make observations with light, we're using these gravitational wave detectors to look for gravitational waves and uncover the astrophysics of these sources. Yeah. Yeah. Thanks.

So that's a very clear explanation. It's a bit like being able to hear the universe itself instead of only looking at it, right? So that's another way of getting information about the universe, one that maybe allows us to also answer questions that we had but were not able to answer with telescope data alone. Is that the case, or is it mainly information that's parallel and similar? Yes. Go ahead.

Yeah, I think that's one of the most exciting things about this: it's completely separate from the electromagnetic spectrum, because we're using the stretching and squeezing of space itself by these black holes and neutron stars. So, as you said, the detectors are picking up essentially the equivalent of sound waves: bulk motion of the material rather than the jiggling of atoms. We're talking about the jiggling of whole stars, the movement of them in their orbits.

And because you're looking at the bulk motion rather than the surface of the object, you can see right into the heart of what's going on in some of these very violent events. In principle, we should be able to see inside supernovae too, if there is enough motion of material during the core collapse. That would also give off gravitational waves that we could see, although those would be much weaker than the ones we're looking at at the moment. I see.

One thing that's particularly nice we can do as well is really test how gravity behaves in very extreme environments. John, I don't know if you want to mention something about looking at the ringdown of black holes. Sure. I mean, as Christopher says, there's a very detailed prediction for how two stars should approach each other as they inspiral over time. And the equations are horrendously complicated; we're talking about the full theory of general relativity.

But once they've collided and they form a larger black hole, suddenly everything becomes rather simple, and it acts just like a wine glass that's been excited with a fork: it rings, decays down, and settles into its final state. For a black hole, that happens extremely fast, because they want to settle down as quickly as they possibly can, if you like. So the notes that it gives off are milliseconds long rather than seconds long.

But the frequencies and damping times of those notes are measurable with gravitational waves. And by looking at them and comparing them to each other, we can check whether the predictions of the theory are indeed what we see in the real world. So far, that seems to be the case. Yeah. Something I'm wondering is that these collisions that you're talking about are happening millions of light-years away.

How are we even able to study them? And also, maybe tell us what we have already learned from them? They're really quite rare events, the types of collisions that we're seeing. I mean, the reason they are millions, hundreds of millions, or even billions of light-years away is that they're so rare in the universe that we need to look out a very long way before we see them often enough to make detections regularly.

So they do happen in local galaxies as well, there's no reason to think they wouldn't, it's just that they're so rare we've not seen one nearby. Yeah. Yes, they are remarkably energetic. The amount of energy that is output as gravitational waves when you've got, say, two black holes coming together is phenomenal.

For just that moment as they smash in together, the luminosity, the amount of energy per unit time emitted right at that peak, is higher in gravitational waves than if you were to add up all the visible light from all the stars that you could see in the universe. So it's a phenomenal amount of energy, just over a very short time. So yeah, we just need to be listening to the universe to discover these sources and find out what they're trying to tell us.

The energy flux from these black hole collisions, despite the fact that they're hundreds of millions of light-years away, is actually comparable to the flux from the full moon. So the brightest object in the night sky is surpassed by gravitational wave signals, except we can't see the gravitational waves, because they don't interact very strongly with matter. And it's only by building these incredibly sensitive detectors to pick up their effect on distances that we can detect them at all.

Yeah, that's just fascinating to me, that we're even able to see, like, hear these waves in a way. So, yeah, just to finally drive the point home: there's so much energy being carried away, but the effect is so tiny, as Chris said, 10 to the minus 21, no less. Yeah. If you think about how those two things could be true at once, it's telling you that it takes an enormous amount of energy to produce a tiny distortion in space. So it's very, very difficult to warp space. And that's...

And that's... the consequences of general malpractice. Yeah. And then, so I think now it's a good time for you to tell us. So maybe Christopher, you can tell us that. How do you use patient stats to extract as much information as possible from these tiny wave signals? How is base useful in this field? And how do you also actually do it? Are you able to use any... widespread open source packages or do you have to write everything yourself? How does that work concretely?

Yes, so for the type of sources we've been seeing, these binaries, we have predictions for what the signal should look like. So we have a template that is a function of the parameters, and we have a decent understanding of the properties of our noise. So the data is a combination of the signal plus some noise, which we assume to be stationary over the short time scales that we're analyzing, and characterized such that the noise at individual frequencies is uncorrelated.

So if you like, you get your data, transform it to the frequency domain, subtract out your template, and you should be just left with noise, which is Gaussian at each frequency bin. And so you have a lot of Gaussian probabilities that you combine, and that gives us our likelihood. You map that out: you change the parameters of your template,

evaluate that at another point in parameter space, combine that with your suitable prior, and you end up with your posterior probability for a single event. The number of parameters that we're typically dealing with is something like 15 for a typical binary. Maybe that goes up to 17 when we add in a couple of extra ones, a few more if we're maybe looking at tests of general relativity. So it's enough that exploring the parameter space can't be done by just gridding it up and exploring it.

We generally use some kind of stochastic sampling algorithm. But it's not one of these problems, at least yet, where we've got millions of parameters and a really high-dimensional parameter space. In terms of the algorithms that we use to explore parameter space, we've got a long history of using MCMC and nested sampling for these. And John's really the expert on this. So, John, do you want to say some more about it? We'll get to that, yeah. Oh, you're done? OK, perfect.
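For readers who want the likelihood Christopher describes in code: a minimal numpy sketch of a Gaussian, frequency-domain likelihood with a subtracted template. The template function, PSD and frequency grid are placeholder inputs for illustration, not anything from the real LIGO pipelines:

```python
import numpy as np

def log_likelihood(params, data_f, psd, freqs, template_f):
    """Gaussian frequency-domain log-likelihood, up to a
    parameter-independent normalization.

    data_f:     one-sided FFT of the detector strain data
    psd:        noise power spectral density in each frequency bin
    template_f: function mapping (params, freqs) to a model waveform
    """
    residual = data_f - template_f(params, freqs)  # should be pure noise
    df = freqs[1] - freqs[0]
    # Each bin is independently Gaussian with variance set by the PSD,
    # so the log-likelihood is a weighted sum of squared residuals.
    return -2.0 * df * np.sum(np.abs(residual) ** 2 / psd)
```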

So yeah, John, maybe you can tell us. Yeah, maybe let's start with nested sampling, which you use a lot for your inferences. Can you talk about that, why it's useful, and also why you end up using it a lot in your work? Which problem does it solve? So nested sampling is an alternative to MCMC. I don't know if your listeners will all have encountered it before. If you're a regular user of MCMC, though, it's definitely worth a look. It was invented in 2006-2007 by John Skilling.

He was a physicist. The idea is that you're actually trying to evaluate the evidence, the normalization constant of the posterior, to allow you to do model selection in a Bayesian way. But as a by-product, it can generate samples from the posterior as well. So this popped up around about the time that I started a postdoc position in Birmingham, and I thought, well, why don't we give it a go and apply it to the problem of compact binaries.

So at that point, there was no off-the-shelf package available to do this, and so we had to create our own. That was all coded up in C at the time; it wasn't such a big thing, thousands of lines of code and all that. But yeah, the reason you might prefer it to MCMC: people were trying to solve the same problem with parallel-tempered MCMC, and the compact binary parameter space has a fair amount of degeneracies and multiple modes.

They make it difficult to sample; the waveforms enter the problem in a nonlinear way, so it can be quite complicated. Getting a decent exploration of the prior was proving to be difficult for the MCMC, hence the need for parallel tempering. And nested sampling works a little bit differently, because it starts off by sampling the whole prior in the first place. So you take, say, a thousand points, they're called live points,

scatter them across the entire prior, and then compute the likelihood for every one of those. If you then eliminate the point that has the lowest likelihood and replace it with one that has a likelihood higher than the lowest one, then you still have a thousand points, and they will all have a likelihood higher than the worst one. And you can see that, roughly speaking, the volume of that remaining set of points will be about 999 thousandths of the original one, just by the law of large numbers.

And so if you repeat that process, always replacing the worst point at the next iteration, you'll have 999 thousandths of 999 thousandths of the original. And so eventually you'll shrink, in a geometric fashion, the volume that your points are contained within. And in doing so, you're walking uphill, moving towards the peak of the posterior. So what I liked about it is that it's guaranteed to terminate once you've completed the climb up, which was a nice feature.

And it gives you the evidence for doing model selection. Once you've done the entire chain, you can resample those points from the chain and weight them according to the posterior, to produce either independent posterior samples or weighted posterior samples. Yeah, so it's a really effective algorithm. I like it because it's reliable. And as I say, a run is guaranteed to finish. It might take a long time, but it will get there.
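To make the shrinkage argument concrete, here is a bare-bones sketch of the loop John describes. It is illustrative only: the rejection-sampling step stands in for the clever constrained proposals that real codes use, and `log_like` and `sample_prior` are assumed user-supplied functions:

```python
import numpy as np

def nested_sampling(log_like, sample_prior, n_live=1000, n_iter=5000):
    """Minimal nested sampling: shrink the prior volume geometrically
    while accumulating the evidence Z (integral of likelihood * prior)."""
    live = np.array([sample_prior() for _ in range(n_live)])
    logl = np.array([log_like(p) for p in live])
    dead_logl, log_wts = [], []
    log_vol = 0.0                            # log of remaining prior volume

    for _ in range(n_iter):
        worst = np.argmin(logl)              # lowest-likelihood live point
        # Volume shrinks by ~(n_live - 1)/n_live each iteration; the
        # discarded point carries the weight of the shell in between.
        log_vol_new = log_vol + np.log1p(-1.0 / n_live)
        log_wts.append(np.log(np.exp(log_vol) - np.exp(log_vol_new)))
        dead_logl.append(logl[worst])
        log_vol = log_vol_new
        # Replace the worst point with a prior draw above its likelihood.
        # (Real codes use MCMC, ellipsoids, or flows here; plain rejection
        # becomes hopeless as the constrained prior shrinks.)
        while True:
            new = sample_prior()
            new_logl = log_like(new)
            if new_logl > logl[worst]:
                break
        live[worst], logl[worst] = new, new_logl

    # Evidence estimate: sum over dead points of likelihood * shell weight.
    return np.logaddexp.reduce(np.array(dead_logl) + np.array(log_wts))
```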

There are, of course, places where it falls down. If you don't have enough live points, you can end up missing a mode. The challenge is really how you explore that constrained prior distribution, and over the years there have been different approaches to doing that. The one that I started out coding originally was using MCMC inside the nested sampling: just do a little MCMC chain to draw the next sample, which works fine, especially because we already knew how to do MCMC for this problem quite well.

But other people have invented other approaches; the ellipsoidal MultiNest algorithm was one of the first very popular off-the-shelf solutions, and that was used also for gravitational waves. These days there are more modern packages that do everything you need, either with MCMC or with slice sampling or more complicated things like normalizing flows. I should mention that most of the gravitational wave analyses now use dynesty, which is nested sampling. Myself and my students have also been working on one called nessai.

That's the name that connects nested sampling with artificial intelligence: it attempts to use some machine learning to accelerate this whole process. Well, that sounds like fun. Yeah, I'm definitely going to link to dynesty, the package you're using right now to do the nested sampling, in the show notes. And if you have anything you can share on this new package you're working on, for sure, please add that too; the listeners will be very interested.

And maybe you want to add a bit more about this project. So how would you use machine learning in this way to help you do the nested sampling? Yeah, I can say something about that. It's a cool idea. I mean, the enabling technology for this is a tool called a normalizing flow. I don't know if you've talked about it on the podcast before, but they are a way of approximating complicated distributions using simple ones with a remapping of the coordinate system.

So in that context, we were trying to make a good fit for the jump proposal for the sampler, if you like, because that has to evolve with the scale of the problem as the nested sampling proceeds. The mode shrinks, and it can shrink by a factor of 10 to the 20 over the course of the run. So you're going to need something adaptive to continue to have good efficiency. So we took this normalizing flow technique and applied it to this problem of fitting the existing samples.

And then the advantage is that it allows you to draw independent samples, a bit like the ellipsoidal technique, but it doesn't require a fixed shape, so it's able to fit more complicated shapes of distribution. Yeah, I'll pop the link in, and people are very welcome to give it a go. Yeah, for sure. So folks, give it a go, try it. If you see issues, report them on GitHub; even better, if you can do a PR, I'm sure John will appreciate it.
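Schematically, the flow-based proposal John describes might look like the sketch below. This is a conceptual illustration only: `flow` is a hypothetical object with fit/sample methods standing in for a real normalizing-flow library, and real implementations also correct for the mismatch between the flow density and the constrained prior:

```python
import numpy as np

def flow_proposal(live_points, threshold, log_like, flow):
    """Draw the next nested sampling point via a normalizing flow
    (conceptual sketch; `flow` is a hypothetical fit/sample object)."""
    # Re-fit the flow to the current live points, so the proposal keeps
    # tracking the shrinking (possibly multi-modal) constrained prior.
    flow.fit(np.asarray(live_points))
    while True:
        candidate = flow.sample()     # independent draw, no random walk
        # Nested sampling only accepts points above the current
        # likelihood threshold (that of the point being replaced).
        if log_like(candidate) > threshold:
            return candidate
```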

And actually, that could not be better, because I will refer people to episode 98 of the podcast, where I talked with Marylou Gabrié, who is one of the people developing these kinds of methods. And we talked exactly about that: adaptive MCMC augmented with normalizing flows.

We talked in that episode about how it offers a powerful approach, especially for sampling multimodal distributions, how it also scales the algorithm to higher dimensions, how you can handle discrete parameters, and all the ongoing challenges in the field. So if you're interested in the nitty-gritty details of what John just mentioned, I recommend listening to episode 98, because, well, Marylou is really one of the people developing all that stuff.

Sounds super interesting, Alex. I'm amazed at the power of some of these new techniques. There's a revolution going on at the moment in this area. It's a good time to be involved. Yeah, for sure. I will link to that. Also, Colin Carroll, who is one of the PyMC developers, is working on a new package called bayeux. And I know that they implemented this normalizing flow algorithm.

And so now you can use that in PyMC directly through bayeux, to use that kind of algorithm and handle your multi-dimensional, multi-modal distributions more easily. So I'll also link to that, because it's definitely super interesting if you have lots of weird distributions like that. And Christopher, to come back to you, you also mentioned that you guys do population inferences.

And those are hierarchical models, where you use a bunch of observations to infer the underlying distribution of the sources of the signal, if I understood correctly. So what does that look like? What do you guys do here? Yeah, so we do the calculation in a couple of stages, in that we always run the parameter estimation to get the event's parameters for just one signal at a time first. And so the result of that is a set of posterior samples calculated with a fiducial prior.

And what we want to do is then divide out that prior, put in a population model, and see how well that fits. So calculate, I guess, the evidence under the assumption of a particular set of hyperparameters. And then we have an inference one level up, where we vary the population parameters, the hyperparameters for the population model, and explore that to see which fits work. So that really is starting to get at the astrophysics.

So looking at the distribution of masses: are there more low-mass black holes than high-mass black holes? How does that scale? Are there little bumps in the distribution, and things like that? So, yeah, it's the next level up. The likelihood isn't quite as expensive as evaluating the waveforms, but we have some data handling issues of reading in all of the posterior samples.
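In code terms, the reweighting Christopher describes, together with the selection correction he turns to next, looks roughly like this. A hedged sketch of a standard hierarchical likelihood, with all densities passed in as assumed user-supplied functions:

```python
import numpy as np

def log_pop_likelihood(hyper, event_samples, fiducial_prior, pop_model,
                       log_selection):
    """Hierarchical (population) log-likelihood built from per-event
    posterior samples.

    event_samples:  list of arrays of posterior samples, one per event
    fiducial_prior: prior density used in the original parameter estimation
    pop_model:      population density p(theta | hyper)
    log_selection:  log of the detectable fraction for these hyperparameters
    """
    logl = 0.0
    for samples in event_samples:
        # Monte Carlo average of the population model over the event's
        # posterior, with the original fiducial prior divided out.
        weights = pop_model(samples, hyper) / fiducial_prior(samples)
        logl += np.log(np.mean(weights))
    # Selection correction: penalize hyperparameters under which we
    # should have detected many more sources than we actually saw.
    return logl - len(event_samples) * log_selection(hyper)
```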

And key to this, as I alluded to, is correcting for the selection effects: we need to account for the fact that, with our gravitational wave detectors, we can preferentially see some sources over other sources.

So if you were just to look at the distribution of sources that we detect, you'd see, hey, there are lots of 30 solar mass black holes, and there aren't too many 10 solar mass black holes. And if you didn't know about our selection effects, you might naively assume, okay, the universe is full of 30 solar mass black holes, and 10 solar mass ones are much rarer.

Whereas, because our detectors are more sensitive to the high-mass signals (those are intrinsically louder, so we can see them further away, so we can see more of them), once you correct for the selection effects, you actually see it's the other way around: there are many more,

at least there should be many more, 10 solar mass black holes than 30 solar mass black holes. And the fact that we don't see so many 50 solar mass black holes, 90 solar mass black holes, tells you that the distribution does drop off quite rapidly. So this is a field that's growing quite nicely as we get more and more detections; your uncertainties on the population basically shrink as the square root of the number of detections. What we're seeing a lot of work on is what one assumes for the population model.

So we started off with, I guess, following what is common in astronomy: we put a power law through for the masses, just inferred the power-law index and the normalization for the overall rate, and saw how that worked. Then we thought, that's a bit simplistic, let's add in a couple more parameters. Let's have, say, a little peak, a Gaussian, on top of that. Let's try two power laws with a break, see how those fit. Let's put in another peak.

And now people are looking at semi-parametric models. So, OK, what if we add a spline on top of that and see how we can vary it? Or what if we do something really flexible, allowing a bunch of kernels to come together, and infer the population from there? So a lot of the work at the moment is trying to see what is a good fit for the data, and then checking: is this overly complex? Are we overfitting? Is there a little bump there?

Is that just because of a Poisson fluctuation, given that we've only seen so many events? So small-number statistics means there are a few more here and a few fewer there. Or is there actually some feature of the underlying population, which may be a hint to how stars are formed? I think it's quite an interesting time at the moment, testing out models and trying to determine whether they fit the observations well.

And I'm very excited about getting the results of our upcoming observing runs, when we'll have a much larger number of detections and we'll really be able to constrain the models to higher accuracy and precision. Yeah, so that's super interesting. And so here, to understand what you're doing: it's like you're hearing different sounds and you're trying to infer not really what the sound is about, but what is emitting that sound. What is the source of that sound?

And the issue is that these sounds can be emitted by a lot of entities, and a lot of these sources you don't really care about because, you know, they are on Earth, they are noise. But what you're interested in are the sources outside, in space, which tell you something about the universe, which here would be mainly neutron stars and black holes colliding. How fair was that characterization?

Yes, I guess maybe a nice analogy might be, imagine you have a room full of people and you're trying to judge the composition of the room. And some of the people there, you have a bunch of librarians who are very quiet. And you have some heavy metal stars who are very, very loud. And so you've made your recording of the audio in the room, and then you need to try and reconstruct that. OK, I can only hear one librarian.

But given that the librarians are very quiet, there's probably a whole host of other librarians whom I just missed because they're being too quiet. And I can hear lots of electric guitars going on, so I know there are some rock stars here, but I know they're very loud and easy to hear, so I probably will have detected 100% of those. So you correct for that detection bias. We're very fortunate, actually, in gravitational wave detection that we can calculate our selection effects.

It's quite easy for us to determine what sources we can detect and what we can't. This is a long-standing problem in astronomy: we only have one universe, so we need to make sure we understand what we're seeing. And you can know what you detect, but it's very hard to know what you're not detecting. So a lot of astronomy is trying to correct for these effects.

And if you have a telescope, that can be very difficult, because you've got to calculate, OK, not just what did I see, but what could I have seen? That would depend on where I was pointing the telescope. It would depend on the weather on a particular day and how cloudy it was. Whereas with our gravitational wave detectors, it's much simpler. What we do is we inject; that's the terminology we use.

We simulate signals, put those into our data, run our detection pipelines on that, and see what fraction of the signals that we injected we would have recovered, and from that work out, as a function of source parameters, the probability that something was detected. And then use that in renormalizing our likelihood to establish: OK, how many of these sources should there have been, given that we saw this many?
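That injection campaign amounts to a Monte Carlo estimate of the detection probability. Schematically (a sketch: `sample_source` and `inject_and_search` are stand-ins for a reference population and a real search pipeline):

```python
import numpy as np

def detection_efficiency(sample_source, inject_and_search, n_inj=10_000):
    """Estimate the fraction of simulated signals a search recovers.

    sample_source:     draws source parameters from a reference population
    inject_and_search: returns True if an injected signal with those
                       parameters is recovered by the detection pipeline
    """
    params = [sample_source() for _ in range(n_inj)]
    found = np.array([inject_and_search(p) for p in params])
    # The efficiency as a function of parameters can then be estimated
    # by binning or fitting `found` against `params`; overall fraction:
    return found.mean()
```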

Okay, it helps a lot that gravitational waves are not blocked by anything in the universe that we know about, except for other black holes. But even then, other black holes tend to be very small. So when we are able to calculate exactly what the source is doing, it means that we've got a very good idea of what we will see. It doesn't really matter what's in the intervening space.

The two banes of astronomy are dust and magnetic fields, and gravitational waves just don't care about either of those two things. Yeah, okay. I see. And that's actually a good thing. Indeed, that's quite a luxury, to be able to compute your own selection bias. That's pretty amazing. Having done a lot of political science myself, where you usually cannot do that, I'm very jealous. And can you tell us where that noise actually comes from?

Because it seems like you're saying there is a lot of noise in your observations; thankfully, you are able to tame it somewhat easily. Can you tell us a bit more about that? And John, it seems like you want to add something about that. Most of the noise, all of the noise really, is not of extraterrestrial origin. It's coming from the detectors and from the environment around the detectors. So in order to understand that, you have to know a little bit about how LIGO works.

So imagine a giant L shape, four kilometres long, with mirrors at the ends of the arms, and you shine a laser into the corner, if you like. It gets split into two and sent down both arms, bounces off the mirrors at the end, and then recombines. And if the arms are the same length, then the light will constructively interfere, or destructively, I may have that the wrong way round.

The point is, if the arms are at different lengths, or if they're changing length, then the pattern of the light that comes out will change over time. So we are really worried about anything that can change that output of the laser in the detector. And that could be due to the laser itself. All lasers have some noise in them. So the lasers used in these detectors are some of the most stable lasers in existence, invented from scratch basically for this purpose.

It could be the thermal motion of the atoms in the mirrors themselves. We get around that by having a wide enough laser beam probing the whole surface of the mirror, averaging out that motion to a low enough level. But the laser also carries energy, and that energy pushes on the mirrors as radiation pressure, which causes the mirrors to move a little bit.

And then, if you think about it, the laser energy is carried by photons, which are ultimately quantum objects, so they arrive discretely. Kind of like raindrops on a roof, if you imagine, or if you're in a tent, you hear individual drops of rain. That's kind of what it's like. The lasers are enormously powerful, but still they are made of individual photons. And so there's a shot noise associated with them, just due to the statistical fluctuation in the number of photons that are arriving per second.

Then we've got the environment as well, which is especially dominant at low frequencies. So we can't sense anything below about 10 Hertz with these detectors that are on the ground, because of seismic motion. Now, we do have a lot of techniques to try and screen the mirrors from the motion of the Earth. They're hung as suspended optics, which act as a natural filter to prevent ground motion from propagating through to the mirror.

But even so, we need to have active isolation systems as well. And on top of all of that, even if you manage to screen out all the mechanical coupling, there's unfortunately the gravitational coupling, which we can't screen out, because we actually want to measure gravity in the first place. So if you imagine a seismic wave as a pressure wave in the rock: where the pressure is high, the rock is actually compressed slightly. And because it's compressed, it's denser than average.

And because it's denser than average, it exerts a gravitational pull on the mirrors that tends to pull them along with the seismic waves. So there's this tiny effect, I mean, you've probably never even thought about it, but it's there: a small gravitational coupling of seismic waves to the detector. And you can't really get around these things at all on the Earth.

And so that's why one of the challenges we're working on at the moment is looking at sending a detector into space, which is hopefully going to open up a whole new range of objects for us to look at. Yeah, thanks a lot, John. That's definitely clear, and I didn't have, indeed, any idea of all these sources of noise. It's pretty incredible that we're able to filter all that out, knowing that the signals you're looking at are already so weak.

So it feels pretty incredible to still be able to do it, even though the signals are weak and buried in noise. It's really amazing; the technology that is required to do these experiments took decades and decades for people to develop, and almost all aspects of the detectors had to be invented for that purpose.

There's very little off-the-shelf technology, and of course the spinoffs from that are then taken up in other areas, but it's the pure science that was driving the development all along. Yeah, exactly. It's not even as if all the engineering for these was already available and you could just go on Amazon and buy it, right? Everything has to be developed custom for this, and you don't even know if it's going to work before you actually try it out.

So all these endeavors are absolutely incredible. And that makes me think, and I think on this, Christopher, you will have stuff to add. Because, if I understood correctly, all these detectors that we have right now, these gravitational wave detectors, are on Earth. Hopefully, we'll be able to do a video documentary on Learning Bayes Stats in one of these detectors. That's just some of the backstage I'm telling the listeners. We'll see if that's possible.

But, so these detectors are on Earth. If you went to space and were able to put one of these detectors around the Earth, or, I don't know, floating somewhere in space, I'm guessing that solves these problems, even though there are other sources of issues if you do that in space. But if I understood correctly, the LISA mission is space-based. And so is that a way of doing that? Can you tell us a bit more about that, Christopher? And... Yeah, mainly tell us what the discoveries will be with that.

Also the data analysis problems that it will engender, especially when it comes to the size of the data, I'm guessing. Yeah. So LISA is a space-based gravitational wave mission; it's led by the European Space Agency, with NASA as a junior partner there.

And the idea is we launch a constellation of satellites, three satellites that will orbit around the Sun, lagging behind the Earth in a triangular formation, and we bounce lasers between them to make the same sort of measurements that we do for gravitational waves, but over a much larger scale, so really massive arms. So this is great because we can avoid the ground-based noise that John mentioned, and this is really good.

So for LISA, we're not trying to see exactly the same sources as with our ground-based detectors; we're trying to look for lower frequencies. One of the things we've learned in astronomy over the last century or so is that each time you observe the universe in a new way, you discover new things. So we want to look at a different part of the spectrum of gravitational waves. LISA is most sensitive in the millihertz range, so much lower frequencies.

And a much lower frequency gravitational wave corresponds to a bigger source. So these could be the same type of binary, but just much further apart in their orbit, so much earlier, before they come in and merge. Or we could be looking at much more massive objects, so massive black holes. We believe at the center of every galaxy is a massive black hole. Our own galaxy has one of about four million solar masses, four million times the mass of our Sun.

And we think galaxies merge, and so the massive black holes should merge too. And so we'd be able to see these out to a much greater distance. So LISA's objective is to see what we can observe in the gravitational wave sky at these much lower frequencies. And there's a whole host of different sources. These massive black hole mergers we should be able to see across the entire history of the universe.

We should also be able to see regular stellar-mass black holes, black holes formed from stars at the end of their lives, spiraling into these supermassive black holes. It's a topic I've studied quite a lot. Those signals are extremely complicated. The orbits they undergo are very intricate, which is great if we observe one, because we can measure the parameters to tiny, tiny precision, to one part in a million, something like that. But it's a huge pain from a data analysis point of view, because you've got to find the part of parameter space where this is.

And we're also going to see huge numbers of binaries in our own galaxy: white dwarf binaries, maybe neutron star-white dwarf binaries, so the wide binaries here. And so the real data analysis problem for LISA will be how to fit all of this information all at once, because with our ground-based detectors, at least at the moment, we basically just see here's a signal, and then here's another signal. So you can analyze each signal in isolation. With LISA, you cannot: you see everything all at the same time.

Some of these signals, like the supermassive black hole mergers, might be quite short by comparison, quite localized in time, but they will still be overlapping the long-lived ones. The inspiraling objects or the very wide binaries will basically be there for the entire mission, or a large fraction of the mission. So to analyze the data, you need to fit everything at once; this is what we call a global fit problem.

And you So you potentially have hundreds of thousands of sources, each with a dozen parameters or so, maybe less than simpler sources. But you've got to do all of these all at the same time. And it potentially does matter how you do this, because things like the massive black hole binaries are extremely loud, so signal -to -noise ratios of thousands.

So if you get that wrong by just a small percentage, the residual power in your data stream would be enough to bias your measurements of the quieter signals underneath. So this is a huge problem, I think possibly the most complicated data analysis problem in astronomy, and we're just starting to figure out how we're going to tackle it.

So yeah, space-based detectors are, I think, extremely exciting: a whole host of new sources that we can see, a new host of astrophysics that we can unlock through these observations, but also some extremely complicated data analysis challenges that need to be tackled and solved before the mission launches in the 2030s. And what's the timeline on this mission? Are we close to launch? Where are things right now? So just in the last couple of months, the mission was approved by ESA.

So that's them looking at the designs and going, OK, we think we can build this. And now the serious work of putting it together begins. So it's due to launch in the 2030s; exactly when that will be, I'm not sure anyone is very confident about, because we know space-based missions are hard. So maybe it's a little early to say exactly what date it will launch. But it will go up, and then there'll be a little period of commissioning, and then it will start observing.

So in the late 2030s, we should hopefully get the observations from it. The current timeline is 2035 for launch, which I guess is good news for any of your listeners who are inspired by the problems we're talking about, think this is really cool, and think that maybe they'd like to tackle these problems. There's certainly enough time to go out, get a degree, and start a PhD in the field before we get the real data. Yeah, for sure. Exactly.

And also, historically, these kinds of huge missions tend to get delayed a bit. So, you know, you can still start your PhD on this. And I mean, it's better to launch later than to launch on time but have a mission that fails, right? Yes. We're talking a billion-euro cost for these things, so you definitely don't want it to explode on the launch pad. Exactly. Way better to take a few more months and do some double checks than to launch just because we said we would launch on that arbitrary date.

Yeah, the space agencies do take these things seriously. It's been fascinating seeing all the things that needed to be rubber-stamped to get the approval for the mission. So very good work by the people getting that done. There are also other proposed space-based missions, some potential ones in China. There's a potential follow-up mission, I guess, slightly further in the future, maybe in Japan, that's been proposed for a few years.

As for the status of these, I guess it's difficult getting the funding for these things. So I think it's an exciting time in the field. Hopefully we'll expand the range of gravitational waves we can detect, and that'll be great. Yeah, yeah, for sure. And I mean, that must be... So I don't know how directly involved you are in the launch, but I'm guessing that if you're still working on this when the mission launches, I'm pretty sure that on the day of the launch you will be pretty nervous and excited.

Have you already lived through that, actually, or would that be new to you? So I mean, the closest analogy would be that there was a technology mission to test some of the key components of LISA, called LISA Pathfinder, that went up a few years ago, an extremely successful mission. And so I was watching that from the sidelines. My PhD was on LISA; if this mission didn't work, then there'd be no LISA mission, so all my PhD work would be in vain.

But thankfully, it worked very well, better than what was hoped for, in fact. So that was great. And I guess that's a real testament to the experimentalists. I say I was feeling worried because it was my PhD work, but there are people in the field who have spent their entire careers working on this technology, you know, multiple decades.

So it's a real testament to their determination, I guess, their vision, going into a field right at the beginning, before anything worked, to look at these things. It's also honestly quite remarkable that we somehow managed to convince the funding agencies to fund these things for so long before there would be scientific returns. So, yeah, we're extremely grateful that they had the forethought and the patience to invest in something so long before it would give returns. Yeah, definitely.

Yeah, that must be absolutely fascinating. John, anything you want to add on that? I think Christopher has given a great overview of LISA, which indeed will be an enormous challenge. On the ground, there are also plans to take things forward into the 2030s and beyond. Currently, there are two major detectors in the kind of scoping and design stage. One is led by the Europeans, called the Einstein Telescope, and the other one is led by the US, called Cosmic Explorer.

They're taking different approaches, but they're both ground-based detectors. The challenge there is to lower the noise floor, giving them a sort of order-of-magnitude improvement in the range that you can see things to, which translates to a roughly thousand-fold increase in the volume that you can survey, more or less. At these kinds of distances, you do actually have to worry about the size of the universe getting in the way of these calculations.

But yeah, these new experiments will require new infrastructure, so they're also going to require a new batch of investment at the national and, indeed, European level. A lot of the data analysis challenges for those are kind of similar to the ones that we're tackling with the current generation of ground-based detectors. But the major difference is that the signals will be much longer, because the low-frequency end is really the target for improvement.

That changes the way that the binaries chirp. I mean, okay, I told you that they sort of make this characteristic 'whoop' type noise; maybe you can find a sample and play it in place of my pale imitation. The lower in frequency you start, the longer the signal will be. That multiplies the amount of data that you have to analyze, which, with a Bayesian problem, can be a bit challenging.

If you're doing many millions of likelihood evaluations, you don't want each likelihood evaluation to be expensive. And also the signal-to-noise ratio will be huge, at least a factor of 10 higher. So you will run into problems with our uncertainties on the nature of the sources. The models that we have are very good theoretical models at the moment, and they're good enough for the current generation of detectors, but they will break down once observations become good enough.

The cracks will probably show in the theories, which, I should say, is probably not a fundamental problem with the theory. I think most people would probably put their money on general relativity being correct. The problem is that there is a translation layer between general relativity and the types of templates we can use, which requires approximations and shortcuts and models to be created.

So there are challenges with waveform modeling that are quite difficult to overcome, and people are researching that as well at the moment. Yeah, fantastic. Thanks a lot, guys. That's really fantastic to have all these overviews of the missions. And actually, I'm wondering: with all the work you've been doing, all these studies that you've been talking about since we started recording, we've actually been able to study what we wanted to, right?

So study the astrophysics of black holes, and also run some tests of general relativity, as you were saying, Christopher. Can you tell us about that, and mainly, what are the current frontiers on those fronts? What are we trying to learn with the current missions? That's a big question. So for general relativity, I guess, we really want to find somewhere where it doesn't work.

From the point of view of understanding gravity, there's this tension within physics: how do you reconcile general relativity with quantum theory? That is rather tricky, and there's a whole host of different theoretical frameworks that try to reconcile this. But we don't know for certain what the answer is, and finding some hint of where general relativity breaks down would give a pointer in the right direction. Of course, finding a place where general relativity breaks down is very difficult.

The place where I think it makes sense to look most is the most extreme environments. So where is gravity strongest? Where is the spacetime most dynamical? Where do things change the quickest? So black hole mergers, and the gravitational wave signals they emit, are, I think, really the best place to look for that. So that's why we're looking there.

And what we'd really love to find is some deviation from general relativity that we could actually be certain is a deviation from general relativity, and not just a noise artifact. So we're pursuing a whole host of different things to look for deviations there. On the astrophysics side, there's just so much we don't know about the progenitors of these sources. So how do we end up with black holes and neutron stars? Stars are pretty important in astronomy.

Exactly how they work is kind of complicated, so there's a lot of uncertainty in that. And I think it's really quite remarkable how rapidly the field has progressed. Back in 2015, before we made our first detection, it wasn't at all certain that we would find pairs of black holes orbiting each other and merging. We knew there would be neutron stars. But we didn't know about the black holes, because we'd never seen them. They're really hard to see other than with gravitational waves.

That's kind of why we built the gravitational wave detectors. But we hadn't seen any of them. So our first detection confirmed: yes, they exist. And they exist in sufficient numbers that we can actually detect them. And then the follow-up was when we measured the masses: they were about 30 times the mass of our Sun. We'd never seen black holes in that mass range before. We now know, yep, there are quite a few of them.

But whether you can form black holes that big tells you something about the way that stars live, how much mass they lose through their lifetimes. So that's a key uncertainty we don't really understand about how stars evolve. Now, as we build up statistics, we're really teasing out the details of the mass distribution: what is the biggest black hole that you can build? Currently, we know there are these black holes that form from stars collapsing.

And we know there are these massive black holes of millions of solar masses, the lightest ones hundreds of thousands or tens of thousands of solar masses. But we don't know: is there a continuous distribution of black holes in between? Are there black holes of hundreds or thousands of solar masses? So that's one of the key things to figure out: where do these really big, massive black holes come from? And how do stars evolve?
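(For the Bayesian-minded, here is a deliberately stripped-down illustration in PyMC of what "teasing out the mass distribution" means: fitting a truncated power law m^(-alpha) to a handful of made-up black hole masses. Real population analyses are hierarchical, folding in each event's full mass posterior plus selection effects, so the masses, the fixed bounds, and the whole setup below are illustrative assumptions only.)

```python
import numpy as np
import pymc as pm

# Illustrative point-estimate masses in solar masses (not real catalogue values).
m_obs = np.array([35.6, 30.6, 23.2, 13.6, 8.9, 30.8, 10.9, 25.3])
m_min, m_max = 5.0, 80.0  # assumed fixed bounds of the power law

with pm.Model() as mass_model:
    # Power-law slope; lower bound keeps (alpha - 1) away from zero.
    alpha = pm.Uniform("alpha", lower=1.1, upper=6.0)

    # Log normalisation of p(m) = m^(-alpha) / Z on [m_min, m_max].
    log_Z = pm.math.log(
        (m_min ** (1.0 - alpha) - m_max ** (1.0 - alpha)) / (alpha - 1.0)
    )

    # Sum of log-densities of the observed masses under the power law.
    pm.Potential("loglike", -alpha * pm.math.log(m_obs).sum() - m_obs.size * log_Z)

    idata = pm.sample(1000, tune=1000, chains=4)

print("posterior mean slope:", float(idata.posterior["alpha"].mean()))
```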

Of all the different ways people have theorized that you could end up with massive black holes, which ones are correct? And in what ratios do they occur out there? And then, I guess, one additional key thing: we've talked about black holes and the nature of gravity. We've talked about how you form black holes and neutron stars. But there's also the question of what neutron stars are really made of. Neutron stars, as the name might suggest, are made of very neutron-rich matter.

But actually, inside the core of a neutron star, we get a whole host of different phase changes, really quite exotic matter that we can't hope to replicate in the lab here on Earth. So we really don't know how this matter behaves. If we did, that would be really informative for understanding the dynamics of the particles that make it up.

So by making measurements of the neutron stars we observe, how much they stretch and squeeze, we can hopefully get some constraints on what neutron stars are made of, which would be an exciting frontier there. John? One thing I think we can do is zoom out from the individual black holes and neutron stars. Still with the theme of trying to understand gravity, at the other end of the scale there's cosmology, the very largest scales: how is the universe evolving over time?

Hopefully with the current generation and the next generation, we'll be able to do cosmology in a completely different way than we have done up until now. Because we know exactly what the gravitational wave signals look like, their amplitude and how it decays with distance, they can be used as an independent probe of cosmology.

Now, we've already done this with the binary neutron star signal and with the black holes we've seen up to now, with relatively low numbers of sources, such that the constraints we're able to make are not yet competitive with the best constraints from other techniques. But going forward, as the numbers grow and the signal-to-noise ratios improve, this is going to get better and better over time.
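(To make the "independent probe of cosmology" idea concrete, here is a toy standard-siren sketch in Python. The gravitational wave signal yields a luminosity distance directly; if an electromagnetic counterpart pins down the host galaxy's redshift, then at low redshift d_L ≈ cz/H0 gives the Hubble constant. The numbers below are invented, loosely GW170817-like, and a real analysis must handle peculiar velocities, the distance-inclination degeneracy, and selection effects.)

```python
import numpy as np

rng = np.random.default_rng(0)
c_kms = 299_792.458  # speed of light in km/s

# Pretend posterior draws for the luminosity distance from the GW signal (Mpc).
d_L = rng.normal(loc=40.0, scale=7.0, size=100_000)
d_L = d_L[d_L > 0]  # keep physical distances only

z = 0.0098  # illustrative host-galaxy recession redshift

# Low-redshift Hubble law: d_L ≈ c z / H0  =>  H0 ≈ c z / d_L, per posterior draw.
H0 = c_kms * z / d_L

lo, med, hi = np.percentile(H0, [16, 50, 84])
print(f"H0 ≈ {med:.0f} +{hi - med:.0f} / -{med - lo:.0f} km/s/Mpc")
```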

And so even if we don't see anything on the scale of the individual black holes, if this agrees with general relativity, it could still help us enormously to pin down what's going on with cosmology, where there are many things we don't understand, including discrepancies in the existing constraints we have. Nice one. And I'm curious, among all of these burning issues, burning questions, if you could choose one that you're sure you're going to get the answer to before you die, what would it be?

I don't know how long I'm going to live, but the thing that really motivates me is trying to understand whether the black holes we're seeing really are the things that you can write down with pencil and paper when you're teaching people general relativity, or whether they're more complicated than that in reality. If there were one problem I had to choose in this field, that would be the one I find the most interesting. I think I'd really like to know the answer to that one as well.

I think that might be one of the most challenging to actually get the solution to. The best way to answer it might be to travel into a black hole. But then the question of whether you observe anything before you die becomes rather technical. Yeah. Certainly not something I'd advise your listeners to give a go. Yeah, I'm not sure it would end up like Matthew McConaughey in, what's the movie? You know the one? Interstellar. So Kip Thorne is one of the founders of LIGO.

One of the recipients of the Nobel Prize for the detection of gravitational waves. He's behind Interstellar, so he advised on a lot of it. Yeah, the bit at the end is not backed up by science. For sure. At least for now. They originally were going to have the wormhole that opens up in Interstellar detected with gravitational waves at LIGO. Unfortunately, Christopher Nolan cut that bit. Oh, that's a shame. It wasn't in the film. Maybe that would be for Interstellar 2.

We don't know. So guys, thanks a lot. I've already taken a lot of your time. And I could still talk with you for

hours, because this is really, really fascinating. But it's time to call it a show. Before that, though, as usual, I'm going to ask you the two questions I ask every guest at the end of the show. First one: if you had unlimited time and resources, which problem would you try to solve? Who wants to start? I think if you're really serious about the unlimited time and resources, then the most pressing problem would be nothing to do with gravitational waves; it's more to do with climate breakdown.

So if you want an honest answer, that's my answer: solve climate change. That's a very popular answer. Get nuclear fusion working, that would be very nice. In our field, with infinite resources, I'd tackle the quantum theory of gravity and get the evidence for that. That would be nice. Yeah, definitely. That is a great answer. And some other people have answered that too, so you're in good company, Christopher.

And second question: if you could have dinner with any great scientific mind, dead, alive or fictional, who would it be? Maybe it's the only answer for this podcast, but E. T. Jaynes would be my choice. You may know him if you're a Bayesian. Yeah. I think he would be very good dinner company. His textbook was one of the formative influences on me as a young Bayesian. Yeah. Yeah, for sure. And there is a really great YouTube

series playlist by Aubrey Clayton, who was here on episode 51. Aubrey Clayton wrote a book called Bernoulli's Fallacy: Statistical Illogic and the Crisis of Modern Science, a really interesting book. I'll link to the episode and also to his YouTube series where he goes through E. T. Jaynes's book, Probability Theory: The Logic of Science, I think it's called, which is a really great book, also really well written. Aubrey goes through its chapters and explains the different ideas and so on.

So that's also a very fun YouTube playlist if you want. I'm definitely going to go and look that up. Awesome, yeah, I'll send that your way. And Christopher? One of my favorite books, yeah. I don't know, I think I might be somewhat boring and just go for Einstein, the father of our modern understanding of gravity. I think he'd like to know what we're up to. And also just to hear his thoughts about being such a public intellectual and what that was like; that would be cool.

If I could invite a guest, it might be interesting to get Newton along as well, and see what they think about gravity. But I get the feeling that would be quite an awkward conversation; he was not the most socially interactive. Yeah, yeah.

Do you think Einstein would accept, at this point, all the advances, all the ramifications of general relativity and the crazy predictions it was making? In the end, most of them, at least for now, have turned out to be true, but at the end of his career he wasn't really accepting that. Do you think he would accept it now?

I think he would accept general relativity, and he would be delighted to find that we've seen some of the effects that he never thought would be observed. And then again, he himself knew that general relativity couldn't be the final answer to the description of gravity. So he'd probably also be interested to know whether we've seen any signs of it breaking down. And I think the stuff that motivated him towards the end of his career is probably still what's motivating a lot of people.

Well, if you are invited to such a dinner, please let me know and I will gladly come. Awesome, guys. I think it's time to call it a show. You've been wonderful. Thanks a lot for taking so much time. As usual, I will put resources and links to your websites in the show notes for those who want to dig deeper. The show notes are huge for this episode, I can already warn listeners, so lots of things to look at. And well, thank you again, Chris and John, for taking the time and being on this show.

Thank you very much. May I put in one thing that your listeners might like? If they're interested in trying gravitational wave data analysis, the data are public. They can look up the Gravitational Wave Open Science Center and download the data there. They'll also find links to tutorials, and there are workshops held fairly regularly that they can sign up to to get some data analysis experience.

And there's a whole list of open-source packages for gravitational wave data analysis linked from those, so they can go and have a look themselves. Yeah, this is indeed a very good ad. Thank you very much, Christopher. I actually already put these links in the show notes and forgot to mention them, so thank you very much, because we're all very dedicated to open source and open science here.
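(As a concrete starting point, here is a minimal sketch using GWpy, one of the open-source packages in that ecosystem, to pull public strain data from the Gravitational Wave Open Science Center around GW150914. The times and filter band follow the standard tutorial values; check the GWOSC tutorials for the recommended up-to-date workflow.)

```python
from gwpy.timeseries import TimeSeries  # pip install gwpy

# Fetch 32 s of public LIGO Hanford strain data around GW150914 from GWOSC.
data = TimeSeries.fetch_open_data("H1", 1126259446, 1126259478)

# Band-pass to the detector's sensitive band and notch out the 60 Hz mains line.
filtered = data.bandpass(50, 250).notch(60)

# Zoom in on the second containing the merger and plot the filtered strain.
plot = filtered.crop(1126259462, 1126259463).plot()
plot.show()
```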

So if any of the listeners are interested in that, in how these things are done, you have all the packages we've mentioned in the show notes, but also the open-source and open-science efforts from your collaborations, Christopher and John. So definitely take a look at the show notes; everything is in there. Thank you, guys. And well, you can come back on the podcast any time, hopefully around 2034, to talk about LISA and the space-based mission. I'll put it in my calendar.
