Ask LISA’s Data Analysts

Do you have questions about data analysis after the satellites have left Earth orbit? Some questions have already been asked and answered; the complete list of FAQ entries appears below.

How much data will LISA generate and how will it get to the ground?

LISA’s data and telemetry requirements are relatively modest when compared to many other astrophysics missions. While the precise details are being developed as part of the mission formulation process, the rough numbers are known. During normal operations, only one of the three LISA spacecraft will be in contact with the ground. In addition to transmitting its own data, that spacecraft will serve as a relay for data from the other two spacecraft, which will share data over a dedicated inter-constellation link. This is efficient because the separation between spacecraft (2.5 million km) is roughly 20 times shorter than the distance to Earth (approximately 50 million km).

The required data rate to Earth is approximately 150 kbps, or about the speed of a good household modem in the late 1990s. Daily contact will be made with the constellation for a period of roughly 8 hours, resulting in an aggregate data rate of roughly 4 GB/day. This will include the primary outputs from the science instrument, auxiliary channels used to monitor the science instrument, and general spacecraft housekeeping data for the full constellation.

This data will be processed on the ground to produce LISA’s basic measurement product, the time-delay interferometry (TDI) variables, which contain the full set of gravitational wave signals in the LISA band as well as residual instrumental noise. The four fundamental TDI variables will be sampled at a rate of a few Hertz, resulting in a data rate of roughly 60 MB/day. The TDI data will be used to generate further downstream products such as source catalogs, alerts, etc.
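For a rough sense of scale, here is a back-of-envelope sketch in Python using the figures quoted above. The 4 Hz sampling rate and 8-byte samples are illustrative assumptions, not mission specifications; the quoted ~60 MB/day presumably also covers packaging, metadata and higher-rate diagnostic channels beyond this minimal count.

```python
# Illustrative arithmetic on the round numbers quoted above.
arm_km, earth_km = 2.5e6, 50e6
print(earth_km / arm_km)            # -> 20.0: the inter-spacecraft link
                                    # is ~20x shorter than the link to Earth

# Minimal size of the TDI product: four variables at a few Hz.
# 4 Hz and 8-byte double-precision samples are assumptions for illustration.
n_vars, rate_hz, bytes_per_sample = 4, 4, 8
tdi_mb_per_day = n_vars * rate_hz * bytes_per_sample * 86_400 / 1e6
print(f"{tdi_mb_per_day:.0f} MB/day")   # ~11 MB/day before overheads
```

The gap between this minimal estimate and the quoted ~60 MB/day gives an idea of how much of the product is bookkeeping rather than raw samples.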

What is Time Delay Interferometry (TDI) and how does it work?

Interferometry is a technique that uses the interference of waves to make precise measurements. The wavelength of the interfering waves acts like the tick marks on a ruler for measuring distance. Optical interferometers can make very precise measurements because the wavelength of the light waves they use is small — around one micron for instruments like LIGO and LISA. A fundamental limitation of interferometry is that the precision of the measurement is limited by the stability of the waves used in the interferometer. For an optical interferometer, if the wavelength of the light fluctuates, a spurious signal will be generated that mimics physical motion. One way to mitigate the effect of a fluctuating source is to compare pairs of distances using a common light source. This is the underlying concept of the Michelson interferometer that Albert Michelson and Edward Morley used to search for the “luminiferous aether” in the late 19th century, and LIGO uses the same concept in its interferometers over a century later. In order for this technique to work, however, the lengths of the light paths must be precisely matched. While LISA’s orbits produce approximately equal arms, they differ by up to a percent and fluctuate by almost the same amount over long time periods due to orbital mechanics.

Time Delay Interferometry (TDI) is a technique that was developed in the late 1990s and early 2000s to allow LISA to take advantage of this “common mode rejection” effect despite having unequal arms. TDI exploits the fact that LISA measures the interference in each one-way laser link individually. While each of these signals is dominated by fluctuations in the LISA laser wavelength, those same fluctuations are measured at multiple points in the LISA constellation with varying time delays. By combining these individual measurements, correcting for the time delays, and adding in some rough knowledge of the constellation geometry, a significant amount of suppression of laser wavelength noise can be achieved. The ability to suppress laser wavelength noise through TDI is primarily determined by the precision of the individual interference measurements and the accuracy of the estimates of the LISA arm lengths. TDI has been examined extensively in analytic studies, numerical simulations, and experimental analogues and has been demonstrated to work as expected. The LISA team continues to refine its understanding of this important technique to ensure that it will provide the sensitivity that LISA requires to achieve its science goals.
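As a minimal numerical sketch of the idea (not the full LISA formulation, which involves six laser links and three spacecraft), the snippet below simulates laser phase noise, forms the round-trip measurements of an unequal-arm Michelson, and shows that a first-generation TDI X-style combination cancels the laser noise down to floating-point rounding. Delay values, the noise model, and the use of integer-sample delays are all illustrative simplifications; in practice fractional-delay interpolation is required.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
p = np.cumsum(rng.normal(size=n))   # random walk as a stand-in for laser phase noise

d1, d2 = 330, 317                   # unequal round-trip delays of the two arms, in samples

def delay(x, k):
    """Delay a series by k samples (zero-padded at the start)."""
    out = np.zeros_like(x)
    out[k:] = x[: len(x) - k]
    return out

# Round-trip phase comparisons along the two arms, each dominated by p:
y1 = delay(p, d1) - p
y2 = delay(p, d2) - p

# A plain Michelson difference leaves a large residual p(t-d1) - p(t-d2)
# when the arms are unequal:
michelson = y1 - y2

# TDI X-style combination: delay each arm's signal by the *other* arm's
# round trip and difference again; every copy of p cancels algebraically.
X = (y1 - y2) - (delay(y1, d2) - delay(y2, d1))

burn = d1 + d2                      # discard the zero-padded start-up transient
print(np.abs(michelson[burn:]).max())   # order-unity residual laser noise
print(np.abs(X[burn:]).max())           # ~1e-12: machine-precision cancellation
```

The combination works because each term of the laser noise appears twice with opposite sign once the cross-delays are applied; in the real instrument the delays must be estimated from ranging measurements, which is why the achievable suppression depends on the arm-length knowledge mentioned above.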

How does LISA localize sources and how well will it do so?

LISA is an all-sky instrument, with a sensitivity to gravitational waves that depends only weakly on the location of the source in the sky. Localization of individual sources comes from two main effects. The first is the motion of the LISA constellation around the Sun, which introduces shifts in both frequency (Doppler effect) and amplitude (sweeping the LISA sensitivity pattern across the sky). These shifts encode information about the sky position of the source in the waveform that LISA observes. Since most LISA sources are observed for months or years, there is sufficient modulation to provide localization. The second effect is that, for the higher-frequency sources that LISA observes, the wavelength of the gravitational waves is similar to or smaller than the size of the LISA constellation. This means that different parts of the constellation experience the gravitational wave at slightly different times, which again encodes information about the location of the source.

The precision of LISA’s localization of a particular source depends on many factors including the type of source, the particular parameters of the source, and the duration of the observation. For the best-localized sources, the final localizations may be on the order of a few arcminutes. Degree-scale localization will be more typical, and the more numerous faint sources will be localized less well. Interestingly, LISA’s localization of a particular source will improve over time, which will open up some novel observing strategies for potential electromagnetic (EM) counterparts of events such as mergers of massive black holes.
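The scales involved can be illustrated with some rough arithmetic (the numbers below are order-of-magnitude illustrations, assuming a 2.5 million km arm and an Earth-like heliocentric orbit):

```python
C = 299_792_458.0        # speed of light, m/s
ARM = 2.5e9              # LISA arm length, m

# Frequency at which one gravitational wavelength equals the arm length:
# above this, timing differences across the constellation aid localization.
f_arm = C / ARM
print(f"wavelength = arm length at f ~ {f_arm * 1e3:.0f} mHz")

# Maximum arrival-time difference across the constellation:
print(f"max timing offset across constellation ~ {ARM / C:.1f} s")

# Fractional annual Doppler modulation from the heliocentric orbit
# (~29.8 km/s orbital speed, the Earth-like value):
V_ORBIT = 29_800.0
print(f"Doppler modulation depth ~ {V_ORBIT / C:.1e}")
```

So timing information starts to help for sources above roughly 0.1 Hz, while the slow annual Doppler and amplitude modulations operate at all frequencies for long-lived sources.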

How can we extract science from the data?

Key components of LISA’s data analysis are the ability to create high-fidelity waveforms for gravitational wave sources, a well-understood signal simulator for the mission, and the ability to extract source parameters from the simulated signals. In the so-called Mock LISA Data Challenges (MLDC), scientists have already demonstrated the feasibility of LISA data analysis.
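As a toy illustration of the parameter-extraction step, the sketch below injects a simulated monochromatic signal into noise and recovers its frequency by correlating the data against a bank of templates. Everything here is illustrative: real LISA analyses use far richer waveform models, proper noise weighting, and Bayesian inference rather than a simple grid search.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, T = 10.0, 100.0                       # sample rate (Hz) and duration (s)
t = np.arange(0, T, 1 / fs)

def waveform(f):
    """Toy monochromatic 'source' -- a stand-in for a real waveform model."""
    return np.sin(2 * np.pi * f * t)

f_true = 0.31                              # hidden source parameter
data = waveform(f_true) + rng.normal(scale=2.0, size=t.size)  # signal + noise

# Parameter extraction: correlate the noisy data against a template bank
freqs = np.arange(0.05, 0.5, 0.001)
snr = [np.dot(data, waveform(f)) for f in freqs]
f_best = freqs[int(np.argmax(snr))]
print(f"recovered frequency: {f_best:.3f} Hz")
```

Even with noise twice the signal amplitude, the long integration time lets the correlation pick out the correct template, which is the same principle that allows LISA to dig long-lived sources out of the instrumental noise.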

How are the scientific products being delivered?

In strong interaction with the Data Analysis Groups of the LISA Consortium, the DPC will implement, execute and control the data analysis pipelines and deliver the scientific products (such as catalogues of identified gravitational waves) to the consortium. To do so, its main focus will be on developing tools to support software development, testing and validation; pipeline integration and deployment on computing infrastructures; data management, tracing and archiving; and simulation activities.

What are LISA Data Challenges (LDC)?

LDCs are blind challenges of increasing complexity, ranging from a few sources in the first challenge to the full combination of all likely sources in the data stream in the most recent fourth challenge. Scientific research groups from all over the world have developed, tested and implemented a wide variety of analysis techniques. As a result, the proof of concept for LISA data analysis has been thoroughly tested and is ready to go.

What are data analysis pipelines?

A data pipeline is a set of automated processing steps that ingest raw data from disparate sources and move the data to a destination for storage and analysis. A pipeline may also include filtering stages and features that provide resiliency against failure. In LISA’s case, the pipelines will transform the raw telemetry into TDI data streams and, from those, generate downstream products such as source catalogues and alerts.
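A minimal sketch of the ingest-filter-store pattern described above (purely illustrative, with made-up records; real pipelines operate on telemetry packets and instrument channels):

```python
# Toy three-stage pipeline: ingest -> clean -> store.
raw = ["3.1", "oops", "2.7", "", "4.0"]        # messy records from disparate sources

def ingest(records):
    """Normalize raw records as they arrive."""
    for r in records:
        yield r.strip()

def clean(records):
    """Validate records; resiliency means a bad record skips, not halts."""
    for r in records:
        try:
            yield float(r)
        except ValueError:
            pass                               # drop malformed input, keep going

store = list(clean(ingest(raw)))               # destination for analysis
print(store)                                   # [3.1, 2.7, 4.0]
```

Chaining generators like this keeps each stage independent, which is the property that lets real pipelines be tested, validated and deployed piece by piece as described above.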