When a gravitational wave – sometimes poetically referred to as a “space-time ripple” – reaches Earth, every second counts.
How fast the data can be processed determines how much astronomers can learn from these ripples in space-time, which travel across the universe at the speed of light.
In 2015, the Laser Interferometer Gravitational-Wave Observatory (LIGO), a multinational physics and cosmology research collaboration operated by Caltech and MIT, made the first detection of gravitational waves, produced by the collision of a pair of black holes. That landmark discovery confirmed a prediction of the general theory of relativity that Albert Einstein had made a century earlier.
But back in 2015, it took scientists months to vet, validate and interpret the data. LIGO's detectors collect more than 16,000 data samples a second, and to confirm that a signal was produced by gravitational waves, scientists had to strip out "noise" and compare the data patterns against theoretical templates of gravitational waveforms.
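The template comparison described here is, at its core, matched filtering: sliding a theoretical waveform along the noisy detector stream and asking where the correlation peaks. A minimal sketch of that idea, using a synthetic "chirp" and invented Gaussian noise rather than real LIGO data or the actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "chirp" template: a sinusoid whose frequency sweeps upward,
# loosely mimicking an inspiralling binary. Purely illustrative.
fs = 16000                      # samples per second, as in the article
t = np.arange(0, 0.05, 1 / fs)  # a 50 ms template (800 samples)
template = np.sin(2 * np.pi * (50 + 4000 * t) * t)

# Bury the template in one second of Gaussian noise.
stream = rng.normal(0.0, 1.0, fs)
inject_at = 7000
stream[inject_at:inject_at + template.size] += 0.5 * template

# Matched filter: correlate the stream with the template, find the peak.
snr = np.correlate(stream, template, mode="valid")
peak = int(np.argmax(np.abs(snr)))
print(peak)  # should land at, or very near, the injection point
```

A real search repeats this against a whole bank of templates covering different masses and spins, which is why raw throughput matters so much.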
“In an era of multi-messenger astronomy, we have to shorten the time as much as possible so as to trigger the alert quickly enough for follow-up observations,” Cao Junwei, a Chinese scientist in the LIGO program, told Xinhua.
Last October, Cao and his LIGO colleagues made headlines for detecting for the first time a gravitational wave from the collision of binary neutron stars and corresponding electromagnetic signals, by leveraging the power of ultra-high data-processing speed.
Merely 1.7 seconds after the gravitational wave arrived, the US National Aeronautics and Space Administration's Fermi space telescope, in low Earth orbit, detected a gamma-ray burst from the same event, and an alert was triggered.
With the call-out to the global astronomical community, about 70 ground- and space-based detectors scrambled into follow-up observations of the electromagnetic signals, helping pinpoint the exact source of the gravitational wave.
But Cao does not claim to be an expert in astrophysics. Rather, he is a computer specialist whose expertise is instrumental in ratcheting up the number of floating-point operations per second of the racks of computers inside the LIGO laboratory.
Cao’s team also piggybacks on China’s indigenous high-throughput, multi-task computing technologies that underpin the nation’s fastest supercomputers, such as the Sunway TaihuLight, which currently holds the crown of the world’s fastest with a LINPACK benchmark rating of 93 petaflops. (A petaflop is a unit of computing speed equal to one thousand million million floating-point operations per second.)
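The spelled-out definition can be checked directly: one thousand million million is 10^15, so the quoted LINPACK figure works out as follows.

```python
# "One thousand million million" = 1,000 x 1,000,000 x 1,000,000 = 10^15.
peta = 1_000 * 1_000_000 * 1_000_000
taihulight_flops = 93 * peta  # the LINPACK rating quoted in the article
print(f"{taihulight_flops:.2e} floating-point operations per second")
```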
Cao joined the program as a computer scientist in 2004 before returning to China to lead a team from Tsinghua University’s Research Institute of Information Technology.
“We were the only Chinese group in the collaboration. None of us specialized in astrophysics, but we were still admitted,” he said.
Over the years, the Tsinghua team has helped ramp up computational capabilities for ultra-fast, real-time data analysis. Their accomplishments include a set of machine-learning-based data-processing pipelines, developed in collaboration with the University of Western Australia in Perth.
The new pipelines speed up data filtering so that incoming data can be compared against tens of thousands of templates within the blink of an eye.
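Screening a signal against tens of thousands of templates that quickly is a throughput problem: FFT-based correlation makes each comparison cheap, and the work batches naturally across the whole template bank. A sketch of the batching idea, with random stand-in templates rather than the team's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

fs = 16000
n = fs                                   # one second of data
stream = rng.normal(0.0, 1.0, n)

# A bank of random "templates" (stand-ins for theoretical waveforms).
n_templates, m = 1000, 512
bank = rng.normal(0.0, 1.0, (n_templates, m))

# Inject one template into the stream so there is something to find.
true_idx, inject_at = 42, 5000
stream[inject_at:inject_at + m] += 0.6 * bank[true_idx]

# Correlate the entire bank against the stream in one FFT batch:
# correlate(s, t) is the inverse FFT of fft(s) * conj(fft(t)),
# zero-padded so the circular correlation equals the linear one.
size = n + m - 1
S = np.fft.rfft(stream, size)
B = np.fft.rfft(bank, size, axis=1)
corr = np.fft.irfft(S * np.conj(B), size, axis=1)[:, :n - m + 1]

# Normalize per template and pick the best-matching one.
scores = np.abs(corr).max(axis=1) / np.linalg.norm(bank, axis=1)
best = int(np.argmax(scores))
print(best)  # expect the injected template's index
```

The batched FFT turns tens of thousands of independent correlations into a handful of large array operations, which is exactly the kind of workload that maps well onto high-throughput hardware.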
“Now, the time between the arrival of a signal and its confirmation as a gravitational wave has been shortened from several minutes to dozens of seconds. Next, we hope to shrink the time to three to five seconds,” according to a Tsinghua University press release.
Cao said: “With a major LIGO upgrade under way, as well as ultra-sensitive sensors, the number of signals that can be detected may soar from just a few a year to several a day. We will fall far behind if we can’t accelerate data processing.”