
New algorithm developed at MIT for imaging black holes

Everybody knows you can’t see a black hole. Nothing gets out, not even light. Except that, as with most conventional wisdom, that isn’t the whole story. Leaving aside the can of worms labeled Hawking radiation, we still know that matter heats up as it falls into a black hole. In theory, we can pick that up with a good old radio telescope. But black holes are so far away that we need far better angular resolution than any telescope we currently have if we want to confirm these predictions with actual data.

“A black hole is very, very far away and very compact,” says Katie Bouman, a grad student at MIT working with an international collaboration called the Event Horizon Telescope. “It’s equivalent to taking an image of a grapefruit on the moon, but with a radio telescope. To image something this small means that we would need a telescope with a 10,000-kilometer diameter, which is not practical, because the diameter of the Earth is not even 13,000 kilometers.” This is where interferometry comes in. Bouman developed a new imaging algorithm called CHIRP, for Continuous High-resolution Image Reconstruction using Patch priors, and it uses interferometry, essentially, “to turn the entire planet into a large radio telescope dish.”
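The arithmetic behind the quote is just the diffraction limit, theta ≈ lambda / D. A minimal sketch, using the EHT’s ~1.3 mm observing wavelength and the 10,000-kilometer aperture Bouman mentions (the ~10 cm grapefruit size is an illustrative assumption):

```python
import math

# Diffraction limit: theta ~ lambda / D (radians).
wavelength = 1.3e-3        # meters; the EHT observes at ~1.3 mm
earth_scale_dish = 1.0e7   # meters; a 10,000 km effective aperture

theta = wavelength / earth_scale_dish                 # ~1.3e-10 rad
microarcsec = math.degrees(theta) * 3600 * 1e6        # ~27 microarcseconds
print(f"resolution ~ {microarcsec:.0f} microarcseconds")

# Compare: a ~10 cm grapefruit at the Moon's distance (~384,400 km)
grapefruit_uas = math.degrees(0.10 / 3.844e8) * 3600 * 1e6
print(f"grapefruit on the Moon ~ {grapefruit_uas:.0f} microarcseconds")
```

Both come out at a few tens of microarcseconds, which is why an Earth-sized baseline is the right order of magnitude for the job.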

The Event Horizon Telescope is actually an array of radio telescopes working to image Sagittarius A*, the black hole at our galaxy’s center. We can’t image Sagittarius A* with optical means, because there’s just too much debris in the way. But the EHT uses interferometry to combine and compare the input from multiple telescopes, a Nobel Prize-winning technique that confers much better angular resolution. With the angular resolution afforded by a radio telescope the effective size of the planet, we could use interferometry to find out whether or not our galaxy’s supermassive black hole actually looks like we think it does.

CHIRP works a little like an insect eye, in that it combines sections of the EHT array’s visual field into a coherent whole. Part of the method multiplies the measurements from each triple of telescopes together; in that product, the phase delays introduced by Earth’s atmosphere above each station cancel out. Six telescopes have already signed on to participate in the collaboration, but the method can accommodate every telescope on Earth: using CHIRP, Bouman’s project can stitch together what all the radio telescopes see.
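The three-telescope trick is known in radio astronomy as the closure phase, or bispectrum. A minimal sketch of why the atmosphere cancels, using made-up station names and phases (idealized, noise-free visibilities):

```python
import cmath
import random

# True interferometric phases on each baseline (illustrative values).
true_phase = {("A", "B"): 0.7, ("B", "C"): -1.2, ("A", "C"): 0.4}

def visibility(pair, station_error):
    """Measured visibility: the true phase corrupted by per-station
    atmospheric delays, which enter as phi_i - phi_j."""
    i, j = pair
    phase = true_phase[pair] + station_error[i] - station_error[j]
    return cmath.exp(1j * phase)

# Random atmospheric phase error at each station.
err = {s: random.uniform(-3, 3) for s in "ABC"}

# Bispectrum: V_AB * V_BC * conj(V_AC). The station errors cancel:
# (eA - eB) + (eB - eC) - (eA - eC) = 0.
bispec = (visibility(("A", "B"), err)
          * visibility(("B", "C"), err)
          * visibility(("A", "C"), err).conjugate())

closure = cmath.phase(bispec)
true_closure = true_phase[("A", "B")] + true_phase[("B", "C")] - true_phase[("A", "C")]
print(closure, true_closure)  # equal, no matter what the atmosphere did
```

However large the random per-station errors, the closure phase recovers the same combination of true phases, which is what makes it usable data.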

“Normal” interferometry uses an algorithm that treats an image from a radio telescope as a collection of individual points of different brightness on a plane. It tries to find the points whose brightness and location most closely match the data, then blurs together bright points that lie near each other to meld the astronomical images into one. In the new model, instead of points on a 2D plane, there are cones whose heights give the total brightness at any spot — black, empty sky would be represented by a cone of zero height. This sharpens the image and filters out noise, using the same principles that make constructive and destructive interference work.
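In the point-source picture, each baseline of the array samples one spatial-frequency component of the sky, and the model image is a sum of points. A minimal sketch in one dimension, with made-up positions and fluxes (not EHT data):

```python
import numpy as np

# Discrete model of the sky: a few point sources on a line.
positions = np.array([0.0, 0.3, 0.7])   # angular positions (arbitrary units)
fluxes = np.array([1.0, 0.5, 0.2])      # brightness of each point

def model_visibility(u):
    """Visibility at spatial frequency u: a sum of complex
    exponentials, one per point source (a discrete Fourier sum)."""
    return np.sum(fluxes * np.exp(-2j * np.pi * u * positions))

# Fitting the model means choosing positions and fluxes so that
# model_visibility(u) matches the measured data at the sampled u's.
print(model_visibility(0.0))  # at zero frequency: the total flux, 1.7
```

The fit is underdetermined because the array only samples some frequencies u, which is why the choice of image model matters so much.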

But the Earth isn’t exactly peppered with interferometers. There are large areas on the ground that aren’t collecting any data. CHIRP fills in the gaps by mathematically stitching together different telescopes’ fields of view, wherever they overlap, to create a continuous whole. It’s like a topographic map of the sky’s brightness; tall places are bright spots. “Translating the model into a visual image is like draping plastic wrap over it: The plastic will be pulled tight between nearby peaks, but it will slope down the sides of the cones adjacent to flat regions,” the team said in a statement. “The altitude of the plastic wrap corresponds to the brightness of the image.”
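A one-dimensional sketch of that picture: each source is a cone (a triangle in 1D) whose peak height is its brightness, and the image profile is the envelope draped over all the peaks. The centers, heights, and cone width here are illustrative assumptions, not CHIRP’s actual parameterization:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 201)     # 1-D strip of "sky"
centers = [0.3, 0.5, 0.85]         # source positions (made up)
heights = [1.0, 0.6, 0.8]          # source brightnesses (made up)
width = 0.1                        # cone half-width

def cone(center, height):
    """A triangular bump: peak `height` at `center`, zero beyond `width`."""
    return np.clip(height * (1 - np.abs(x - center) / width), 0.0, None)

# The "plastic wrap": the upper envelope over all cones. Flat zero
# regions between cones represent empty sky.
brightness = np.max([cone(c, h) for c, h in zip(centers, heights)], axis=0)
print(brightness.max())   # peak brightness ~1.0, at x = 0.3
```

Because the envelope is continuous, brightness is defined everywhere on the strip, not just at the discrete points, which is the gap-filling property the paragraph describes.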

To verify CHIRP’s predictions, Bouman and team loosed machine learning on the imaging problem. They trained the learning algorithm on images of celestial bodies, earthly objects and black holes, and found that CHIRP frequently outperformed its predecessors. The report is freely available online as a PDF, and since Bouman made her test data available online, other researchers can use and improve on it. Bouman and team will present the details of CHIRP at the 2016 IEEE Computer Vision and Pattern Recognition conference in June.
