(CN) – The unique conditions that enable the formation of supermassive black holes have puzzled astronomers since they were first discovered more than a decade ago. Now, an international team of researchers has added evidence to a theory of how these cosmic behemoths form.
In a study published Monday in the journal Nature Astronomy, the team details the results of computer simulations that show how radiation from nearby galaxies fueled the first supermassive black holes, which can grow to about a billion times the mass of our sun.
A black hole can quickly expand at the center of its host galaxy if a nearby galaxy provides enough radiation to suppress the home galaxy’s capacity to form stars. Once stars stop forming, the galaxy grows until it eventually collapses, leaving a black hole that feasts on the remaining gas and – later – dust, dying stars and even other black holes to become supermassive.
“The collapse of the galaxy and the formation of a million-solar-mass black hole takes 100,000 years – a blip in cosmic time,” said study co-author Zoltan Haiman, an astronomy professor at Columbia University. “A few hundred-million years later, it has grown into a billion-solar-mass supermassive black hole.
“This is much faster than we expected.”
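The growth the quotes describe can be checked with simple arithmetic. The sketch below is not from the study; it assumes exponential (Eddington-limited) growth and takes "a few hundred million years" to mean roughly 300 million years, and asks what e-folding time those figures imply.

```python
import math

# Hedged sketch: the growth timescale implied by the article's figures.
# Assumptions (not from the study): elapsed time is taken as 300 Myr,
# and the black hole grows exponentially, m(t) = m0 * exp(t / t_efold).
m_initial = 1e6    # solar masses (seed black hole from the collapse)
m_final = 1e9      # solar masses (supermassive black hole)
elapsed_myr = 300  # assumed elapsed time, in millions of years

# Solve m_final = m_initial * exp(elapsed / t_efold) for t_efold.
t_efold_myr = elapsed_myr / math.log(m_final / m_initial)
print(f"e-folding time: {t_efold_myr:.0f} Myr")  # ~43 Myr
```

Under these assumptions the implied e-folding time is about 43 million years, comparable to the standard Eddington-limited (Salpeter) timescale of roughly 45 million years for typical accretion efficiency, so the quoted growth does not require exotic accretion rates.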
The common theory is that supermassive black holes form over billions of years. However, astronomers have observed more than two dozen of these cosmic giants within 800 million years of the Big Bang, which occurred 13.8 billion years ago.
In the early universe, galaxies and stars formed as molecular hydrogen cooled the primordial plasma – a state of matter, like solid or liquid – of hydrogen and helium. This environment theoretically would have kept black holes from reaching maximum size, because molecular hydrogen turned gas into stars far enough away to escape the black holes’ gravitational pull.
However, astronomers have suggested several ways supermassive black holes might have overcome this barrier.
In a 2008 study, Haiman and his colleagues hypothesized that radiation from a neighboring, massive galaxy could split molecular hydrogen into atomic hydrogen and cause the young black hole and its host galaxy to collapse rather than spawn new clusters of stars.
A later study – led by Eli Visbal, then a postdoctoral researcher at Columbia – calculated that the nearby galaxy would have to be at least 100 million times more massive than our sun to emit enough radiation to stifle star formation.
The new study, led by John Regan, a postdoctoral researcher at Ireland’s Dublin City University, ran computer simulations that included the effects of fluid dynamics, chemistry, gravity and radiation.
After several days of crunching data on a supercomputer, the team found that the neighboring galaxy could be smaller and closer than previously expected.
“The nearby galaxy can’t be too close, or too far away, and like the Goldilocks principle, too hot or too cold,” said John Wise, study co-author and associate astrophysics professor at Georgia Tech.
While supermassive black holes are often found at the center of galaxies in the mature universe, including our own Milky Way, they are much less common in the infant universe. The earliest supermassive black holes were first observed in 2001 through a telescope at New Mexico’s Apache Point Observatory as part of the Sloan Digital Sky Survey.
The team hopes to test their theory when NASA’s James Webb Space Telescope, the successor to Hubble, goes online next year and captures images from the early universe.
Competing models of how these behemoths evolved – including one in which black holes grow by merging with millions of smaller black holes and stars – require additional testing, according to the researchers.
“Understanding how supermassive black holes form tells us how galaxies, including our own, form and evolve, and ultimately, tells us more about the universe in which we live,” Regan said.