For the first time, scientists have built a digital version of the Milky Way that follows the motion of individual stars, not blurry clumps.
The project combines artificial intelligence with leading supercomputers to watch our galaxy change over thousands of years.
In plain terms, the team can now model more than one hundred billion stars as separate points in a living map of the galaxy.
That level of detail turns the Milky Way from a rough sketch into a detailed movie that researchers can pause and compare with telescopes.
Building the digital Milky Way
The work was led by Keiya Hirashima, an astrophysicist at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences.
His research uses high performance computing and artificial intelligence to study how galaxies grow, form stars, and recycle the elements that build planets.
Astronomers estimate that our galaxy contains roughly one hundred billion stars spread through a thin disk and a surrounding halo.
Matching that crowded structure in a computer means tracking every star, every cloud of gas, and the dark matter that pulls them together.
To handle this, the team used an N-body simulation, a computer model that tracks many particles under gravity, and coupled it to a method that treats gas as moving particles instead of a fixed grid.
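For a concrete picture of the N-body idea, here is a minimal Python sketch in which every particle feels the gravity of every other particle and gets a small velocity kick and position drift each step. It is purely illustrative: the units, softening value, and toy setup are assumptions, and a direct-summation loop like this would be far too slow for the billions of particles in the actual study.

```python
import numpy as np

# Minimal direct-summation N-body update (illustrative only).
# Production galaxy codes use tree or grid methods across millions of cores;
# this O(N^2) loop is just to show the basic idea of "particles under gravity".

G = 1.0  # gravitational constant in arbitrary code units (an assumption for this sketch)

def gravity_step(pos, vel, mass, dt, softening=0.05):
    """Advance particle positions and velocities by one time step dt."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        d = pos - pos[i]                                  # vectors from particle i to all others
        r2 = (d ** 2).sum(axis=1) + softening ** 2        # softened squared distances
        r2[i] = np.inf                                    # no self-force
        acc[i] = G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    vel = vel + acc * dt                                  # kick: update velocities
    pos = pos + vel * dt                                  # drift: update positions
    return pos, vel

# Toy example: a few hundred particles with random positions and small random motions.
rng = np.random.default_rng(0)
pos = rng.normal(size=(300, 3))
vel = 0.1 * rng.normal(size=(300, 3))
mass = np.full(300, 1.0 / 300)
for _ in range(10):
    pos, vel = gravity_step(pos, vel, mass, dt=0.01)
```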
Galactic processes span scales from a few light years up to the full disk, about one hundred thousand light years wide. Slow changes such as the rotation of the disk take millions of years, while a supernova can reshape nearby gas in a tiny fraction of that time.
Why every star matters
Older galaxy models took a shortcut by bundling many suns into single heavy particles instead of following each star.
That saved computing time but blurred out the life cycles of individual stars, especially the massive ones that stir their surroundings when they explode.
The new digital Milky Way simulation reaches star by star detail, so the model can tell whether a supernova blasts apart a cloud or leaves it intact. It also follows how heavy elements such as carbon and iron flow into later generations of stars.
Achieving that detail would normally force the code to crawl forward in very small time steps, tiny jumps in which every particle moves only slightly.
Each explosion would demand many extra steps, so the whole run would spend most of its effort on a few crowded regions.
For a Milky Way-sized system, a conventional code would need years of supercomputer time to span a billion years at this level of detail.
That barrier, sometimes called the billion particle limit, kept most simulations stuck with either coarse Milky Way models or detailed but much smaller galaxies.
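A rough back-of-the-envelope estimate shows how steep that barrier is. The numbers below are illustrative assumptions, not figures from the study; they only show how sharply the step count grows when a single blast forces the whole simulation onto a finer clock.

```python
# Illustrative cost estimate with assumed numbers (not values from the study).
quiet_step_yr = 1.0e5      # assumed time step when nothing violent is happening, in years
supernova_step_yr = 1.0e2  # assumed step needed to resolve a blast wave, in years
span_yr = 1.0e9            # one billion years of galactic evolution

steps_coarse = span_yr / quiet_step_yr    # 10,000 steps if the coarse step were always enough
steps_fine = span_yr / supernova_step_yr  # 10,000,000 steps if blasts set the global pace

print(f"coarse-only run: {steps_coarse:,.0f} steps")
print(f"blast-limited run: {steps_fine:,.0f} steps ({steps_fine / steps_coarse:,.0f}x more work)")
```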
Material circulation in a galaxy: Diffuse warm gas loses energy through radiation and conduction and forms a disk-like structure (the galactic disk). When massive stars, roughly 10 times the mass of the Sun, reach the end of their lifetimes, they explode as supernovae. These explosions inject both energy and heavy elements into the surrounding interstellar gas and stir up turbulence. Part of this material is ejected as an outflow and eventually falls back onto the galactic disk, where it forms the next generation of stars. This enriched material ultimately goes into planets like Earth and life like us. Credit: NASA/JPL-Caltech, ESA, CSA, STScI.
AI and the digital Milky Way
To escape this bottleneck, the researchers trained a deep learning model to stand in for the physics of a supernova blast.
This model is a kind of artificial intelligence that finds patterns in data and can quickly predict how gas behaves after an explosion.
They ran high-resolution simulations of individual explosions, resolving the gas around each blast down to particles with roughly the mass of our Sun.
Those runs trained a surrogate model, a stand-in for the full physics, which learns how cubes of gas hundreds of light years across evolve.
During the digital Milky Way run, the main code steps forward on regular time steps and sends gas near each exploding star to the network.
The network predicts how that cube will look one hundred thousand years after the blast and returns the updated gas to the main simulation.
Because the main code never shrinks its global time step, the whole simulation runs far more efficiently than a traditional approach that has to step through every blast at full time resolution.
Tests showed that measures such as star formation rates and gas temperatures match results from full physics runs that do not use the surrogate.
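Conceptually, the hybrid loop looks something like the sketch below: a normal coarse update everywhere, plus a trained network that fast-forwards the gas around each blast. The structure follows the description above, but the function names, array shapes, and dummy surrogate are illustrative assumptions, not the team's actual interface.

```python
import numpy as np

def dummy_surrogate(cube):
    # Stand-in for the trained network: maps a cube of gas to its state
    # roughly 100,000 years after the blast. Placeholder math only.
    return 0.5 * cube + cube.mean()

def coarse_step(gas, dt):
    # Stand-in for the regular gravity + hydrodynamics update that the main
    # code applies everywhere on its normal, unchanged time step.
    return gas

def run(gas, supernova_steps, dt, n_steps):
    for step in range(n_steps):
        gas = coarse_step(gas, dt)                            # global update; the step never shrinks
        if step in supernova_steps:                           # a massive star explodes on this step
            region = gas[10:20, 10:20, 10:20]                 # cube of gas around the blast site
            gas[10:20, 10:20, 10:20] = dummy_surrogate(region)  # network fast-forwards the blast
    return gas

gas = np.random.rand(32, 32, 32)                              # toy gas density field
gas = run(gas, supernova_steps={5, 12}, dt=1.0, n_steps=20)
```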
Digital Milky Way computing
Making this scheme work still required high performance computing (HPC), large scale computing with many linked processors, on a scale that few facilities can match.
Japan’s Fugaku supercomputer, built from one hundred sixty thousand nodes based on the A64FX processor, provided most of the number crunching for the project.
In technical terms, the team ran a galaxy simulation with about three hundred billion particles on roughly seven million processor cores.
That scale breaks the old billion particle barrier that had limited galaxy models to either coarse Milky Way analogs or detailed but smaller systems.
Because the surrogate handles the hardest physics, the run can cover one million years of evolution in about two point eight hours.
Scaled up to a billion years, that performance would shorten the job from three decades to one hundred fifteen days on the same machines.
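That figure follows directly from the per-step performance, as a quick arithmetic check shows (no new data, just the numbers already quoted).

```python
# Simple arithmetic check of the quoted scaling; no new data.
hours_per_million_years = 2.8                     # reported wall-clock time per million simulated years
hours_for_a_billion = hours_per_million_years * 1_000
print(hours_for_a_billion / 24)                   # ~117 days, in line with the ~115 days quoted
```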
Compared with a conventional code, this approach delivers a speed-up of more than one hundred times for a Milky Way-sized system.
The gain comes not from skipping physics but from handling expensive pieces with a trained network instead of brute force updates.
Wider impact
This work shows how combining physics-based models with machine learning can unlock systems that span vast ranges in size and time.
The same idea of using a trained network to replace the stiffest parts of a simulation is spreading across many areas of science.
Scientists have built climate surrogates that mimic parts of weather and climate models, cutting their cost while preserving patterns of rainfall and circulation.
These tools make it possible to run many experiments that explore different greenhouse futures or test how sensitive a region is to warming.
Hydrologists have built modeling frameworks that use surrogates to stand in for water cycle models, allowing studies of river basins, reservoirs, and flood risk.
A broader review of artificial intelligence in climate science points to benefits for tracking extremes such as heat waves, droughts, and wildfires.
The digital Milky Way project shows how this strategy lets scientists trace gas and stars and link them to what telescopes see today. “Integrating AI with high-performance computing marks a fundamental shift,” said Hirashima.
The study is published in the Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis.