UW–Madison joins new NSF-Simons AI Institute for the Sky

This post is adapted from the original news story published by Northwestern University

A large multi-institutional collaboration — led by Northwestern University and including UW–Madison physics professors Keith Bechtol, Kyle Cranmer, and Moritz Münchmeyer — has received a $20 million grant to develop and apply new artificial intelligence (AI) tools to astrophysics research and deep space exploration.

Jointly funded by the National Science Foundation (NSF) and the Simons Foundation, the highly competitive grant will establish the NSF-Simons AI Institute for the Sky (SkAI, pronounced “sky”). SkAI is one of two National AI Research Institutes in Astronomy announced today. Northwestern astrophysicist Vicky Kalogera is principal investigator of the grant and will serve as the director of SkAI. Northwestern AI expert Aggelos Katsaggelos is a co-principal investigator of the grant.

The new institute will unite multidisciplinary researchers to develop innovative, trustworthy AI tools for astronomy, which will be used to pursue breakthrough discoveries by analyzing large astronomy datasets, transforming physics-based simulations, and more. With unprecedentedly large sky surveys poised to launch, including those from the Vera C. Rubin Observatory in Chile, astronomers will require smarter, more efficient tools to accelerate the mining and interpretation of increasingly large datasets. SkAI will fulfill a crucial role in developing and refining these tools.

Read the full NU press release

Learn more about SkAI

Through machine learning maps, cosmic history comes into focus

By Jason Daley, UW–Madison College of Engineering

Three images show low-resolution input data, high-resolution ground-truth data, and super-resolution output data as heatmaps; a top-left panel shows the power spectrum of the data.
Using machine learning techniques, Kangwook Lee and his collaborators are able to produce high-resolution images from low-resolution simulations. These types of techniques could help improve large-scale models, like the Illustris Simulation, shown here. In this simulation, dark matter density is overlaid with the gas velocity field. Credit: Illustris Collaboration

For centuries, humans have used optical telescopes, and more recently radio and space telescopes, to get a better view of the heavens.

Today, however, one of the most powerful tools for understanding the cosmos is the computer chip: Cosmologists rely on processing power to analyze astronomical data and create detailed simulations of cosmic evolution, galaxy formation and other far-out phenomena. These powerful simulations are starting to answer fundamental questions of how the universe began, what it is made of and where it’s likely headed.

“It is extremely expensive to run these simulations and basically takes forever,” says Kangwook Lee, an assistant professor of electrical and computer engineering at the University of Wisconsin-Madison. “So they cannot run them for large-scale simulations or for high resolution at the same time. There are a lot of issues coming from that.”

Instead, machine learning expert Lee and physics colleagues Moritz Münchmeyer and Gary Shiu are using emerging artificial intelligence techniques to speed up the process and get a clearer view of the cosmos.
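
To make the idea concrete, here is a minimal sketch of what a learned super-resolution step for simulated density fields could look like, in the spirit of the figure above: a small convolutional network maps a cheap low-resolution field to a high-resolution one and is trained against fields from an expensive high-resolution run. This is an illustrative toy, not the architecture or code used by Lee, Münchmeyer, and Shiu; the model name, the layer sizes, and the random tensors standing in for simulation data are all assumptions.

```python
# Toy sketch only: NOT the actual model used in this research.
import torch
import torch.nn as nn

class ToySuperResolver(nn.Module):
    """Upsample a low-resolution 2D density field by a factor of 4."""
    def __init__(self, channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, low_res):
        # low_res: (batch, 1, H, W) -> prediction: (batch, 1, 4H, 4W)
        return self.net(low_res)

model = ToySuperResolver()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for paired fields: a cheap low-resolution run and an
# expensive high-resolution run of the same initial conditions.
low_res = torch.randn(8, 1, 32, 32)
high_res = torch.randn(8, 1, 128, 128)

for step in range(100):
    prediction = model(low_res)
    loss = loss_fn(prediction, high_res)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Once a network like this is trained on matched pairs of runs, it can stand in for the expensive high-resolution step, which is why comparisons of low-resolution input, high-resolution ground truth, and super-resolution output (along with their power spectra), as in the figure above, are the natural way to judge it.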

Read the full story

Machine Learning meets Physics

Machine learning and artificial intelligence are certainly not new to physics research — physicists have been using and improving these techniques for several decades.

In the last few years, though, machine learning has seen something of an explosion in physics, which makes it a perfect topic for collaboration within the department, across the university, and even around the world.

“In the last five years in my field, cosmology, if you look at how many papers are posted, it went from practically zero to one per day or so,” says assistant professor Moritz Münchmeyer. “It’s a very, very active field, but it’s still in an early stage: There are almost no success stories of using machine learning on real data in cosmology.”

Münchmeyer, who joined the department in January, arrived at a good time. Professor Gary Shiu was a driving force in starting the virtual seminar series “Physics ML” early in the pandemic, which now has thousands of people on the mailing list and hundreds attending the weekly or bi-weekly seminars by Zoom. As it turned out, physicists across fields were eager to apply their methods to the study of machine learning techniques. 

“So it was natural in the physics department to organize the people who work on machine learning and bring them together to exchange ideas, to learn from each other, and to get inspired,” Münchmeyer says. “Gary and I decided to start an initiative here to more efficiently focus department activities in machine learning.”

Currently, that initiative includes Münchmeyer, Shiu, Tulika Bose, Sridhara Dasu, Matthew Herndon, and Pupa Gilbert, along with their research group members. They watch the Physics ML seminar together and discuss it afterwards. In weeks when the virtual seminar is not scheduled, the group hosts a local speaker — from physics or elsewhere on campus — who is doing work in the realm of machine learning.

In the next few years, the Machine Learning group in physics looks to build on the field's current momentum. For example, they hope to secure funding to hire postdoctoral fellows who can work within a group or across groups in the department. Also, the hiring of Kyle Cranmer — one of the best-known researchers in machine learning for physics — as Director of the American Family Data Science Institute and as a physics faculty member will immediately connect machine learning activities in this department with those in computer sciences, statistics, and the Information School, as well as other areas on campus.

“There are many people [on campus] actively working on machine learning for the physical sciences, but there was not a lot of communication so far, and we are trying to change that,” Münchmeyer says.

Machine Learning Initiatives in the Department (so far!)

Kevin Black, Tulika Bose, Sridhara Dasu, Matthew Herndon and the CMS collaboration at CERN use machine learning techniques to improve the sensitivity of new physics searches and increase the accuracy of measurements.

Pupa Gilbert uses machine learning to understand patterns in nanocrystal orientations (detected with her synchrotron methods) and fracture mechanics (detected at the atomic scale with molecular dynamics methods developed by her collaborator at MIT).

Moritz Münchmeyer develops machine learning techniques to extract information about fundamental physics from the massive amounts of complex data produced by current and upcoming cosmological surveys.

Gary Shiu develops data science methods to tackle computationally complex systems in cosmology, string theory, particle physics, and statistical mechanics. His work suggests that Topological Data Analysis (TDA) can be integrated into machine learning approaches to make AI interpretable — a necessity for learning physical laws from complex, high-dimensional data.
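
As a rough illustration of how topological information can feed into a machine learning pipeline, the sketch below computes persistence diagrams for small point clouds with the ripser package and turns them into a handful of interpretable summary features (counts and lifetimes of connected components and loops) for an ordinary classifier. This is a toy example under our own assumptions: the feature choices, the synthetic circle-versus-blob data, and the logistic-regression model are illustrative, not Shiu's actual methods.

```python
# Toy illustration only: persistence summaries as interpretable features.
# This is NOT Shiu's actual pipeline; all choices here are assumptions.
import numpy as np
from ripser import ripser  # pip install ripser
from sklearn.linear_model import LogisticRegression

def topological_features(points, maxdim=1):
    """Summarize a point cloud by the persistence of its topological features."""
    diagrams = ripser(points, maxdim=maxdim)["dgms"]
    features = []
    for dgm in diagrams:
        finite = dgm[np.isfinite(dgm[:, 1])]  # drop features that never die
        lifetimes = finite[:, 1] - finite[:, 0] if len(finite) else np.zeros(1)
        features += [lifetimes.sum(), lifetimes.max(), len(finite)]
    return np.array(features)

rng = np.random.default_rng(0)

def noisy_circle(n=100):
    angles = rng.uniform(0, 2 * np.pi, n)
    return np.c_[np.cos(angles), np.sin(angles)] + 0.05 * rng.normal(size=(n, 2))

def noisy_blob(n=100):
    return 0.3 * rng.normal(size=(n, 2))

# Two classes that differ in topology: a loop (one long-lived 1-cycle)
# versus a blob (no long-lived 1-cycle).
clouds = [noisy_circle() for _ in range(20)] + [noisy_blob() for _ in range(20)]
labels = np.array([1] * 20 + [0] * 20)
X = np.vstack([topological_features(cloud) for cloud in clouds])

clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

Because each feature has a direct geometric meaning, such as the number of loops and how long they persist, the resulting model is far easier to interrogate than one trained on raw coordinates, which is the spirit of using TDA to make AI interpretable.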