For this award, postdocs write a short article that describes their research for the general public. Interested postdocs apply to submit an article, then receive guidelines and editorial assistance to improve accessibility. Approved articles are posted on the WISL website, and the author receives a $250 award.
Islam’s article, “Navigating Anomalies in the Pursuit of New Physics,” shares how particle physicists study the fundamental building blocks of the universe, exploring the complex behaviors of subatomic particles that shape our world.
“As a postdoctoral researcher, my journey is not limited to research alone,” Islam says in the article. “It comes with an important responsibility to share the wonders and experiences of scientific research with diverse communities.”
The dual mission of the Wisconsin Initiative for Science Literacy is to promote literacy in science, mathematics and technology among the general public and to attract future generations to careers in research, teaching and public service. WISL is directed by Professor Bassam Z. Shakhashiri of the University of Wisconsin-Madison Chemistry Department. Programs draw on the concepts developed by Dr. Shakhashiri during many years of innovative work in science education.
Department of Energy grant to train students at the interface of high energy physics and computer science
To truly understand our physical world, scientists look to the very small: the subatomic particles that make up everything. Particle physics generally falls under the discipline of high energy physics (HEP), where higher and higher energy collisions — tens of teraelectronvolts, or about ten trillion times the energy of visible light — lead to the detection and characterization of particles and how they interact.
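As a back-of-the-envelope check of that energy comparison (the photon energy and collision energy below are rough, assumed values, not figures from this article):

```python
# A visible-light photon carries roughly 2 eV; LHC Run 3 collides
# protons at 13.6 TeV (1 TeV = 1e12 eV). Both values are approximate.
visible_photon_eV = 2.0
lhc_collision_eV = 13.6e12
ratio = lhc_collision_eV / visible_photon_eV
print(f"{ratio:.1e}")  # ~6.8e12, i.e. trillions of times more energetic
```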
These collisions also lead to the accumulation of inordinate amounts of data, and HEP is increasingly becoming a field where researchers must be experts in both particle physics and advanced computing technologies. HEP graduate students, however, rarely enter graduate school with backgrounds in both fields.
Physicists from UW–Madison, Princeton University, and the University of Massachusetts Amherst aim to address the science goals of HEP experiments by training the next generation of software and computing experts, supported by a five-year, ~$4 million grant from the U.S. Department of Energy (DOE) Office of Science. The program is known as Training to Advance Computational High Energy Physics in the Exascale Era, or TAC-HEP.
“The exascale era is upon us in HEP and the complexity, computational needs and data volumes of current and future HEP experiments will increase dramatically over the next few years. A paradigm shift in software and computing is needed to tackle the data onslaught,” says Tulika Bose, a physics professor at UW–Madison and TAC-HEP principal investigator. “TAC-HEP will help train a new generation of software and computing experts who can take on this challenge head-on and help maximize the physics reach of the experiments.”
In total, DOE announced $10 million in funding today for three projects providing classroom training and research opportunities in computational high energy physics to train the next generation of computational scientists and engineers needed to deliver scientific discoveries.
At UW–Madison, TAC-HEP will annually fund four to six two-year training positions for graduate students working on a computational HEP research project with Bose or physics professors Keith Bechtol, Kevin Black, Kyle Cranmer, Sridhara Dasu, or Brian Rebel. Their research must broadly fit into the categories of high-performance software and algorithms, collaborative software infrastructure, or hardware-software co-design.
Bose’s research group, for example, focuses on proton-proton collisions in the Compact Muon Solenoid (CMS) at the CERN Large Hadron Collider (LHC). The high luminosity run of the LHC, starting in 2029, will bring unprecedented physics opportunities — and computing challenges that TAC-HEP graduate students will tackle firsthand.
“The annual data volume will increase by 30 times while the event reconstruction time will increase by nearly 25 times, requiring modernization of the software and computing infrastructure to handle the demands of the experiments,” Bose says. “Novel algorithms using modern hardware and accelerators, such as Graphics Processing Units, or GPUs, will need to be exploited together with a transformation of the data analysis process.”
TAC-HEP will incorporate targeted coursework and specialized training modules that will enable the design and development of coherent hardware and software systems, collaborative software infrastructure, and high-performance software and algorithms. Structured R&D projects, undertaken in collaboration with DOE laboratories (Fermilab and Brookhaven National Lab) and integrated within the program, will provide students from all three participating universities with hands-on experience with cutting-edge computational tools, software and technology.
The training program will also include professional development for students, such as training in oral and written science communication, along with cohort-building activities. These components are designed to increase the recruitment and retention of a diverse group of graduate students.
“Future high energy physics discoveries will require large accurate simulations and efficient collaborative software,” said Regina Rameika, DOE Associate Director of Science for High Energy Physics. “These traineeships will educate the scientists and engineers necessary to design, develop, deploy, and maintain the software and computing infrastructure essential for the future of high energy physics.”
Higgs @ Ten: UW–Madison physicists’ past and future roles
Ten years ago, on July 4, 2012, the CMS and ATLAS collaborations at the Large Hadron Collider (LHC) at CERN — including many current and former UW–Madison physicists — announced they had discovered a particle that was consistent with predictions of the Higgs boson.
In the ten years since, scientists have confirmed the finding was the Higgs boson, but its discovery opened more avenues of discovery than it closed. Now, with the LHC back up and running, delivering proton collisions at unprecedented energies, high energy physicists are ready to investigate even more properties of the particle.
“The Higgs plays an incredibly important role in particle physics,” says Kevin Black, who previously worked on ATLAS before joining the UW–Madison physics department and is now part of CMS. “But for being such a fundamental particle, for giving mass to all elementary particles, for being deeply connected to flavor physics and why we have different generations of matter particles — we know a relatively small amount about it.”
Finding the Higgs particle had been one of the main goals of the LHC. The particle was first theorized in the 1960s by physicist Peter Higgs, among others, though it is his name that became forever associated with it.
“The basic idea was that if you just had electromagnetic and strong interactions, then the theory would have been fine if you just put a mass in by hand for the elementary particles,” explains Black. “The weak interaction spoils that, and it was a big question at the time of whether or not the whole structure of particle physics and of quantum field theory were actually going to be consistent.”
Higgs and others realized that there was a way to make it happen if they introduced a new field, which then became the Higgs field and the Higgs particle, that can interact with all other matter and give particles their mass. The Higgs particle, however, eluded experimental observation, leaving a gap in the Standard Model. In retrospect, one of the difficulties was that the mass of the Higgs — around 125 GeV — was much larger than the technology at the time could reach experimentally.
In earlier generations of experiments, UW–Madison physicist Sau Lan Wu participated in searches using the ALEPH experiment that placed a strong lower bound on the mass of the Higgs boson. Also at UW–Madison, Duncan Carlsmith, Matthew Herndon and their groups participated in searches at the CDF experiment that placed an upper bound on the mass of the Higgs boson and saw evidence of Higgs production in the region of mass where it was finally discovered.
This research set the stage for the experiments that were perfectly designed to discover the Higgs boson: the world’s most powerful hadron collider, the LHC, and the most capable pair of high energy collider experiments ever built, CMS and ATLAS.
The UW–Madison CMS group had three major projects: the trigger project, led by Wesley Smith (now emeritus faculty); the end cap muon system, led by Don Reeder (now emeritus faculty) and Dick Loveless (now emeritus scientist); and a computing project, led by Sridhara Dasu, the current head of the group. Having made essential detector contributions, the UW–Madison CMS group, including Herndon, moved on to Higgs hunting and the discoveries. The group, now bolstered by the addition of Black and Tulika Bose to the physics department faculty, continues the work of understanding the Higgs boson thoroughly.
The UW–Madison ATLAS group, founded and led by Wu, is an important leader in Higgs physics. The group was fortunate to attract another leading ATLAS Higgs physicist, Kyle Cranmer, who recently joined UW–Madison as physics department faculty and director of the American Family Data Science Institute.
Both CMS and ATLAS announced the discovery, made separately but concurrently, in 2012. The new particle’s mass and behavior conformed to expectations for the Higgs, but not all of its properties had yet been measured, so LHC scientists called it the “Higgs-like” particle for a while.
Wu recalls the moment:

“At 3:00pm [on June 25, 2012], there was a commotion in the Wisconsin corridor on the ground floor of CERN Building 32. My graduate student Haichen Wang was saying loudly, ‘Haoshuang is going to announce the Higgs discovery!’ Our first reaction was that it was a joke; thus when we entered Haoshuang’s office, we all had smiles on our faces. Those smiles suddenly became much bigger when we got to look at the result of Haoshuang’s combination: It showed the 5.08σ close to the Higgs mass of 125 GeV/c². Pretty soon, cheers were ringing down the Wisconsin corridor. ATLAS had a discovery!”
The Higgs-like announcement from ten years ago has since been confirmed to be the Higgs particle. Several years later, work by Dasu’s group saw the Higgs decay into tau leptons, providing the first evidence of the particle coupling to matter particles, not just to bosons.
On the ten-year anniversary, both ATLAS and CMS collaborations published summaries of their findings to date and future directions. Experimental questions still being addressed include continuing to measure higher-precision interactions between the Higgs and particles it has already been observed to interact with, and detecting previously-unobserved interactions between the Higgs and other particles.
“One big question that immediately comes to my mind is the mass problem. The breakthrough generated by the Higgs discovery was that elementary particles acquire their masses through the Higgs particle,” Wu writes in her Physics Today essay. “A deeper question that needs to be answered is how to explain the values of the individual masses of the elementary particles. In my mind, this mass problem remains a big topic to be explored in the years to come.”
“Another one of the big things that we’re looking for in future data is to understand the Higgs potential,” Black says. “Right now, by measuring the mass, we’ve only measured right around its ground state, and that has great implications for the stability of our universe.”
Also on the ten-year anniversary, CERN announced that the LHC — which had been shut down for three years to work on upgrades — was ready to again start delivering proton collisions at an unprecedented energy of 13.6 TeV in its third round of runs. It is expected that the ATLAS and CMS detectors will record more collisions in this upcoming run than in the previous two combined.
The LHC program is scheduled to run through 2040, and the UW–Madison scientists who are part of the CMS and ATLAS collaborations will almost certainly continue to play key roles in future discoveries.
UW–Madison’s current CMS collaboration members include Kevin Black, Tulika Bose, Sridhara Dasu, and Matthew Herndon, and their research groups. Current ATLAS collaboration members include Kyle Cranmer and Sau Lan Wu and their research groups.
Study of high-energy particles leads PhD student Alex Wang to Department of Energy national lab
In 2012, scientists at CERN’s Large Hadron Collider announced they had observed the Higgs boson particle, verifying many of the theories of physics that rely on its existence.
Since then, scientists have continued to measure the properties of the Higgs boson and to search for related processes, including an extremely rare one in which two Higgs boson particles appear at the same time, called di-Higgs production.
“We’ve had some searches for di-Higgs right now, but we don’t see anything significant yet,” said Alex Wang, a PhD student in experimental high energy physics at UW–Madison. “It could be because it doesn’t exist, which would be interesting. But it also could just be because, according to the Standard Model theory, it’s very rare.”
Wang will have a chance to aid in the search for di-Higgs production in more ways than one. Starting in November, he will spend a year at the SLAC National Accelerator Laboratory as an awardee in the Department of Energy Office of Science Graduate Student Research Program.
The program funds outstanding graduate students to pursue thesis research at Department of Energy (DOE) laboratories. Students work with a DOE scientist on projects addressing societal challenges at the national and international scale.
At the SLAC National Accelerator Laboratory, Wang will primarily work on hardware for a planned upgrade of the ATLAS detector, one of the many detectors that record properties of collisions produced by the Large Hadron Collider. Right now, ATLAS collects an already massive amount of data, including some events related to the Higgs boson particle. However, Higgs boson events are extremely rare.
In the future, the upgraded High-Luminosity Large Hadron Collider (HL-LHC) will enable ATLAS to collect even more data and help physicists to study particles like the Higgs boson in more detail. This will make it more feasible for researchers to look for extremely rare events such as di-Higgs production, Wang said. The ATLAS detector itself will also be upgraded to adjust for the new HL-LHC environment.
“I’m pretty excited to go there because SLAC is essentially where they’ll be assembling the innermost part of the ATLAS detector for the future upgrade,” Wang said. “So, I think it’s going to be a really central place in the future years, at least for this upgrade project.”
Increasing the amount of data a sensor collects can also cause problems, such as radiation damage to the sensors and more challenges sorting out meaningful data from background noise. Wang will help validate the performance of some of the sensors destined for the upgraded ATLAS detector.
“I’m also pretty excited because for the data analysis I’m doing right now, it’s mainly working in front of a computer, so it will be nice to have some experience working with my hands,” Wang said.
At SLAC, he will also spend time searching for evidence of di-Higgs production.
Wang’s thesis research at UW–Madison also revolves around the Higgs boson particle. He sifts through data from the Large Hadron Collider to tease out which events are “signals” related to the Higgs boson, versus events that are “backgrounds” irrelevant to his work.
One approach Wang uses is to predict how many signal events researchers expect to see, and then determine if the number of events recorded in the Large Hadron Collider is consistent with that prediction.
“If we get a number that’s consistent with our predictions, then that supports the existing model of physics that we have,” Wang said. “But for example, if you see that the theory predicts we’d have 10 events, but in reality, we see 100 events, then that could be an indication that there’s some new physics going on. So that would be a potential for discoveries.”
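The counting comparison Wang describes can be sketched with toy Poisson statistics. This is a simplified illustration, not his actual analysis; real LHC searches account for systematic uncertainties and much more:

```python
import math

def poisson_pvalue(n_obs, n_exp):
    """One-sided p-value: probability of seeing n_obs or more events
    when n_exp are expected, assuming simple Poisson counting."""
    pmf = math.exp(-n_exp)          # P(N = 0)
    cdf = 0.0
    for k in range(n_obs):          # accumulate P(N < n_obs)
        cdf += pmf
        pmf *= n_exp / (k + 1)      # recurrence: P(k+1) = P(k) * mu/(k+1)
    return max(0.0, 1.0 - cdf)

# Prediction matches observation: no tension with the standard model
print(poisson_pvalue(12, 10.0))    # ~0.30, entirely unremarkable
# Wang's hypothetical: predict 10 events, observe 100
print(poisson_pvalue(100, 10.0))   # essentially zero -> possible new physics
```

A small p-value alone does not establish a discovery, but it is the kind of quantitative tension that flags a place to look for new physics.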
The Department of Energy formally approved the U.S. contribution to the High-Luminosity Large Hadron Collider accelerator upgrade project earlier this year. The HL-LHC is expected to start producing data in 2027 and continue through the 2030s. Depending on what the future holds, Wang may be able to use data from the upgraded ATLAS detector to find evidence of di-Higgs production. If that happens, he also will have helped build the machine that made it possible.
Does the behavior of the Higgs boson match the expectations?
Note: This story has been modified slightly from the original, which was published by the CMS Collaboration. Their version has some nice interactive graphics to check out, too!
The standard model of particle physics is our current best theory of the most basic building blocks of the universe, the elementary particles, and the interactions among them. At the heart of the standard model is a hypothesis describing how all the elementary particles acquire mass. Importantly, this scheme also predicts the existence of a new type of particle, the Higgs boson. It took nearly 50 years from its postulation to its observation at the LHC experiments at CERN. The Higgs boson, the only scalar particle known to date, is strongly believed to hold a key to some of the questions the standard model cannot answer, so a detailed study of its properties is the order of the day.

One of the essential observables, especially at the LHC, is the production cross section of an unstable particle. In experiments, this cross section is estimated from the number of events, over a given amount of time, in which the transient particle appears in a specific decay final state. The standard model predicts both the cross section for Higgs boson production and the decay rates very precisely. The frequency distribution of a given type of event, as a function of variables measured in the experiment, reveals aspects of the underlying interactions that are lost in the summed, or total, cross section. Measurement of this differential cross section is therefore a powerful tool to vindicate the standard model, and any deviation from the standard model predictions in data would indicate the presence of New Physics.
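The idea of a binned differential cross section can be sketched as counts divided by efficiency, integrated luminosity, and bin width. Every number below (bin edges, counts, efficiency) is invented for illustration; only the 138/fb luminosity is roughly the size of the Run 2 dataset:

```python
def differential_cross_section(counts, bin_edges, lumi, efficiency):
    """Turn background-subtracted event counts per bin into a
    differential cross section dσ/dx ≈ N / (efficiency · L · Δx).
    With lumi in 1/pb, the result is in pb per unit of x."""
    dsigma = []
    for i, n in enumerate(counts):
        width = bin_edges[i + 1] - bin_edges[i]
        dsigma.append(n / (efficiency * lumi * width))
    return dsigma

# Hypothetical Higgs transverse-momentum bins (GeV) and event counts;
# 138_000 / pb corresponds to roughly the 138/fb of Run 2.
edges = [0, 45, 80, 120, 200, 350, 450]
counts = [940, 620, 380, 210, 60, 12]
print(differential_cross_section(counts, edges, lumi=138_000, efficiency=0.5))
```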
The Higgs boson is roughly 125 times more massive than a proton and decays to lighter particles, in some cases through cascade processes. Physicists typically use the signatures of stable particles in the detector to trace back suitable decay chains of the Higgs boson. The tau lepton is the heaviest lepton known so far, and as such it is the lepton with the strongest ties to the Higgs boson. The probability of a Higgs boson decaying to a pair of tau leptons is reasonably high (about 6%) compared, for example, to a pair of muons (about 0.02%). But the tau lepton is itself unstable and decays quickly to lighter particles, always accompanied by its partner, the tau neutrino. Often the decay products of the tau lepton are hadrons, producing a shower of particles, or jet, in the calorimeter system. The tau neutrino goes undetected, limiting the accuracy of the tau lepton energy measurement. It is thus interesting to study the detailed characteristics of Higgs boson events through decays to tau leptons, whose rest mass is only about 1.4% that of the parent.
A recent study from the CMS Collaboration focuses on events where the Higgs boson decays into a pair of tau leptons, using data collected by the experiment between 2016 and 2018. The analysis measures the Higgs boson production cross section as a function of three key variables: the Higgs boson momentum in the direction transverse to the beam, the number of jets produced along with the Higgs boson, and the transverse momentum of the leading jet. New Physics could manifest as an excess of events in the frequency distribution of these variables when compared with the standard model predictions.
Says Andrew Loeliger, a UW–Madison physics grad student and one of the lead authors on the study:
The Higgs Boson is the most recent addition to the standard model of particle physics, discovered jointly between the CMS and ATLAS collaborations in 2012, so a big goal of the High Energy Physics field is to make very detailed measurements of its properties, to understand if our predictions are all confirmed, or if there is some kind of new physics or strange properties that might foreshadow or necessitate further discoveries. This work provides, what amounts to, a very fine grained consistency check (alternatively, a search for deviations in the amount) that the Higgs Boson is produced with the amounts/strengths we would expect when categorizing alongside some second interesting property (the transverse momentum of the Higgs Boson is a big one). This type of analysis had not been performed before using the particles we used, so it may open the door for far more precise measurements in places we may not have been able to do before, and a better overall confirmation of the Higgs Boson’s properties.
Other UW–Madison researchers involved in the study include former postdoc Cecile Caillol and Profs. Tulika Bose and Sridhara Dasu.
The analysis employs deep neural networks to exploit a variety of tau lepton properties simultaneously, identifying taus with high efficiency. Then, to ensure that the selected tau lepton pair comes from the decay of a Higgs boson, and to discard pairs from other processes such as Z boson decay, the mass of the selected tau pair (m𝝉𝝉) is scrutinized. Reconstructing m𝝉𝝉, after accounting for the neutrinos involved in the decay as mentioned earlier, requires a dedicated algorithm that computes, for each event, a likelihood function P(m𝝉𝝉) to quantify the level of compatibility with a Higgs boson process.
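The role of the m𝝉𝝉 likelihood can be illustrated with a toy stand-in. The Gaussian shapes and widths below are invented; the actual CMS algorithm builds a far more detailed P(m𝝉𝝉) from the kinematics of each event:

```python
import math

def mass_likelihood(m, peak, width):
    """Toy Gaussian compatibility of a reconstructed di-tau mass
    with a resonance hypothesis (invented widths, illustrative only)."""
    return math.exp(-0.5 * ((m - peak) / width) ** 2)

def classify(m_tautau):
    # Compare compatibility with the Z boson (about 91 GeV) and the
    # Higgs boson (about 125 GeV); widths mimic detector resolution.
    z_like = mass_likelihood(m_tautau, 91.0, 12.0)
    h_like = mass_likelihood(m_tautau, 125.0, 15.0)
    return "higgs-like" if h_like > z_like else "z-like"

print(classify(95.0))   # near the Z peak -> "z-like"
print(classify(128.0))  # near the Higgs peak -> "higgs-like"
```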
The Higgs boson typically has more transverse momentum, or boost, when produced in conjunction with jet(s) than when it is produced singly. One such event, collected by the CMS detector in 2018 and shown in Figure 1, could correspond to such a boosted Higgs boson decaying to two tau leptons which, in turn, decay hadronically. However, several less interesting processes could also produce such an event and pose as backgrounds. Their contributions have been measured mostly from the data itself by carefully studying the properties of the jets. Figure 2 shows the good agreement in the m𝝉𝝉 distribution between the prediction and the data collected by the CMS experiment for events with the transverse momentum of the Higgs boson below 45 GeV; the contribution from the Higgs boson process is hardly noticeable due to the overwhelming background. On the other hand, Figure 3 presents the m𝝉𝝉 distribution for events with a highly boosted Higgs boson, with transverse momentum above 450 GeV. Selecting only events with high boost greatly reduces the total number of available events, but significantly improves the fraction of signal events in the collected sample. The data agree with the sum of the predicted contributions from the Higgs boson and all the standard model background processes.
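The trade-off between event count and signal purity under a boost requirement can be sketched with toy data. The sample below is entirely fabricated; it only mimics the pattern of abundant low-pT background and rare high-pT signal:

```python
def purity_and_yield(events, pt_cut):
    """Count events passing a transverse-momentum cut and compute
    the fraction of them that are signal. `events` holds
    (pt_in_GeV, is_signal) pairs."""
    selected = [is_sig for pt, is_sig in events if pt > pt_cut]
    if not selected:
        return 0, 0.0
    return len(selected), sum(selected) / len(selected)

# Fabricated sample: mostly low-pT background, a little signal,
# and a handful of high-pT events where signal is relatively common.
toy = ([(20.0, False)] * 900 + [(30.0, True)] * 10
       + [(500.0, False)] * 5 + [(480.0, True)] * 5)

print(purity_and_yield(toy, 0))    # (920, ~0.016): many events, low purity
print(purity_and_yield(toy, 450))  # (10, 0.5): few events, higher purity
```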
This CMS result presents the first-ever measurement of the differential cross sections for Higgs boson production in the decay to a pair of tau leptons. Run 2 data are allowing us to scrutinize the Higgs boson in the tau lepton decay channel, which was only observed a few years back. Future comparison and combination of all Higgs boson decay modes will offer better insights into the interactions of the Higgs boson with different standard model particles. But the story does not end here! Run 3 of the LHC machine is just around the corner, and, looking further into the future, the high luminosity operation (the HL-LHC) will offer a huge increase in data volume. That could provide hints of whether the discovered Higgs boson is the one predicted by the standard model, or whether a new interaction involving another fundamental particle contributes to such measurements. That would indeed point to New Physics!
CMS group publishes new study on lepton flavor in Higgs boson decays
Neutrinos mix and transform from one flavor to another. So do quarks. However, the electron and its heavier cousins, the muon and the tau, seem to conserve their flavor identity. This accidental conservation of charged lepton flavor must either have a profound reason, or low levels of violation of that conservation principle should occur at high energy scales. So far, however, evidence for any charged lepton flavor violation remains elusive.
The CMS group recently published a new study on lepton flavor in Higgs boson decays. At UW–Madison, the effort was led by Sridhara Dasu and postdoctoral researcher Varun Sharma, building off of work done by former postdoctoral researcher Maria Cepeda and former graduate student Aaron Levine.
The international CMS collaboration recently published a news story about this new study. Please read the full story here.