NSF to fund new $25M software institute to enable discoveries in high-energy physics


A data visualization from a simulation of a collision between two protons at the High-Luminosity Large Hadron Collider (HL-LHC). On average, up to 200 collisions will be visible in the collider’s detectors at the same time. ATLAS Experiment/CERN

At CERN’s Large Hadron Collider along the French-Swiss border, discovering new physics involves accelerating and smashing together beams of particles. These experiments have uncovered new particles, like the Higgs boson, and future experiments promise even more smashing revelations.

But to make these discoveries, scientists must meticulously comb through data on billions of particle collisions to find those rare, “interesting” and never-before-seen events that could reveal new layers to the Standard Model of particle physics — or upend it entirely. Data analysis on such a scale is already no easy task: Within a decade, after upgrades to the LHC boost its data-collection capacity by up to a factor of 100, the resulting flood of data will overwhelm today’s software tools and algorithms.

To create the cyberinfrastructure needed to support the next generation of high-energy physics research, on Sept. 4 the National Science Foundation announced the creation of the Institute for Research and Innovation in Software for High Energy Physics, or IRIS-HEP. The institute, to be based at Princeton University, is a coalition of 17 research institutions, including the University of Washington, and will receive $25 million from the NSF over five years.

“The primary goal of this institute is to change the way we look at data analysis in particle physics,” said UW physics professor Gordon Watts, who serves as deputy director of the institute.

IRIS-HEP will develop computing software and expertise to enable a new era of discovery at the Geneva-based LHC, the world’s most powerful particle accelerator.

Over the next eight years, the LHC will receive a series of major upgrades to its sensors and other instruments, a project known as the High-Luminosity Large Hadron Collider, or HL-LHC. The HL-LHC experiments will look for dark matter and, more generally, search for new particles, interactions and physical principles. The $25 million in funding over five years for IRIS-HEP will drive innovations in data analysis and algorithms essential to handling the massive amounts of data generated by the HL-LHC.

A 2008 aerial image of the LHC site, which straddles the border between France and Switzerland, with major LHC and CERN installations outlined and labeled. CERN

“This is really big data with a capital B-I-G,” said Peter Elmer, director of the institute and a senior computational physicist at Princeton University. “This huge increase in data is needed to find the extremely rare ‘needle in a haystack’ signals that could indicate the presence of new physics phenomena. But to fully explore this data, we need much more powerful software tools and algorithms. We also need to maximally exploit the evolving high-performance computing landscape and new tools like machine learning, in which computers study existing data sets to learn rules that they can apply to new data and new situations.”
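Elmer’s description of machine learning — computers studying existing data to learn rules they can apply to new data — can be illustrated with a toy example. The sketch below learns a simple cut on one measured quantity from labeled events and applies it to unseen events; all numbers and names here are invented for illustration and are not part of any IRIS-HEP software.

```python
# Toy illustration of "learning a rule from existing data": fit a
# one-dimensional threshold classifier on labeled events, then apply
# the learned rule to new, unseen events.

def fit_threshold(values, labels):
    """Pick the cut on `values` that best separates label 0 from label 1."""
    best_cut, best_correct = None, -1
    for cut in sorted(values):
        # Count how many training events this cut classifies correctly.
        correct = sum((v >= cut) == bool(y) for v, y in zip(values, labels))
        if correct > best_correct:
            best_cut, best_correct = cut, correct
    return best_cut

# "Existing data": a measured quantity per event, with known labels
# (1 = signal-like event, 0 = background).
train_values = [0.2, 0.4, 0.5, 1.1, 1.3, 1.6]
train_labels = [0, 0, 0, 1, 1, 1]

cut = fit_threshold(train_values, train_labels)

# "New data": classify unseen events with the learned rule.
new_events = [0.3, 1.5]
predictions = [int(v >= cut) for v in new_events]
print(cut, predictions)  # learned cut and labels for the new events
```

Real analyses use far richer models than a single threshold, but the pattern is the same: parameters are tuned on known data, then reused to sift new collisions.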

Projects at IRIS-HEP will include developing machine-learning methods to process data from HL-LHC experiments, novel data-storage systems, techniques to transfer collision data from Switzerland to partner institutions across the globe, and algorithms to analyze data provided by different instruments in the collider. UW research within IRIS-HEP will focus on distributed computing, a computational approach that divides a task or problem among multiple computers to solve it faster. Institute funds will help support several UW doctoral students and postdoctoral researchers in these endeavors, Watts said.
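The divide-and-conquer idea behind distributed computing can be sketched in a few lines. The example below splits a large batch of events into chunks and processes them in parallel worker processes on one machine; a real distributed system would ship the chunks to computers across a network, but the split/process/combine pattern is the same. The event data and analysis function here are invented stand-ins.

```python
# Minimal sketch of distributed-style processing: split work into
# chunks, process chunks in parallel, then combine the partial results.
from multiprocessing import Pool

def analyze_chunk(chunk):
    """Stand-in for per-event analysis: sum the event weights in one chunk."""
    return sum(chunk)

def split(events, n_chunks):
    """Divide the event list into roughly equal pieces."""
    size = (len(events) + n_chunks - 1) // n_chunks
    return [events[i:i + size] for i in range(0, len(events), size)]

if __name__ == "__main__":
    events = [float(i) for i in range(1_000)]  # fake event weights
    with Pool(processes=4) as pool:
        # Each worker process analyzes one chunk independently.
        partial_results = pool.map(analyze_chunk, split(events, 4))
    total = sum(partial_results)  # combine the workers' answers
    print(total)  # matches sum(events), but computed in parallel
```

The payoff is that each chunk can be analyzed without waiting on the others, so adding more workers (or more machines) shortens the time to the combined answer.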

“These upgrades to the LHC are the equivalent of improving the resolving power of a microscope by a factor of 10 or 100,” said Watts. “Now, we need to upgrade the camera and image processing to account for that better data — and that is where the research at IRIS-HEP comes in.”

Together, these efforts are intended to prepare physicists for 2026, when the HL-LHC upgrades are expected to be completed and the collider will begin producing up to 100 times more collision events than it does today.

“With these new analysis tools in place when the High-Luminosity Large Hadron Collider experiments begin, we’ll be ready when the data start to roll in,” said Watts. “Then, we can let physicists concentrate on physics, and not have to worry about the computer science.”

In addition to the UW and Princeton, the IRIS-HEP project will include participants from Cornell University; Indiana University; MIT; New York University; Stanford University; the University of California, Berkeley; the University of California, San Diego; the University of California, Santa Cruz; the University of Chicago; the University of Cincinnati; the University of Illinois at Urbana-Champaign; the University of Nebraska-Lincoln; the University of Michigan; the University of Puerto Rico, Mayagüez Campus; and the University of Wisconsin-Madison.
