Federico Carminati, computer physicist at CERN openlab, talks to Tushna Commissariat about career opportunities in high-energy physics and computing
As we celebrate the 30th anniversary of the World Wide Web, the spotlight is on pioneering computer scientist Tim Berners-Lee, who developed the concept at the CERN particle-physics lab near Geneva. Berners-Lee originally studied physics at the University of Oxford, and he is far from the only physicist to have had a fruitful career in computing. Just ask Federico Carminati, who is currently the chief innovation officer at CERN openlab. This is a public–private network that links CERN with other research institutions – as well as leading information and communications technology companies such as Google, IBM and Siemens – to investigate the potential applications of machine learning and quantum computing to high-energy physics.
Carminati had a keen interest in the natural sciences from childhood – he begged his parents for a microscope, and wanted to study animals – but his interest in physics was piqued towards the end of high school, thanks to his mathematical prowess. He graduated from the University of Pavia, Italy, in 1981 with a master’s degree in physics. Strangely, there were no Italian universities offering PhDs in physics at the time. “So, I decided to start working immediately. My first job was at the Los Alamos National Laboratory, where I worked as a high-energy physicist, on a muon-decay experiment,” Carminati says.
He spent a year working at Los Alamos, before his contract ended and was not renewed. “I began writing a number of letters looking for a job and, at the encouragement of my wife, I wrote to Nobel-prize-winning physicist Samuel Ting. Honestly, it was a very long shot and I didn’t think I was going to receive any answers,” he recalls. Luckily, the eminent physicist found Carminati’s CV interesting and wrote to ask him whether he was “better” at computing or hardware. “I said I was better in computing. So he put me in contact with the California Institute of Technology,” where Carminati spent the next year, before being hired by CERN in a computing role.
Carminati has been at CERN since 1985, where he has held a variety of jobs, the first of which was at the CERN Program Library, which handles the organization’s data. The library essentially started as a collection of programs written for physicists at CERN experiments. “But it became a worldwide standard for computing in high-energy physics,” Carminati says. “My task was to co-ordinate the development of this very large piece of code, and to distribute it. This was before the Web existed, so distributing it meant shipping large reel tapes of data.”
Later, Carminati became responsible for one specific part of this library – the GEANT detector-simulation program. The idea was to carry out detailed and precise simulations of how the very high-energy experiments they hoped to run in the future would behave in actual detectors. Carminati worked on this until 1994. “I then decided to join the small team that was set up by the CERN director and Nobel-prize-winner Carlo Rubbia, who decided to start working on the design of a new kind of reactor that would combine the technology of nuclear-power reactors and of high-energy accelerators.” Carminati worked as part of this small team for the next four years, which proved interesting even though the team’s prototype never saw the light of day.
From 1998 to 2012, Carminati worked on the ALICE experiment, one of the four main detectors at the Large Hadron Collider (LHC). Among his roles was that of computing co-ordinator, which meant that he was in charge of designing, developing and co-ordinating the computing infrastructure for this experiment. “I was also very involved in the development of CERN’s computing grid,” he says.
Formidable requirements
Launched in 2002, CERN’s Worldwide LHC Computing Grid was a pioneering concept, as it allowed physicists across the globe to exploit the many petabytes of data generated each day when the LHC is running. It was key in allowing researchers to pin down the Higgs boson in 2012. While this global collaboration of computer centres is well established today – connecting more than 8000 physicists to thousands of computers in some 190 centres across 45 countries – setting it up was a gargantuan task.
“We were asked to put down the computing requirements for the LHC on paper, in the late 1990s, and it emerged that they were so formidable that we were nearly accused of sabotaging the project,” exclaims Carminati, who says that it seemed as though the computing power needed was far beyond the funding provided to CERN. It was equally impractical to build and host such a large computing centre at the European lab.
The idea emerged to harness all of the computing facilities of the different laboratories and universities across the world that were already involved in the LHC, and integrate them into a single computing service. “Nowadays, everybody is talking about cloud computing, but at the time it was little more than science fiction,” says Carminati. “The interesting thing was that the funding agencies agreed to give us this computing power. But they wanted to make local investments by helping create centres of computing excellence in all the different countries,” he says, explaining that funders hoped that these centres would hire home-grown people and develop know-how in information technology locally. “It was a fantastic adventure because I travelled to places such as South Africa, Thailand and Armenia to help them set up computer centres.”
It was not all smooth sailing, though, as Carminati encountered “a tremendous amount of negotiation, and it took a lot of hard work to sell the concept to local politicians”. Carminati says a particular highlight was negotiating the South Korean computer centre for ALICE. “I had to work with the country’s ministry of science and education, and the local scientists too. I also did the same for India.”
Carminati’s role also involved helping users to exploit the computing power once it was established. “I was co-ordinator for ALICE’s experimental computing infrastructure between 1998 and 2012, which was much too long if you ask me. It was fun and creative until the LHC was switched on. When the machines started, it was exhausting. When the LHC is running we are in ‘production mode’,” says Carminati. “You have to be ready to process the vast amounts of data. It becomes a very complex organizational task where you have to co-ordinate hundreds of developers, distributed around the world, providing software to a central repository, all on a very tight time schedule. There are some very hard choices you have to make, when it comes to time versus quality, and the whole thing is tough and exhausting.”
Carminati spent 2013–2017 working on GEANT again, improving the simulation program’s performance on new computing architectures and developing the new generation of code used to simulate particle transport at CERN. During that time he also managed to obtain his PhD, from the University of Nantes in France.
Open for business
Now based at CERN openlab, Carminati explains that computing technologies are currently evolving so fast that evaluating them only once they are on the market is “not good enough”. CERN openlab is one of the few units at the European lab explicitly carrying out research into computing.
The aim is for CERN to reach out to high-energy physics users, as well as commercial users, to highlight techniques developed in-house, while also collaborating on projects with other institutions. For example, CERN openlab is currently working with Unosat, the UN’s technology platform that deals with satellite imagery and analysis, which has been hosted at CERN since 2002. One of their joint projects is to track the movement of large refugee populations across the globe and estimate how many people are at any given refugee camp – places that can often be difficult or even dangerous to reach. One way to assess population density is to count the tents at a camp. “We are experts in machine learning and artificial intelligence,” says Carminati, “so we are working with Unosat to develop programs to automatically count the tents in satellite pictures.”
Another planned collaboration is with Seoul National University’s Bundang Hospital in South Korea, which Carminati says has a “fantastic health information system and patient records, from many years”. CERN openlab is trying to find the resources to begin a project with the hospital to “use machine learning to see whether we can correlate the classifications that artificial intelligence can make of patient records with actual diagnoses”. The idea is to find out if a machine-learning system could learn to make a diagnosis of its own, or pick up people who may have a “double diagnosis” – those with two diseases that are always coupled – thereby making a case for creating a new diagnostic category.
When it comes to the impact of quantum computing on the future of high-energy physics, Carminati is convinced that CERN must start thinking about tomorrow’s computing technology today. “It is so important to explore this, because 10 years from now we will have a shortage of a factor of 100, when it comes to computing time.” The other vital issue is the amount of data being taken at the LHC, and any future colliders, as they search for physics beyond the Standard Model. “We are now looking for something very subtle. With the Higgs we said that we were looking for a needle in a haystack. Now the new game is that we will make a stack of needles, and then look within it for the odd one out.”
This means that particle physicists will be taking and classifying an incredible amount of data, and then processing it with extreme precision, all of which will require an increase in computing power. “We may be increasing the amount of data that we take and the quality of the detectors. But we cannot expect our computing budget to be increased by a factor of 100,” says Carminati. “We are just going to have to find new sources of very fast computing, and quantum computing is a strong candidate.”
While he is clear that none of today’s quantum computers are anywhere near that mark, he believes that quantum computing will mature, partly thanks to investment by industry. “Whenever this happens, I think we have to be ready for it, to exploit it as best we can. We would be able to use a quantum computer across the board – from simulation and detector construction, through to data analysis and computing speed-ups. It is very important to start developing our programs and software now.”
Carminati points out that, were the quantum revolution to arrive, scientists would have to completely rewrite their codes, as he believes there is no such thing as a universal quantum computer. “Can we have software that is agnostic of the specific kind of computing that we are using? We will have to develop a new angle, and so this is a large part of our research.” Last November Carminati organized the first-ever workshop on quantum computing in high-energy physics at CERN, to get a head start on these very issues.
Outside the box
All of this means that today’s physics graduates will have a wide variety of job opportunities across many fields. “A physicist is trained to creatively solve complex problems using mathematics, with a lot of thinking outside the box,” says Carminati. “We train so many physicists at CERN, and sometimes it is frustrating to see them leave, but we cannot keep everybody – this we know. Our consolation is knowing that we’re giving them a skillset that is really applicable to many other research fields, and across industry.”
Today, there is a global hunger for machine-learning and quantum-computing experts, with countries from the US to India and China looking to train and develop such expertise. Carminati has a very optimistic outlook for today’s graduates who may be considering one of these fields. “Try to have as much constructive fun as you can in doing your research, because it’s a fascinating job.”