23 February 2011
2011: Let's go!
The LHC's annual Chamonix meeting took place at the end of last month and, aside from the concerning lack of fresh fluffy snow, there were some very serious points up for discussion. In the wake of the meeting, we now know: The LHC will postpone its long refurbishment shutdown for an extra year, until 2013; and, for 2011 at least, the beam energy will stay fixed at 3.5 TeV.
Each of the experiments presented their case for whether or not the machine should increase beam energies to 4 TeV for the coming year. ATLAS's Bill Murray reported that all the LHC experiments were in favour, on the condition that it would not compromise the machine. ATLAS and CMS studies show, he said, that running at 4 TeV would offer a gain equivalent to 20 per cent more Higgs candidates in a search for a Higgs with a mass of around 120 GeV, and almost 100 per cent more signal events for certain exotic and SUSY particle searches.
Whilst upping the energy was judged positive for the experiments' physics programmes, a number of in-depth presentations from the machine side gave everyone pause. Detailed simulations of different quench scenarios concluded that if the machine runs with beam energies of 4 TeV, the chance of a burnout similar to that of 2008 is of the order of 0.5 per cent.
That seems pretty small but, as ATLAS's Herman ten Kate summed up well, not when you consider the risk being proposed: you would not, he pointed out, board a plane if forewarned that there was a half per cent chance it would crash. It's just not a chance worth taking. CERN, the experiments and the machine were agreed: this year the LHC will run at 3.5 TeV.
“I don't feel this is an important limitation,” says ATLAS Physics Coordinator Aleandro Nisati. Although higher energy beams would extend the range of masses that could be explored for new physics, precious data-taking time could be lost to setting up the new energy, resulting in less data collected by the end of the year. “I think that the physics potential of a 7 TeV machine delivering high integrated luminosity is really very important. And I don't think that we are paying that much by not going to 8 TeV.”
The second big decision, whether to run in 2012, followed on neatly from the first. With five inverse femtobarns of data, the whole mass spectrum where the Higgs is expected to lie could be searched for statistical 'evidence': the first hints of the particle. The Tevatron's decision not to run beyond 2011 puts it out of the running for an outright Higgs discovery, but CERN is well aware that the Tevatron could still be in a position to claim 'evidence' of the particle before the year is out.
“It's clear that the race with Fermilab is not yet finished,” says ATLAS Deputy Run Coordinator Martin Aleksa. In light of the Tevatron's decision to shut up shop, the argument for the LHC to run in 2012 is no longer based on competition for a full-blown Higgs discovery. Nevertheless, says Martin: “It became clear that five to ten inverse femtobarns is something we really ought to get before we go to a long shutdown, because we have seen that this amount of data will give us significant new insights.” As a conservative estimate, the LHC folks are projecting an integrated luminosity of one inverse femtobarn for the next year, although if things progress well this could rise to two, three, or maybe even four, at a stretch. For the magic five, though, running in 2012 was deemed necessary. “To be sure that LHC can really say something about the Higgs before the long shutdown, I think we need these two years,” says Martin.
“The prospect of having certainly one inverse femtobarn of integrated luminosity by the end of this year, and significantly more by the end of next year, is extremely attractive and important,” says Aleandro. “Other than the Higgs, we have a very rich plan of physics studies, in particular searches for new physics: SUSY, new heavy bosons, leptoquarks. The most sensitive channels will be permanently under observation; we'll be continuously watching the data to see if something new, funny, or interesting appears. But we'll also continue to explore the new data as much as we can at 360°.”
Beams are due back in the machine before the start of March, and a swift ramp-up should see the machine filled with a record 936 bunches and peak instantaneous luminosities of the order of 10³³ cm⁻²s⁻¹ by mid-May. “If we then run for the 130 days which are now foreseen for proton physics, we could collect something like three inverse femtobarns of data this year,” says Martin. During the ramp-up, a final decision will be taken on the bunch spacing: beams will begin at 75-nanosecond intervals, possibly progressing to 50-nanosecond spacing, meaning a total of 1404 bunches in the machine by the end of the intensity ramp. The beams will also be more focussed within the experiments this year, due to more intense squeezing.
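As a back-of-envelope check, Martin's three-inverse-femtobarn figure follows from the peak luminosity and the run length once you fold in how far real running falls short of continuous peak-luminosity operation. The duty factor below is an illustrative assumption, not a number from the meeting:

```python
# Rough integrated-luminosity estimate from the figures quoted above.
# The effective duty factor (covering fill turnaround, luminosity decay
# during fills, and machine downtime) is an assumed value.

PEAK_LUMI = 1e33       # cm^-2 s^-1, peak instantaneous luminosity
DAYS = 130             # days foreseen for proton physics
DUTY_FACTOR = 0.27     # assumption: effective fraction of peak running

seconds = DAYS * 86400
integrated_cm2 = PEAK_LUMI * seconds * DUTY_FACTOR  # in cm^-2
integrated_fb = integrated_cm2 / 1e39               # 1 fb^-1 = 1e39 cm^-2

print(f"~{integrated_fb:.1f} inverse femtobarns")   # roughly 3 fb^-1
```

With a duty factor anywhere in the 0.2 to 0.3 range, the estimate lands between two and three and a half inverse femtobarns, which is consistent with both the conservative one-inverse-femtobarn projection and Martin's more optimistic figure.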
“ATLAS is built for much higher peak luminosities than we expect for the coming year. However, every experiment needs a certain ramp-up phase where we're learning,” says Martin. “If we really go to 10³³ in the next few months, this is a rather quick ramp-up and we'd better anticipate anything that can come up.”
The looming problem of pile-up, with up to 15 proton interactions per bunch crossing (of which there will be around a million per second) compared to last year's average of four, as well as the question of which objects to trigger on at higher luminosities, has been occupying the Trigger group for months now. They have been working pre-emptively on the new Trigger Menu since last October. “Of course, it was in good hands, and this work is almost finished,” says Martin.
There is also an ongoing effort to maximise the recording rate: how many events per second can actually be written to disk. At luminosities of 10³³, ATLAS's baseline recording rate of 200 Hertz would already mean giving up some good events. To avoid compromising physics, the plan is to optimise the mix of different data types stored on disk, so that a larger number of events can be stored for offline studies.
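The trade-off here is a simple one: at a fixed disk-write budget, recording more events per second means storing less data per event, which is why slimming the stored data types buys recording rate. The per-event size and bandwidth budget in this sketch are illustrative assumptions, not ATLAS figures; only the 200 Hertz baseline comes from the text:

```python
# Illustrative trade-off between recording rate and per-event size at a
# fixed disk-write budget. Bandwidth and event sizes are assumed numbers.

BANDWIDTH_MB_S = 300.0  # assumption: sustained write budget in MB/s

def max_rate_hz(event_size_mb: float) -> float:
    """Events per second that fit within the write budget."""
    return BANDWIDTH_MB_S / event_size_mb

print(max_rate_hz(1.5))  # full-size events: 200 Hz (the baseline)
print(max_rate_hz(1.0))  # slimmer stored format: 300 Hz
```

Under these assumed numbers, trimming a third of each event's stored size raises the affordable recording rate from 200 to 300 Hertz, which is the spirit of the optimisation described above.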
Over in the Control Room, things will soon be looking lively too. All the subsystems migrated to the new TDAQ software version over the winter break and, kinks now ironed out, they've just begun running together in combined mode to make sure everything is ready for first collisions in mid-March.
The Control Room will gradually start filling up again over the next couple of weeks, although it won't get quite as busy as last year; ATLAS will start off with 13 shifters, down from last year's 16, and this can hopefully be further reduced as more is learned about the detector and more processes can become automatic. The newly-free desks will be reserved for experts, so there is always a spot for them in the room if they need to check on something. “Reducing is a slow process, I saw it myself,” laughs Martin.
Martin will take over from Benedetto Gorini as Run Coordinator at the end of the month, and Thilo Pauly will come in as his new Deputy. For now, Martin is all smiles about the exciting year ahead: “Expectations are high, because last year we had 93.6 per cent efficiency,” he says. “Let's see…”