Triggers under fire

30 November 2010

Trigger experts came together to thrash out improvements at the October workshop in Amsterdam



“In 2010 we – the Collaboration – had an easy life in terms of trigger … but times change, and next year we'll have to make some tough choices,” warns Trigger Menu Coordination group leader Olya Igonkina. With 2011 and its promise of high luminosity tiptoeing closer, the design of next year's trigger menu is a hot topic. Trigger and physics working groups are all engaged right now, trying to converge on the same thing: optimised triggers which produce the best possible signal to background ratio with the available computing resources.

“Before it was easy – no cutting into the high-pT physics,” says Olya. “We had of course to commission and understand [the trigger], but thanks to the CSC book effort … we knew exactly what to aim for, and what physics analyses we wanted to keep.” After getting the trigger off the ground, they had to move quickly with the leaping luminosity, tweaking and prescaling triggers on the fly to cope with the ever-changing data-taking landscape.
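Prescaling, mentioned above, keeps a busy trigger alive by recording only a fixed fraction of the events it fires on. A minimal sketch of the idea, purely illustrative (the class and names here are hypothetical, not ATLAS software):

```python
class PrescaleCounter:
    """Accept exactly one event in every `prescale` trigger firings.

    Real trigger systems apply prescales with deterministic counters
    like this, so the recorded rate is the raw rate divided by the
    prescale factor.
    """

    def __init__(self, prescale: int):
        self.prescale = prescale
        self.count = 0

    def accept(self) -> bool:
        self.count += 1
        if self.count == self.prescale:
            self.count = 0  # reset and accept this event
            return True
        return False
```

With a prescale of 10, a trigger firing at 2 kHz writes out only 200 Hz — the price being that nine in ten of its events are thrown away.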

By the end of 2010 proton-proton running, instantaneous luminosities were hitting values a few hundred thousand times greater than at the start of the data-taking period. Since the LHC wasn't operating at full capacity, data could be written at a rate almost double the nominal 200 Hz, with looser physics selections and Tier 0 using the gaps in the LHC beam time to catch up and process the excess.

This isn't a sustainable solution though. “Next year we expect up to 2 × 10³³ cm⁻²s⁻¹ instantaneous luminosity [an order of magnitude higher than the maximum achieved in 2010] and possibly 8 TeV running. This is a very big step forward for the trigger, and most physics analyses will be affected,” cautions Olya, adding: “A lot of work needs to be done before January to ensure that we record all valuable data and do not miss any discoveries.”

Here the physics groups and trigger groups represent two sides of the same optimisation coin; the former must work out what kind of triggers they want and where they really can't afford to compromise, and the latter must work out if these demands are realistic and how to cater for them. The two sides are in constant dialogue through the trigger menu coordination group, searching for the best solutions under the glare of increasing luminosity.

“The choice of the trigger threshold is dominated by the rate, not by physics,” says Diego Casadei, who coordinates the missing transverse energy (ET) trigger signature group. “Of course we dream about selecting good events, but if our trigger is taking too many events, it will simply be switched off because it is imposing too much dead time on the system.”

Diego works on a particularly tricky trigger, explains Olya: “Refining missing transverse energy at the High Level Trigger needs a large amount of data and must read the full calorimeter. This cannot be done at Level 2 due to limited time and resources, so the rate has to be cut already at Level 1 with a very high missing ET threshold.”

The problem with that, says Diego, is that even though most events have negligible real missing ET, fluctuations due to the measurement resolution at Level 1 generally induce some value for it. “This fake missing ET completely dominates the rate we see,” he explains.

The usual solution is to raise the threshold and only trigger on events with a missing ET above a higher value. But this also means losing more and more sensitivity to interesting physics. Imagine going to a party and speaking only to those people who are shouting the loudest. It certainly doesn't guarantee you the most interesting conversations with the most captivating people…
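The trade-off Diego describes can be seen in a toy model. If the mismeasured x- and y-components of the transverse energy fluctuate like independent Gaussians of width sigma, the fake missing ET follows a Rayleigh distribution, and the fraction of background events passing a threshold falls off as exp(-T²/2σ²). The numbers below are illustrative only, not real ATLAS calorimeter resolutions:

```python
import math
import random

def toy_fake_met_rate(threshold: float, sigma: float = 10.0,
                      n_events: int = 100_000, seed: int = 1) -> float:
    """Fraction of background events whose *fake* missing ET exceeds
    `threshold`, in a toy model where the Ex and Ey mismeasurements
    are independent Gaussians of width `sigma` (GeV)."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n_events):
        ex = rng.gauss(0.0, sigma)
        ey = rng.gauss(0.0, sigma)
        if math.hypot(ex, ey) > threshold:  # |MET| of this event
            passed += 1
    return passed / n_events

def rayleigh_tail(threshold: float, sigma: float = 10.0) -> float:
    """Analytic expectation for the same toy model."""
    return math.exp(-threshold**2 / (2 * sigma**2))
```

In this toy, with sigma = 10 GeV, doubling the threshold from 20 GeV to 40 GeV drops the fake rate from exp(-2) ≈ 14% to exp(-8) ≈ 0.03% — which is why threshold choices end up being driven by rate, and why every GeV of threshold costs real sensitivity to signal events sitting just below it.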

Missing ET is a key variable for identifying new physics in Supersymmetry, says Teresa Fonseca Martin, who acts as the contact person between the Trigger and the SUSY group. “It's one of the most promising variables, which everybody looks at,” she explains.

Since missing ET is used to distinguish signal from background in these analyses, ideally the trigger would be something completely independent. In certain scenarios, though, missing ET proves irresistibly powerful as a trigger for rejecting uninteresting events. The challenge then is to make use of it whilst keeping the threshold as low as possible, thereby allowing through as much interesting data as possible.

“The most promising and ambitious trigger for zero lepton and missing ET SUSY searches seems to be the jet plus missing ET trigger,” says Teresa. “You want to go as low as possible in the ET of the jets you accept, so that you can probe a larger phase space [essentially the range of potential masses of undiscovered SUSY particles],” she explains, adding: “lower jet thresholds become accessible in the offline analysis thanks to the use of missing ET in the trigger.”

To make this possible, the missing ET trigger group are working on two new approaches to applying the trigger. The first – computationally simple but conceptually challenging – seeks to “use existing information, but in a new way which is hopefully smarter than what we do now, to improve the signal to background ratio,” says Diego. The second improvement is more fundamental, and involves coming up with new algorithms and changing the way the system runs. The trigger and physics groups are working to study both these solutions and are optimising the conventional missing ET selection to ensure that there is a robust trigger ready for next year.

Developing combined triggers, which ask for more than one physics object at once, is another way to screen out unwanted events, leaving more bandwidth for the interesting data. But it's a delicate operation, says Teresa: “You make a trade: the more specific the trigger, the lower ET you can go to. But then you have to be careful of not making any mistake, because you have to be sure afterwards that your trigger didn't introduce any bias that you weren't expecting or are not able to study later.”

This trigger-physics balancing act is being played out across the board right now, with physics groups studying how threshold cuts could affect their data and working out how to optimise the signal to background ratio in their analyses, and trigger groups trying to produce and refine the best possible triggers to serve them.

“Fixing a bug takes no less than a few days, because we have a long list of checks [that fixes must then go through]. Having new ideas and making them reality may take months. We started seriously working on new ideas one month ago,” says Diego.

“The trigger groups design good triggers, but they need to know what to focus on,” says Olya, who is appealing for input from the Collaboration to help design the best possible trigger menu before data taking begins again in the New Year. “We're a collaboration, so we have to make the choices that are best for everybody.”

“We very much welcome any relevant studies,” she adds. “Any new ideas on what trigger combinations could be useful are also very welcome. The trigger experts, and in particular the trigger liaison representatives in every physics group, are there to answer questions on what is realistic and how to look at triggers.”

A skeleton trigger menu for a luminosity of 10³³ cm⁻²s⁻¹, based on 2010's experience, has been put together. Work is ongoing to study and optimise all triggers before the start of next year's run, including e/gamma, muon, tau, jet, B-physics, and b-jet triggers. If you want to join the effort, a list of physics analyses which need help with trigger studies can be found here, and a list of trigger liaison representatives can be found here.


Ceri Perkins

ATLAS e-News