ATLAS e-News
Unleashing Athena
9 June 2008
[Photo: Tier 0, the CERN Computing Center]
Alongside the hardware development, teams have been writing, testing, and modifying software to turn the signals from the detectors into data that can be stored and analysed. One of these teams, headed by ATLAS Software Project Leader David Quarrie, continually prepares software releases so that when the first protons collide, the programs will capture and analyse the data as seamlessly as possible.
Their software spans multiple systems. A common framework, known as Athena, underlies the high-level trigger, calibration, and reconstruction systems. The trigger pares the data down to the 200 most interesting events per second, rejecting the vast majority of collisions. Calibration software examines well-understood interactions, such as the production of Z bosons, and checks that ATLAS is seeing them correctly.
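Athena is built on the Gaudi framework's component model: a trigger selection, a calibration check, and a reconstruction step are all written as algorithms with the same lifecycle hooks, driven by a shared event loop. The sketch below illustrates that pattern in plain Python; the class names and the selection criterion are hypothetical stand-ins, not actual ATLAS code.

```python
# Illustration of the algorithm/event-loop pattern used by frameworks
# like Athena. All names and cuts here are hypothetical stand-ins.

class Algorithm:
    """Common interface: the framework calls these hooks, never the reverse."""
    def initialize(self): ...
    def execute(self, event): ...   # return False to reject the event
    def finalize(self): ...

class TriggerFilter(Algorithm):
    def execute(self, event):
        # Keep only the most interesting events (stand-in criterion).
        return event.get("max_pt", 0.0) > 25.0

class Reconstruction(Algorithm):
    def execute(self, event):
        # Turn raw signals into tracks (trivial stand-in computation).
        event["tracks"] = [h for h in event.get("hits", []) if h > 0]
        return True

def event_loop(algorithms, events):
    for alg in algorithms:
        alg.initialize()
    accepted = []
    for event in events:
        # all() short-circuits: a rejected event skips later algorithms.
        if all(alg.execute(event) for alg in algorithms):
            accepted.append(event)
    for alg in algorithms:
        alg.finalize()
    return accepted

events = [{"max_pt": 40.0, "hits": [1, 0, 3]}, {"max_pt": 5.0, "hits": [2]}]
print(len(event_loop([TriggerFilter(), Reconstruction()], events)))  # -> 1
```

Because the trigger algorithm runs first and the loop short-circuits on rejection, the expensive reconstruction step only ever sees the events worth keeping.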
The reconstruction component turns detector signals into particle tracks and jets and makes preliminary particle identifications. Once the data is shipped out to the ten remote Tier 1 data centers, the same framework allows analysts to verify those identifications and deduce what happened in the collisions.
One of the challenges involved in developing this software is meeting the evolving needs of the collaboration. “It’s only at the end of writing it that you discover what you should have written,” says David. “It’s a continuous feedback process.” The 13th release ran in production for nine months before they discovered what David hopes is the last problem that adversely affected some physics samples.
For that reason, they are currently working on the 14th release of the software, making small modifications and testing them nightly. Over the next few months, they will focus on technical performance of the software, reducing the memory consumption and increasing the speed of the programs.
“We believe that the physics performance is good enough for first data at this point,” David attests. The team is now concentrating on keeping their programs from crashing. In times of trouble, it’s possible to shunt raw data out to the Tier 1 sites, skipping the reconstruction. However, they can’t afford to let anything fail at the data acquisition level.
The patch system allows the software experts to disable a certain part of the code without disrupting other areas. For instance, if trouble arose in processing data from one of the many detector subsystems, the processing for the others should continue unhindered while a fix was sought, tested and installed. Patched versions of the software are released every week or two.
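A hedged sketch of what such selective disabling can look like: each subsystem's processing sits behind a configuration flag and an error barrier, so a faulty piece can be switched off, or fail, without stopping the rest. The flag names and structure here are hypothetical illustrations, not the actual ATLAS patch mechanism.

```python
# Hypothetical sketch of per-subsystem switches and error isolation;
# not the actual ATLAS patch mechanism.

import logging

# A patch release might flip one of these flags while a fix is prepared.
enabled = {"pixel": True, "sct": True, "trt": False}  # "trt" patched off

def process_subsystem(name, data):
    logging.info("processing %s (%d channels)", name, len(data))
    if name == "sct":
        raise ValueError("simulated fault")  # stands in for a real bug
    return sum(data)

def process_event(raw):
    results = {}
    for name, data in raw.items():
        if not enabled.get(name, False):
            continue  # disabled by a patch; other subsystems unaffected
        try:
            results[name] = process_subsystem(name, data)
        except Exception as exc:
            # Error barrier: one subsystem's failure doesn't stop the rest.
            logging.error("%s failed: %s", name, exc)
    return results

logging.basicConfig(level=logging.INFO)
print(process_event({"pixel": [1, 2], "sct": [3], "trt": [4, 5]}))
# -> {'pixel': 3}: 'sct' failed and was skipped, 'trt' was patched off
```

The appeal of such an arrangement is that a patch only has to flip a switch, leaving the code of the working subsystems untouched.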
Version 14.1 handled data from last week’s full dress rehearsal. Over the two to three months before first beam, they will continue to make the software smaller and faster for the 14.2 release. A few weeks or months after 14.2 has taken the first data, 14.3 will incorporate improvements in calibration and in the algorithms themselves.
“I’d expect to keep a fairly high rate of releases over the first 12 months and then slow down,” says David. “Catch our breath and think about whether there are major things we think we need to restructure.”