-- BenjaminWynne - 20-Oct-2009

ATLAS Software

The majority of the actual work involved in any analysis is trying to understand the software, which is largely based on the Athena framework. I'll add things here as I find them out.

Getting Started

You'll need access to the ATLAS software, and some basic ideas about how to use it; the ATLAS computing workbook will help you with both. If you want to submit any jobs to the grid (you do!) then you should have a look at GANGA, which has a tutorial of its own. Knowing your way around ROOT will also be necessary at some point; I usually just dive straight into the ROOT documentation.

Annoying Terminology

Configuring any piece of ATLAS software is an unnecessarily complex process, and a lot of badly-defined terms are thrown around. The following represents the best of my understanding at present.

Job options file

This is an older method of setting up Athena, inherited from Gaudi (the framework on which Athena is based). If you have one, use it something like this:
> athena.py path/to/jobOptions.py
Note that athena.py should be available in your $PATH; if it isn't, you haven't set up your software correctly, and chances are nothing will work.
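For orientation: a job options file is itself a small Python script that configures Athena before the event loop runs. A minimal, hypothetical sketch is below — the exact property names vary by release, and it only makes sense when run via athena.py inside a configured Athena session, not as standalone Python:

```python
# Hypothetical minimal job options sketch -- only meaningful when
# launched with athena.py inside a set-up Athena release.
from AthenaCommon.AppMgr import theApp, ServiceMgr

theApp.EvtMax = 10                      # process at most 10 events
ServiceMgr.MessageSvc.OutputLevel = 3   # 3 corresponds to INFO-level logging
```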

Job transform script

There are many of these scripts available in your $PATH (if you've set the software up correctly). Many of them have names like csc_blahblahblah_trf.py. Each one has a different purpose; for example, csc_atlasG4_trf.py is for Geant4 simulation of the detector. Use them like this:
> csc_blahblahblah_trf.py a huge number of command line arguments
The command line arguments are usually supplied as key/value pairs rather than by order, e.g.
argumentName=argumentValue
Just trying to get them all in the right order would be ridiculous.
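The key/value convention is easy to mimic in plain Python. The sketch below is an illustration of the argument style only, not ATLAS code — the function name is my own:

```python
# Illustration only: parsing job-transform-style "key=value" arguments
# into a dictionary, so order never matters. Not ATLAS code.
import sys

def parse_transform_args(argv):
    """Split key=value strings into a dict, ignoring argument order."""
    args = {}
    for arg in argv:
        key, sep, value = arg.partition("=")
        if not sep:
            raise ValueError("expected key=value, got %r" % arg)
        args[key] = value
    return args

if __name__ == "__main__":
    # e.g. python parse_args.py maxEvents=5000 randomSeed=1212
    print(parse_transform_args(sys.argv[1:]))
```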

Job configuration fragment

These fragments are tiny Python scripts, patching some behaviour of job options files or transform scripts. A list of semi-official ones is available on the ATLAS TWiki. As an example, for minimum bias work you would usually want to use jobConfig.LucidOn.py, which switches on the high-pseudorapidity LUCID detector.

These fragments might be passed as an argument to a job transform script, eg:

> csc_blahblahblah_trf.py ... jobConfig=jobConfig.LucidOn.py
However, job options files sometimes contain include statements for these fragments as well, requiring that the fragments be placed in the same directory as the job options file. I recently encountered
include("input_data.py")
where input_data.py is a fragment simply listing the files containing the data to be read. The important point about these fragments is that they are far from standalone code: they assume the existence of many of the objects they act on.
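The mechanism can be sketched in a few lines: the fragment is executed in the caller's namespace, so it can read and modify objects that already exist there. In this sketch "svcMgr" is a hypothetical stand-in for a pre-existing Athena object, and the include() shown is my own toy version, not Athena's:

```python
# Sketch of an Athena-style include(): the fragment runs in the caller's
# namespace and acts on objects that already exist there. "svcMgr" is a
# hypothetical stand-in, not the real Athena service manager.
import os
import tempfile

def include(path, namespace):
    """Execute a fragment file in the given namespace."""
    with open(path) as fragment:
        exec(fragment.read(), namespace)

# Write a fragment in the spirit of input_data.py: it assumes svcMgr
# already exists and tells it which files to read.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write('svcMgr["InputCollections"] = ["file1.pool.root", "file2.pool.root"]\n')
    fragment_path = f.name

svcMgr = {}  # the pre-existing object the fragment acts on
include(fragment_path, {"svcMgr": svcMgr})
os.remove(fragment_path)

print(svcMgr["InputCollections"])  # prints ['file1.pool.root', 'file2.pool.root']
```

Run the fragment without first creating svcMgr and it fails with a NameError — which is exactly the behaviour you see when a real fragment is used outside its intended context.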

The Full Chain

There are (roughly speaking) four stages in the process of making useful output from an event generator: generation, simulation, digitisation and reconstruction. I'll give a few recipes here for the different stages.

Generation

Generation is the creation of the event itself: the interaction of the two protons, and the subsequent processes. The result is a collection of vectors describing particles leaving the interaction point.

The ATLAS computing workbook gives the following example of running Pythia using a job options file jobOptions.pythia.py (which has been created already, stored in an ATLAS software package, and is retrieved using get_files):

> get_files PDGTABLE.MeV
> get_files jobOptions.pythia.py
> athena.py jobOptions.pythia.py > athena_gen.out

There are also many appropriate job transform scripts, such as (credit: Tim Martin):

> csc_evgen900_trf.py runNumber=005003 firstEvent=0 jobConfig=CSC.005003.pythia_sdiff.py maxevents=5000 randomseed=1212 outputevgenfile=pythia.pool.root
I should point out that in this case runNumber must be as shown, and maxevents must be >= 5000. Note the use of a configuration fragment: CSC.005003.pythia_sdiff.py. The "sdiff" indicates that this fragment causes the generation of single-diffractive events.

When using GANGA for anything like this, you shouldn't leave the application field of the job as "Executable"; use "AthenaMC" instead. For example (credit: Tim Martin):

j=Job()
j.application=AthenaMC()                                 # Monte Carlo production application
j.application.evgen_job_option='csc.005005.pythia_ndiff.py'  # configuration fragment for the generator
j.application.production_name='private'                  # label for a private (non-official) production
j.application.process_name='pythia_ndiff'                # non-diffractive Pythia events
j.application.run_number='005005'
j.application.firstevent=1
j.application.partition_number=1
j.application.number_events_job=5000                     # events per job
j.application.atlas_release='15.3.0.1'                   # Athena release to use
j.application.mode='evgen'                               # event generation stage
j.application.transform_script='csc_evgen900_trf.py'     # the underlying job transform
j.backend=LCG()                                          # run on the LCG grid
j.outputdata=AthenaMCOutputDatasets()
j.submit()

Simulation

Simulation takes the particles created by the event generator, and models how they interact with the detector, storing these interactions in a "hits" file.

The ATLAS computing workbook this time gives a job transform example for simulation:

> csc_atlasG4_trf.py inputEvgenFile=pythia.pool.root outputHitsFile=g4hits.pool.root maxEvents=10 skipEvents=0 randomSeed=1234 geometryVersion=ATLAS-GEO-06-00-00 physicsList=QGSP_BERT jobConfig=NONE > athena_sim.out 2>&1
Note that simulation requires a lot of CPU time, and access to some large (~300MB) databases.

An example more directly useful to minimum bias (credit: Tim Martin) would be:

> csc_atlasG4_trf.py inputEvgenFile=EVGEN.pool.root outputHitsFile=HITS.pool.root maxEvents=50 skipEvents=0 randomSeed=1234 geometryVersion=ATLAS-GEO-08-00-01 physicsList=QGSP_BERT jobConfig=SimuJobTransforms/VertexPos900GeV.py DBRelease=DBRelease-7.3.2.tar.gz conditionsTag=NONE IgnoreConfigError=False AMITag=NONE

A different transform that covers both simulation and digitisation (credit: William Bell):

> csc_simul_trf.py -l WARNING inputEvgenFile=mc09.095001.pythia_minbias.evgen.EVNT.v15.5.X.Y._00020.pool.root outputHitsFile=mc09.095001.pythia_minbias.simul.HITS.v15.5.X.Y._00497.pool.root outputRDOFile=mc09.095001.pythia_minbias.digit.RDO.v15.5.X.Y_00497.pool.root maxEvents=200 skipEvents=4200 randomSeed=15483045 geometryVersion=ATLAS-GEO-08-00-01 digiSeedOffset1=2846750 digiSeedOffset2=97677 physicsList=QGSP_EMV jobConfig=VertexPos900GeVLowerTruth.py,jobConfig.LucidOn.py DBRelease=NONE triggerConfig=DEFAULT conditionsTag=NONE IgnoreConfigError=False AMITag=NONE

Digitisation

Digitisation is the modelling of the detector's response to a particle: given the interaction of the particle with the detector as described in the hits file, what will the data read out from the detector be? The result is an "RDO" file, which at this point is interchangeable with real data from the detector.

The ATLAS computing workbook again provides a job transform:

> csc_digi_trf.py inputHitsFile=g4hits.pool.root outputRDOFile=g4digi.pool.root maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEO-06-00-00 digiseedoffset1=11 digiseedoffset2=22  > athena_digi.out 2>&1
Note that digitisation uses two separate random seeds (don't ask me why). Also notice maxEvents=-1, indicating that all events in the input file should be processed.

The job transform csc_simul_trf.py also handles digitisation, as described above.

Reconstruction

Reconstruction is the attempt to determine the properties of the particles that hit the detector. The process is equivalent for real or simulated data, and can have a number of different output formats (such as ESD, AOD, D3PD, etc.), which have undergone varying amounts of processing from the raw data, and store different amounts of information.
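As a rough aide-memoire, the common formats can be ordered by level of detail. The one-line descriptions below are my own summary, not official definitions, and the exact contents vary by release:

```python
# My own rough summary of the common reconstruction output formats,
# from most to least detailed; exact contents vary by release.
formats = {
    "ESD": "Event Summary Data: detailed reconstruction output (tracks, clusters, ...)",
    "AOD": "Analysis Object Data: physics objects distilled from the ESD",
    "D3PD": "flat ROOT ntuple derived from the AOD, for direct analysis",
}
for name, description in formats.items():
    print("%-5s %s" % (name, description))
```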

The ATLAS computing workbook suggests a simple job options file, myRecOptions.py, for reconstruction to ESD (the file itself is provided by the workbook):

> athena.py myRecOptions.py > athena_rec.out

A more complex job options file for reconstruction can be retrieved in lxplus like this (credit: William Bell):

> tar xvfz /afs/cern.ch/user/w/wbell/public/pythia_ddiff/MinBiasD3PD-sandbox-00001.tar.gz RdoToMbAnalysisTFile_withTracklets.py

A job transform that can cover all aspects of reconstruction is Reco_trf.py (credit: Prafula Behera). I haven't actually got it to work yet; the following are just its possible command line arguments:

['inputBSFile', 'inputRDOFile', 'inputESDFile', 'inputAODFile', 'maxEvents', 'autoConfiguration', 'preInclude', 'postInclude', 'preExec', 'postExec', 'topOptions', 'DBRelease', 'conditionsTag', 'geometryVersion', 'RunNumber', 'beamType', 'AMITag', 'projectName', 'trigStream', 'outputTypes', 'prescales', 'outputBSFile', 'outputESDFile', 'outputCBNT', 'outputPixelCalibNtup', 'outputMuonCalibNtup', 'outputTAGComm', 'outputAODFile', 'triggerConfig', 'HIST', 'outputNTUP_MUFASTFile', 'outputNTUP_TRIGTAUFile', 'outputHIST_TRIGEXPERTFile', 'RAW_IDPROJCOMM', 'DPD_EGAMMA', 'DPD_PHOTONJET', 'DPD_SINGLEEL', 'DPD_MUON', 'DPD_SINGLEMU', 'DPD_CALOJET', 'DPD_TRACKING', 'DPD_LARGEMET', 'DPD_MINBIAS', 'ESD_PRESCALED', 'DPD_PIXELCOMM', 'DPD_SCTCOMM', 'DPD_IDCOMM', 'DPD_IDPROJCOMM', 'DPD_CALOCOMM', 'DPD_TILECOMM', 'DPD_EMCLUSTCOMM', 'DPD_EGAMTAUCOMM', 'DPD_RPCCOMM', 'DPD_TGCCOMM', 'DPD_RANDOMCOMM', 'DPD_THINEL', '--ignoreunknown', '--athenaopts']
Topic revision: r5 - 2010-10-12 - JohannesElmsheuser
 