This TWiki is still in the "guinea pig" stage...email me if you run into problems! (emily.thompson@cern.ch)

Intro to TAG files

See talks:
http://indico.cern.ch/conferenceDisplay.py?confId=73119 (Validation of TAG_COMM and TAG output for MCP event selection)
http://indico.cern.ch/conferenceDisplay.py?confId=75756 (First Data Scanning)

Two main types: TAG and TAG_COMM

TAG

  • Built with PhysicsAnalysis/MuonID/MuonTagTools/src/CombinedMuonTagTool.cxx
  • Looks at merged containers in the AOD
  • CombMuonWord contains information on type of muon stored in bitword (SA, extrapolated, combined, tagged, calo). No algorithm information


TAG_COMM

  • Built with Commission/CommissionRec/src/ComTagWriter.cxx
  • Muons: information on SA only (MooreTracks, ConvertedMBoyTracks); number of SA tracks, and LeadingMuon1 and 2 for the 5 track parameters
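The CombMuonWord bitword can be unpacked with simple bit tests. A minimal Python sketch; note that the bit positions used here are placeholders for illustration only, the real layout is defined in CombinedMuonTagTool.cxx and decoded by the CombinedMuonWord class attached below:

```python
# Illustrative decoder for a muon-type bitword. The actual bit layout is
# defined by CombinedMuonTagTool / the CombinedMuonWord helper class;
# these positions are placeholders for the sketch.
MUON_TYPE_BITS = {
    "standalone":   0,
    "extrapolated": 1,
    "combined":     2,
    "tagged":       3,
    "calo":         4,
}

def decode_muon_word(word):
    """Return the set of muon types flagged in the bitword."""
    return {name for name, bit in MUON_TYPE_BITS.items() if word & (1 << bit)}
```

With this (illustrative) layout, a word with bits 0 and 2 set decodes to a standalone plus a combined muon.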

Get TAG_COMM (or TAG) data files


  • On lxplus (or afs), set up release:
     source setup.sh -tag=,AtlasTier0,slc4,gcc34,32,opt,runtime

  • In another terminal, set up Grid, dq2 and create proxy:

     source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh
     source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
     voms-proxy-init -voms atlas -valid 96:00

  • Look for the TAG or TAG_COMM files on AMI, or directly search on castor. Location on castor for 900 GeV runs:
      nsls /castor/cern.ch/grid/atlas/tzero/prod1/perm/data09_900GeV/<stream>/<run number>/

  • For example, take stream MuonswBeam and run 141270

         > nsls /castor/cern.ch/grid/atlas/tzero/prod1/perm/data09_900GeV/physics_MuonswBeam/0141270

  • Look out for TAG (or TAG_COMM) files and ESDs
    • reco tags: f180 = AtlasTier0, f181 = AtlasTier0, etc.
  • It turns out to be good practice to choose a tag matching your release.

  • Copy them from castor or get them with
     dq2-get data09_900GeV.00140541.physics_MinBias.merge.TAG.f175_m273

  • Once you have the TAG files, you must rename them all with .root at the end so that the macro can read them. Here is a simple script you can use to do that:

    if [ $# -ne 1 ]; then
      echo "Usage: ./give_root_name.sh [directory]."
      echo "Renames files in [directory] to have .root at the end."
      exit 1
    fi

    directory=$1

    for file in `ls $directory`; do
      mv $directory/$file $directory/$file.root
    done

Local TAG_COMM (or TAG) analysis

For the instructions below, I've been looking at the MinBias stream in run 140541 (the run with our now-famous first collision!)

  • Get and untar attachments (at the bottom of this TWiki): TAGanalysis.tgz or TAG_COMManalysis.tgz
  • Contains everything you need to do TAG or TAG_COMM analysis
  • Edit runTAG.C (or runTAG_COMM.C) for input TAG (or TAG_COMM) files
  • TAGanalysis.C (or TAG_COMManalysis.C) is just a root macro built from MakeSelector
    • Contains the main selection code
    • As of now, there isn't much selection at all, but you can add it to the code as you like. TAG variables are in TAGanalysis.h (or TAG_COMManalysis.h)
  • CombinedMuonWord is a class needed for TAGanalysis.C to decode the CombMuonWord in the TAG files.

Select events from TAG_COMM (or TAG)

To run:

  > root -l
  [0] .x runTAG.C

  • Output: a file named EventsTAG.txt (or EventsTAG_COMM.txt) and events.txt (this "events.txt" may be named something else from the code in TAGanalysis.C or TAG_COMManalysis.C)
  • EventsTAG.txt (or EventsTAG_COMM.txt) has LumiBlock information, which is useful when copying ESDs (so that we don't have to copy all of them, just the ones with lumiblocks containing the events we find interesting).
    • For example, if one found interesting events in lumi blocks 154 and 296, one could dq2-get -f [file name with lb0154],[file name with lb0296] data09_900GeV.00140541.physics_MinBias.recon.ESD.f175
    • Again, if you want to know the names of the data sets, I would just look on castor using rfdir or nsls (using the full directory structure starting with /castor).
  • events.txt is just a text file with event numbers only, used for the filter in the next section.
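Since events.txt contains only event numbers, one per line, reading it back (as the filter in the next section has to) takes only a few lines. A sketch, assuming the one-number-per-line format described above:

```python
def read_event_list(path):
    """Read a text file with one event number per line into a set,
    skipping blank lines."""
    events = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                events.add(int(line))
    return events

# An event filter would then keep an event only if its number is in the set:
# keep = event_number in read_event_list("events.txt")
```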

Skimming ESDs with selected events from TAG (or TAG_COMM)

  • dq2-get or rfcp the ESDs of interest
  • Get and untar attachment: ESDanalysis.tgz
    • ESDAnalysis_combined.py is the top options, here you can turn VP1 on or off, etc...
    • readesd_combined.py is the event filter, and is read by ESDAnalysis_combined.py
    • Edit run140541_MinBias.py to add the ESDs you've copied locally. This file is read in by readesd_combined.py.
    • Copy events.txt to this directory, which is a list of events from your selection from the previous section. This is also read in by readesd_combined.py

To run:

  > source setup.sh -tag=,AtlasTier0,slc4,gcc34,32,opt,runtime
  > athena ESDAnalysis_combined.py

At this point, a "skimmed" ESD file will be created, and if VP1 is switched on, then only events in events.txt will be displayed.

Runs looked at using TAGanalysis.C

Here, using CombMuonWord, we count how many events in the run have:

MU: at least one muon seen by any algorithm
MU2: more than one muon seen by any algorithm
SA: at least one isStandAloneMuon()
SA2: more than one isStandAloneMuon()
EXTR: at least one hasMuonExtrapolatedTrackParticle() || hasInnerExtrapolatedTrackParticle()
EXTR2: more than one hasMuonExtrapolatedTrackParticle() || hasInnerExtrapolatedTrackParticle()
COMB: at least one isCombinedMuon()
COMB2: more than one isCombinedMuon()
TAG: at least one isLowPtReconstructedMuon()
TAG2: more than one isLowPtReconstructedMuon()
CALO: at least one isCaloMuonId()
CALO2: more than one isCaloMuonId()

Run Stream Total MU MU2 SA SA2 EXTR EXTR2 COMB COMB2 TAG TAG2 CALO CALO2
141749 MinBias 19317 368 91 351 91 352 91 1 0 3 0 14 0
141707 MinBias 5433 104 36 104 36 104 36 0 0 0 0 0 0
141534 MinBias 11421 161 58 155 57 155 57 0 0 7 0 0 0
141226 MinBias 70212 739 268 473 188 473 188 0 0 314 68 6 0
140571 MinBias 24852 376 115 306 96 306 96 0 0 79 6 8 0
140541 MinBias 14290 303 58 193 51 193 51 0 0 48 0 69 3
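The counting behind the table can be sketched as follows. This is an illustration of the "at least one" / "more than one" logic only, not the actual TAGanalysis.C loop; each event is represented as a list of muons, and a per-muon predicate stands in for the CombMuonWord test:

```python
def count_categories(events, predicate):
    """Given a list of events (each a list of muons) and a per-muon
    predicate, return (n_at_least_one, n_more_than_one) over all events."""
    at_least_one = more_than_one = 0
    for muons in events:
        n = sum(1 for m in muons if predicate(m))
        if n >= 1:
            at_least_one += 1
        if n >= 2:
            more_than_one += 1
    return at_least_one, more_than_one
```

For example, over events [["SA"], ["SA", "SA"], [], ["CB"]] the predicate m == "SA" gives (2, 1): two events with at least one SA muon, one event with more than one.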

Using TAG_COMM to select events directly from ESD

The following is a working recipe, validated with AtlasTier0 release.

  • look for available files on Castor:

    nsls /castor/cern.ch/grid/atlas/tzero/prod1/perm/data09_900GeV/<stream>/<run_number>/

  • create PoolFileCatalog for files on castor; replacing string "srm://srm-atlas.cern.ch" with "rfio:"
    dq2-ls -L CERN-PROD_TZERO -P -R "srm://srm-atlas.cern.ch^rfio:" data09_900GeV.00141270.physics_MuonswBeam.recon.ESD.f180

  • get python

    get_files -jo readesdusingtag.py

  • edit/add some lines to your likings, for example:
    • athenaCommonFlags.FilesInput=["rfio:/castor/cern.ch/grid/atlas/tzero/prod1/perm/data09_900GeV/physics_MuonswBeam/0141270/data09_900GeV.00141270.physics_MuonswBeam.merge.TAG_COMM.f180_m287/data09_900GeV.00141270.physics_MuonswBeam.merge.TAG_COMM.f180_m287._0001.1"]
    • rec.doVP1.set_Value_and_Lock(True)
    • rec.doWriteESD.set_Value_and_Lock(True)
    • athenaCommonFlags.PoolInputQuery.set_Value_and_Lock("(MooreSegments>0||ConvertedMBoySegments>0)&&(PixelTracks>0||SCTTracks>0||TRTTracks>0)")
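The PoolInputQuery above is just a boolean cut on TAG_COMM variables. Evaluated offline on one entry (represented here as a dict of variable name to value), it amounts to this sketch:

```python
def passes_input_query(tag):
    """Mirror of the PoolInputQuery above, evaluated on one TAG_COMM
    entry given as a dict of variable name -> value."""
    has_segment = tag["MooreSegments"] > 0 or tag["ConvertedMBoySegments"] > 0
    has_id_track = (tag["PixelTracks"] > 0 or tag["SCTTracks"] > 0
                    or tag["TRTTracks"] > 0)
    return has_segment and has_id_track
```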

  • Before running, you must make sure STAGE_SVCCLASS is set correctly. For the mentioned files do (in zsh):

    export STAGE_SVCCLASS=atlcal

  • Run
    athena readesdusingtag.py >! logfile.log &

  • I had some issues with the following error popping up on the first try:
    Py:inputFilePeeker    INFO Executing   inputFilePeeker.py
    Py:AthFile           INFO opening [rfio:/castor/cern.ch/grid/atlas/tzero/prod1/perm/data09_900GeV/physics_MuonswBeam/0141270/data09_900GeV.00141270.physics_MuonswBeam.merge.TAG_COMM.f181_m292/data09_900GeV.00141270.physics_MuonswBeam.merge.TAG_COMM.f181_m292._0001.1]...
    Py:inputFilePeeker WARNING Unable to open file rfio:/castor/cern.ch/grid/atlas/tzero/prod1/perm/data09_900GeV/physics_MuonswBeam/0141270/data09_900GeV.00141270.physics_MuonswBeam.merge.TAG_COMM.f181_m292/data09_900GeV.00141270.physics_MuonswBeam.merge.TAG_COMM.f181_m292._0001.1
    Py:inputFilePeeker   ERROR Unable to build inputFileSummary from any of the specified input files. There is probably a problem.
    Py:AutoConfiguration   ERROR No RunNumber stored in InputFile!
    Py:AutoConfiguration   ERROR No LumiBlock number stored in InputFile! Use 0
    Shortened traceback (most recent user call last):
I don't know why, but this is 'solved' by setting up your release again... Happy running!


In the 'normal' readesdusingtag.py file, reconstruction is run by default. For your purposes this may be too slow, so we have created a version which does not run reconstruction and therefore 'skims' the original ESDs, and in addition creates a CalibrationNtuple on the fly. For this python see:
   cp /afs/cern.ch/user/e/egge/public/readesdusingtagCalib.py .
   cp /afs/cern.ch/user/e/egge/public/MuonCalibConfig.py .

Using TAG_COMM to select events from ByteStream

Based on the experience for ESD described above, a similar approach is used for BS selection. Following the example stream and run from above we can use the RAW files from castor as well.
  • set up release, grid, dq2, STAGE_SVCCLASS as above
  • create new dir
    mkdir BS
    cd BS
  • choose TAG_COMM and RAW files to use, for this example with same stream and run as above:
  • create PoolFileCatalog for files on castor; replacing string "srm://srm-atlas.cern.ch" with "rfio:"
    dq2-ls -L CERN-PROD_TZERO -P -R "srm://srm-atlas.cern.ch^rfio:" data09_900GeV.00141270.physics_MuonswBeam.merge.RAW
  • create a file readbsusingtag.py with content (modify to your likings...):
    from AthenaCommon.AthenaCommonFlags import athenaCommonFlags

    from RecExConfig.RecFlags import rec
    include ("RecExCommon/RecExCommon_topOptions.py")

    from AthenaCommon.ConfigurationShelve import saveToAscii

  • Run:
    athena readbsusingtag.py >! logfileBS.log &

To Do

  • Need to set this up so that we can run the job options on the grid without having to copy ESDs locally
  • Can we write a "skimmed" ESD directly from TAG selection without doing the two-step process? Then the whole thing could be on the grid... Should be possible with information on https://twiki.cern.ch/twiki/bin/view/Atlas/CommissioningTag.

For Preema:

We'll do the skims in a series of steps (of course, a lot of this will be automated in the future, but in the meantime, what better automation is there than a horde of grad students and post docs working around the clock in different time zones?)

Steps Overview:

  1. Set up your area on hal
  2. Check out TAG and TAG_COMM files for the run of interest.
  3. Create the TAG_MCP using the TAGMerger package
  4. Create events list from TAG_MCP for both "tight" and "loose" selection using TAG_MCPanalysis.C
  5. Create skimmed ESDs (dESD_MCP) by running ESDanalysis job options with athena

1. Setting up on hal

Unfortunately I don't have the code checked in to SVN yet, so we'll have to make do with getting the newest code from me in a tarball. If you've already done some of the steps above under "Local TAG_COMM (or TAG) analysis", I would suggest getting the latest versions of the code (i.e. IM or email me to make sure I've put the newest ones up on this TWiki).

There is a new bit of code you have to set up now too, which makes the merged "TAG_MCP" ntuple. You'll need to create the package yourself, but here are the steps to do it:

Before you begin, download the following file (attached to this wiki):

  • requirements

In your working directory:

> mkdir generic
> cd generic

Copy the requirements file to this directory, and then:

> source /afs/cern.ch/sw/contrib/CMT/v1r20p20090520/mgr/setup.sh
> cmt config

The above only needs to be done once. However, every time a new terminal window is opened:

> source setup.sh -tag=,AtlasTier0,slc4,gcc34,32,opt,runtime

(warning: the above steps will be a little slow because of copying all the way from CERN...)

Check out the ARA skeleton package:

> cmt co -r AthenaROOTAccess-00-05-58 PhysicsAnalysis/AthenaROOTAccess

Now create the new package, called TAGMerger (this is what will eventually be on SVN, but for now you have to put the package together yourself with all the pieces attached to this TWiki)

> cmt create TAGMerger TAGMerger-00-00-01 PhysicsAnalysis
> cd PhysicsAnalysis/TAGMerger/
> rm -r src

Copy and untar TAGMerger_h.tgz and src.tgz here:

> tar xvzf TAGMerger_h.tgz
> tar xvzf TAGMerger_src.tgz

> cd ../cmt/

  • copy requirements2 to this directory (there is already a requirements file there, but you will overwrite it with this one in the next step)

> mv requirements2 requirements

(the file had to be renamed so that two versions could be attached to this wiki)

> cmt config
> make

Now you should have the package working.

After every edit to the source file, you need to recompile in the cmt directory (this should not be necessary for now). In any case, to do this faster, type

> make QUICK=1

2. Get TAG and TAG_COMM files

see Get_TAG_COMM_or_TAG_data_files

3. Create the TAG_MCP

Here you will use the newly created TAGMerger package.

Download the attachment:

  • forest.C

Edit this file to add the TAG and TAG_COMM root files you have gotten in step 2. The "CollectionMetadata" tree is not too important at the moment (it isn't working the way it's needed, but for now, just put one TAG root file in the path for "meta_tree")

Now to create the new tree, do:

> root
[0] .x forest.C

4. Create events list for "tight" and "loose" selection

This is like Select_events_from_TAG_COMM_or_TAG, but with new code which merges both for TAG_MCP.

Download and untar the attachment:

  • TAG_MCPanalysis.tgz

Edit runTAG_MCP.C to add the path of your newly produced TAG_MCP ntuple.

Edit TAG_MCPanalysis.C for "tight" and "loose" selection. At this point, be sure to find out exactly what BCIDs are needed, as well as lumi blocks. Contact Domizia if you need help, but this was the info for the following runs:

142174 all LB are good
142191 LB 1-43, 131-247
142193 LB 25-162
all with BCID good crossings: 1, 101, 2774

The "loose" selection consists of only:

  if( !(PixelTracks > 2 || SCTTracks > 2 || TRTTracks > 2) ) failedSelection = true;
  if( !(NsctSPs > 0 || NpixSPs > 0) ) failedSelection = true;

The "tight" selection has the other cuts as well:

  if( RunNumber == [run number] && (LumiBlockN < [low LB bound] || LumiBlockN > [upper LB bound]) ) failedSelection = true;
  if(!(BCID==1 || BCID==889 || BCID==2674)) failedSelection = true;

  if( !(ConvertedMBoySegments > 0 || MooreSegments > 0 || CombinedMuonWord > 0 ) ) failedSelection = true;
  if( !(fabs(LArECtimeDiff)<10 && fabs(MBTStimeDiff)<10) ) failedSelection = true;
  if( !(1/fabs(TrackLead1_qoverp) > 2 ) ) failedSelection = true;
  if( !(PixelTracks > 2 || SCTTracks > 2 || TRTTracks > 2) ) failedSelection = true;
  if( !(NsctSPs > 0 || NpixSPs > 0) ) failedSelection = true;
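The loose and tight cuts above translate directly into per-entry filters. A Python sketch using the same TAG variable names (the 1/|TrackLead1_qoverp| > 2 cut selects a leading track with momentum above 2 GeV):

```python
def passes_loose(t):
    """Loose selection from the cuts above (t: dict of TAG variables)."""
    if not (t["PixelTracks"] > 2 or t["SCTTracks"] > 2 or t["TRTTracks"] > 2):
        return False
    if not (t["NsctSPs"] > 0 or t["NpixSPs"] > 0):
        return False
    return True

def passes_tight(t, lb_low, lb_high, good_bcids=(1, 889, 2674)):
    """Tight selection: loose cuts plus lumi-block range, BCID,
    muon-segment, timing, and leading-track momentum cuts."""
    if not (lb_low <= t["LumiBlockN"] <= lb_high):
        return False
    if t["BCID"] not in good_bcids:
        return False
    if not (t["ConvertedMBoySegments"] > 0 or t["MooreSegments"] > 0
            or t["CombinedMuonWord"] > 0):
        return False
    if not (abs(t["LArECtimeDiff"]) < 10 and abs(t["MBTStimeDiff"]) < 10):
        return False
    if not (1.0 / abs(t["TrackLead1_qoverp"]) > 2):
        return False
    return passes_loose(t)
```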

5. Create skimmed ESDs

see Skimming_ESDs_with_selected_events

-- EmilyThompson - 30-Nov-2009

Topic attachments
Attachment Size Date Who Comment
ESDanalysis.tgz 6.7 K 2009-12-14 EmilyThompson Look at selected events in VP1 and write out a "skimmed" ESD
TAGMerger_h.tgz 14.8 K 2009-12-13 EmilyThompson
TAGMerger_src.tgz 4.0 K 2009-12-13 EmilyThompson
TAG_COMManalysis.tgz 5.4 K 2009-12-08 EmilyThompson Macros for selecting on TAG_COMM events
TAG_MCPanalysis.tgz 9.6 K 2009-12-14 EmilyThompson
TAGanalysis.tgz 7.6 K 2009-12-08 EmilyThompson Macros for selecting on TAG events
forest.C 1.2 K 2009-12-14 EmilyThompson
requirements 0.5 K 2009-12-13 EmilyThompson
requirements2 0.5 K 2009-12-14 EmilyThompson
Topic revision: r16 - 2009-12-14 - EmilyThompson