Useful coding references

  • Reference for cmake commands with release 21: CMTCMakeRosettaStone
  • See the ATLAS software Git tutorial for an introduction to GitLab
  • Useful git commands:
            git pull
            git checkout -b dev-my-rpc
            git push --set-upstream origin dev-my-rpc
            git commit FILE_NAME -m 'write here informative commit message'
            git push

  • Note: On the USTC cluster, to use "git clone" with https, first run:
            kinit -f -r7d -A username@CERN.CH

PhysicsRPCProd production package

  • Please see the README file in the git repository for instructions on compiling and running the production code
  • ustc/PhysicsRPCProd package contains:
    • ReadMuon algorithm to read and save reconstructed muon information
    • ReadL1Muon algorithm to process reconstructed muons and to record RPC data/geometry
      • L1MuonExtrapolator tool to extrapolate muons to the RPC system and to record intersections with active RPC detector elements (gas gaps)
      • L1MuonGeometry tool to record RPC geometry, including global positions of RPC gas gaps and coordinates of RPC strips
    • Python code to configure and run athena job to read DESD data

  • Please do not forget to check the local version against the GitLab master branch and, if necessary, update the local repository:
     $ cd $TestArea/PhysicsRPC/PhysicsRPCProd
     $ git status
     $ git pull

  • How to download DESD datasets:
      $ rucio list-dids data18_13TeV.*DESDM_MCP.*
      $ rucio download --nrandom=10 data18_13TeV.00358615.physics_Main.merge.DESDM_MCP.f961_m2024

  • Run athena command to make ntuples using local input files:
     $ athena $TestArea/PhysicsRPC/PhysicsRPCProd/share/ -c "inputDir='data18_13TeV.00358615.physics_Main.merge.DESDM_MCP.f961_m2024';dumpSG=False;EvtMax=1000" &> log &

  • Run athena command to make ntuples using EOS:
     $ athena $TestArea/PhysicsRPC/PhysicsRPCProd/share/ -c "inputDir='/eos/atlas/atlascerngroupdisk/det-rpc/data/DESDM_MCP/data17_13TeV.00327265.physics_Main.merge.DESDM_MCP.f832_m1816';dumpSG=False;EvtMax=1000" &> log &

  • Run grid production for one data run (replace "rustem" with your grid user name):
     $ cd $TestArea/PhysicsRPC/PhysicsRPCProd
     $ lsetup panda
     $ pathena share/ --inDS=data17_13TeV.00331875.physics_Main.merge.DESDM_MCP.f848_m1848 --outDS=user.rustem.data17_13TeV.DESDM_MCP.f848_m1848.prod_v01

RPC athena geometry code

  • PhysicsRPCProd contains the athena tool for accessing RPC geometry and readout information
  • This tool relies on the following ATLAS athena classes:
  • MuonDetectorManager.h - manages geometry for all muon detectors
    • Provides list of RpcDetectorElements and RpcReadoutElements
    • RpcDetectorElement.h - describes single RPC doublet
    • RpcReadoutElement.h - describes single RPC module which contains list of gas gaps (panels)
    • RpcReadoutElement is the main class to access RPC geometry for the physical detector
  • RpcIdHelper.h translates an RPC channel identifier into human-readable information
    • An ATLAS channel identifier is a 64-bit unsigned integer defined in Identifier.h
    • This tool provides functions to extract bit-wise information from the identifier
    • This tool does not know anything beyond how to translate the 64-bit identifier into station name, station phi, etc.
  • MuonIdHelperTool.h translates Muon detector channel identifiers into human-readable information
    • This tool is less useful for RPC than RpcIdHelper above
  • The following functions of L1MuonGeometry.cxx read the geometry information of the entire RPC detector
    • L1MuonGeometry::readRpcGeometry() - iterates over all configured RpcReadoutElements to access all gas gaps
    • L1MuonGeometry::processGasGap() - processes geometry information for one gas gap
    • L1MuonGeometry::checkSurfaces() - performs additional checks of the RPC geometry to verify that readRpcGeometry() works correctly
  • Code that associates clusters (RPC or TGC) to segments made from MDT hits: MuonSegmentMakerTools/DCMathSegmentMaker
  • This is the reference geometry file for analysis:
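The bit-field translation that RpcIdHelper performs can be illustrated with a short Python sketch. The field names come from the description above, but the field widths and bit layout below are made up for illustration only; the real ATLAS identifier layout is defined in Identifier.h and the helper classes.

```python
# Illustrative sketch of what an identifier helper does: unpack named bit
# fields from a 64-bit unsigned integer. The layout here is HYPOTHETICAL,
# not the actual ATLAS Identifier bit layout.

def unpack_fields(identifier, layout):
    """Extract named bit fields from an identifier.

    layout: list of (name, width) pairs, most significant field first.
    """
    total = sum(width for _, width in layout)
    fields = {}
    shift = total
    for name, width in layout:
        shift -= width
        fields[name] = (identifier >> shift) & ((1 << width) - 1)
    return fields

# Hypothetical layout: 8-bit station name, 6-bit station eta, 6-bit station phi
LAYOUT = [("stationName", 8), ("stationEta", 6), ("stationPhi", 6)]

packed = (2 << 12) | (5 << 6) | 3   # stationName=2, stationEta=5, stationPhi=3
decoded = unpack_fields(packed, LAYOUT)
```

The real helper works the same way conceptually: it only knows how to map bits of the identifier to labels such as station name and station phi, nothing more.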

PhysicsAnpRPC analysis package

  • PhysicsAnpRPC reads ntuples and produces analysis plots
  • Please see the README in the git repository for instructions on compiling and running this package: PhysicsAnpRPC
  • Run a command to print ntuple content:
     $ python $TestArea/PhysicsAnpRPC/macros/ out.root -n 10 --print-reco-event &> log &

  • Run an analysis example to make plots:
     $ python $TestArea/PhysicsAnpRPC/macros/ out.root -n 1000 -o rpc.root &> log &

Analysis workflow with PhysicsAnpRPC package

  • This package reads ntuples that contain:
    • RPC geometry information stored in m_rpc_geo_ branches
    • RPC detector hits stored in m_rpc_hit_ branches
    • Reconstructed muon information stored in m_muon_ branches
      • Extrapolated muon coordinates on matching RPC gas gaps stored in m_muon_RpcExtrapolate_ branches
      • RPC detector hits used to construct the muon track stored in m_muon_RpcHitOnTrackMS_ branches
    • This ntuple information is read by the ReadNtuple algorithm and then copied into RecoEvent.h
  • These classes process geometry information for analysis:
    • PrepRpcGeo reads geometry information and copies it into the following structures:
    • RpcGeoStrip - local RPC strip coordinates with respect to the coordinate system that contains this strip
    • RpcGeoGap - global coordinates of the RPC gas gap and the list of strips within this gap
    • This geometry information is read only once, on the first event, and then stored in RpcGeoMan - the RPC geometry manager
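A minimal Python sketch of the structures above may make the layout clearer. The class names come from the text, but the field names are assumptions (the real PrepRpcGeo/RpcGeoMan classes are C++); the point is the read-once caching pattern used for the geometry.

```python
# Sketch of the geometry structures described above; field names are
# ASSUMPTIONS for illustration, not the real C++ class layout.

from dataclasses import dataclass, field

@dataclass
class RpcGeoStrip:
    strip_id: int
    local_y: float              # local strip coordinate (assumed field)

@dataclass
class RpcGeoGap:
    gap_id: int
    global_pos: tuple           # (x, y, z) of the gas-gap centre (assumed)
    strips: list = field(default_factory=list)

class RpcGeoMan:
    """Geometry manager: filled once on the first event, then reused."""

    def __init__(self):
        self._gaps = {}
        self._loaded = False

    def load(self, gaps):
        if self._loaded:        # geometry is read only once
            return
        for gap in gaps:
            self._gaps[gap.gap_id] = gap
        self._loaded = True

    def gap(self, gap_id):
        return self._gaps[gap_id]

# Usage: fill the manager on the "first event", later calls are no-ops
geo = RpcGeoMan()
geo.load([RpcGeoGap(gap_id=1, global_pos=(1.0, 2.0, 3.0),
                    strips=[RpcGeoStrip(strip_id=0, local_y=0.5)])])
geo.load([])                    # second load is ignored
```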

RPC code references

BME analysis code - this code is now obsolete

  • PhysicsAnpRPC package contains code to read ntuples containing BME hits from secondary readout
    • - main python macro (no changes should be necessary in this code)
    • - module to configure ntuple reading algorithm and plotting algorithms
    • BmeHit.h - data class to store BME hit information (that was read from ntuples)
    • PrepBmeHit.cxx - algorithm to fill BmeHit objects
  • Commands to read ntuples with secondary BME hits
   $ cd $TestArea/PhysicsNtuple/PhysicsAnpRPC
   $ python macros/ /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_v18.01.20/data17_13TeV_00340368_ntuple/job_0020_out.root -o rpc.root --bme-event-map=/eos/atlas/atlascerngroupdisk/det-rpc/bme/secondary_ntuples_v17.12/RPC340368_map.txt --bme-geo=data/map_BME_athena_extended.txt -n 10000 --do-bme

AnpBatch package for managing batch jobs

  • AnpBatch contains scripts that help with managing batch jobs at CERN and USTC
  • This package is still under development
  • It is not necessary to set up an ATLAS release to use this package
  • Commands to check out this package:
   $ cd ~/testarea/
   $ git clone

This package takes as input a text file containing EOS or AFS paths to the input files

  • This input text file can be generated with a macro for grid datasets stored at CERN
  • Please first setup rucio clients using instructions here: RucioClientsHowTo
  • Then run these commands:
   $ cd testarea/AnpBatch/
   $ python macros/ data17_13TeV.*331875*DESDM_MCP* --rse=SCRATCH.* --protocol=root -o DESDM_MCP_run331875.txt

  • The input file has the following structure: file_path file_size
    • file_size is optional and can be absent
    • Example input file:
   $ cat /afs/
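A short Python sketch, using hypothetical file paths, shows how such a two-column list with an optional size column can be parsed:

```python
# Minimal sketch of parsing the batch input list: one "file_path file_size"
# pair per line, where file_size is optional. Paths below are HYPOTHETICAL.

def parse_file_list(text):
    """Return a list of (path, size-or-None) tuples, skipping blank lines."""
    entries = []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        path = parts[0]
        size = int(parts[1]) if len(parts) > 1 else None
        entries.append((path, size))
    return entries

example = """\
/eos/atlas/somedir/file1.root 1234567
/eos/atlas/somedir/file2.root
"""
entries = parse_file_list(example)
```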

Running batch jobs at CERN

  • CERN uses LSF software for managing batch queues
  • There are two main macros for managing jobs:
    • prepares shell scripts for individual jobs and submits them to LXBATCH
    • copies the job's input files to local disk on the worker node and copies back the output ROOT files
  • This command runs simple commands for debugging purposes:
   $ source $HOME/testarea/AnpBatch/example/
   $ source $HOME/work/batch/job-config/

  • This command will run athena code to produce ntuples for RPC studies:
   $ source $HOME/testarea/AnpBatch/example/

  • Outstanding issues:
    • Currently acmSetup fails on batch worker nodes - need to check with Will Buttinger for advice

TODO for RPC Panel and Strip Efficiency - this needs updating

  • High priority tasks:
    • Make overlaid panel efficiency 1d projections for 2017 runs (at least 5 combined) and 2018 runs (at least 5 combined)
    • Figure out why muons never go into some of the phi strips in BMS
    • Make efficiency-comparison difference 1d plots for each sub-detector
    • Write the note for my qualification project -- ongoing
    • Remake hit residual plots using muons with pT > 20 GeV, and also separately for positive/negative muons in positive/negative eta hemispheres
    • Move the combined eta-phi string to the z-title
    • Plans for 2018
  • Low priority tasks:
    • Compare the extrapolated position between ID and MS, muon by muon
    • Analyse unbiased muons (triggered by b-jet trigger, MET trigger, ...) and make comparisons between biased muons (triggered by the RPC trigger) and unbiased muons
    • Analyse muons passing the 20 GeV threshold; extrapolate from the Inner Detector and make a separate website to present the different analyses
    • Efficiency vs DCS status, HV, threshold, etc.
    • ZTP candidates
    • Put the efficiency map into COOL
    • 2d panel efficiency map with bin size related to the real panel size
    • Make bins in the efficiency map clickable
    • Give the list of inefficient panels obtained from ANP to Giulio (to validate the framework directly)
    • Obtain the extrapolation sigma
    • Look at the cuts to find the reason for the ~0.02 difference (0.02 due to noise is a little too large)
    • Improve my macro - write a new module
    • panel_ids multiply after hadding the ROOT files
  • Finished tasks:
    • 2018-03-29: extrapolate the muon track from the Inner Detector track to suppress the bias in the phi direction -- Rustem
    • 2018-03-15: 2d efficiency map of strip squares for each panel
    • Plots for each sub-detector (BML, BOS, BOG) and sub-sub-detector (BML1, BMS2, etc.)
    • List of eff_anp and eff_ref value comparisons
    • Make a list of problematic panels (with large residuals)
    • Make strip efficiency map
    • Plot out-of-time fraction versus strip number
    • Plot strip efficiency versus strip number
    • Hit residual per hit
    • Get rid of the warnings when running the ANP
    • Make InTime hit residual plots separately for muons with cluster size 1 and panels with cluster size 2 -- Rustem/Heng
    • Check the code for errors when comparing Anp efficiency using all hits with the reference efficiency
    • Finish the RPC hit cluster class
    • Make a ROOT TTree/text file for each run with panel efficiency, error, DCS status, mean and RMS of the geometric residual, mean time, and out-of-time fraction
    • Make a list of problematic panels (with large residuals) and check whether there are cable errors in these panels
    • Add trigger efficiency and timing plots to our framework, include our framework in the reference one, and analyse all 2017 runs on Tier0 -- Rustem

Ntuple production record

  • Ntuples and LOG location: /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_v18.01.20, PhysicsRPCProd version: v18.01.20
  • Ntuples and LOG location: /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_v18.03.12, PhysicsRPCProd version: v18.03.12
  • Ntuples and LOG location: /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_T0-prod-2018-03-29, PhysicsRPCProd version: T0-prod-2018-03-29
  • Ntuples produced by Tier0 for 2017 Runs in USTC cluster: /moose/AtlUser/liheng/NTUP_L1RPC_2017
  • Ntuples produced by Tier0 for 2018 Runs in USTC cluster: /moose/AtlUser/liheng/NTUP_L1RPC_2018
Topic revision: r48 - 2021-02-02 - RustemOspanov