Introduction

  • This page describes USTC software for analysing ATLAS collision data to study RPC detector and trigger performance
  • ustc/PhysicsRPCPlot - Git project for Python code to make plots (described in a dedicated TWiki: PhysicsRPCFramework)
  • ustc/PhysicsRPCProd - Git project for athena code to analyse DESD collision data and make ntuples
  • ustc/PhysicsAnpRPC - Git project for Anp analysis code to analyse ntuples

  • The ustc/PhysicsRPCProd package contains:
    • ReadMuon algorithm to read and save reconstructed muon information
    • ReadL1Muon algorithm to process reconstructed muons and to record RPC data/geometry
      • L1MuonExtrapolator tool to extrapolate muons to the RPC system and to record intersections with active RPC detector elements (gas gaps)
      • L1MuonGeometry tool to record RPC geometry, including global positions of RPC gas gaps and coordinates of RPC strips
    • Python code to configure and run the athena job to read DESD data (see the configuration sketch after this list)
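  • A minimal sketch of how these algorithms might be scheduled in athena job options; the real configuration lives in share/readDESD_MCP_PhysicsAnpRPC.py, and the configurable usage below is an assumption for illustration, not the package's actual interface:
     # Sketch only: assumes the package exports ReadMuon/ReadL1Muon as configurables
     from AthenaCommon.AlgSequence import AlgSequence
     from AthenaCommon import CfgMgr

     topSequence = AlgSequence()

     # ReadMuon reads and saves reconstructed muon information
     topSequence += CfgMgr.ReadMuon('ReadMuon')

     # ReadL1Muon records RPC data/geometry via its helper tools
     topSequence += CfgMgr.ReadL1Muon('ReadL1Muon')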

How to set up PhysicsRPC with release 21

  • Please run these commands only once:
      $ mkdir -p ~/testarea/RPCRel21/source
      $ cd ~/testarea/RPCRel21/source
      $ git clone https://:@gitlab.cern.ch:8443/ustc/PhysicsRPCProd.git
      $ source PhysicsRPCProd/macros/setup/first_setup_acm21.sh

  • Afterwards, set up the Athena release using these commands:
      $ cd ~/testarea/RPCRel21/source
      $ source setup_atlas_athena.sh
      $ cmake --build $TestArea/../build

  • How to commit code to the GitLab master
    • See the ATLAS software Git tutorial for an ATLAS introduction to GitLab
    • Below, "macros/plotL1Trigger.py" is used as an example
    • "git pull" updates the local repository from the master repository
    • "git commit" commits changes to the local Git repository
    • "git push" pushes the local repository to the master repository at ustc/PhysicsRPC
      $ git pull
      $ git commit FILE_NAME -m 'write here informative commit message'
      $ git push

Ntuple production with PhysicsRPCProd

  • Set up the ATLAS software release:
     $ cd ~/testarea/RPCRel21/source
     $ source setup_atlas_athena.sh

  • Check the local version against the GitLab master and, if necessary, update the local repository:
     $ cd $TestArea/PhysicsRPC/PhysicsRPCProd
     $ git status
     $ git pull

  • Copy DESD file to your local machine (you need 1.8 GB of free disk space):
     $ scp -r lxplus036:/tmp/rustem/data17_13TeV.00327265.physics_Main.merge.DESDM_MCP.f832_m1816 .

  • How to download DESD datasets:
      $ rucio list-dids data16_13TeV.*DESDM_MCP.r9264_p3082_p3082* | grep CONTAINER
      $ rucio download --nrandom=10 data16_13TeV.00298862.physics_Main.merge.DESDM_MCP.r9264_p3082_p3082

  • Run athena command to make ntuples using local input files:
     $ athena $TestArea/PhysicsRPC/PhysicsRPCProd/share/readDESD_MCP_PhysicsAnpRPC.py -c "inputDir='data17_13TeV.00327265.physics_Main.merge.DESDM_MCP.f832_m1816';dumpSG=False;EvtMax=1000" &> log &

  • Run athena command to make ntuples using EOS:
     $ athena $TestArea/PhysicsRPC/PhysicsRPCProd/share/readDESD_MCP_PhysicsAnpRPC.py -c "inputDir='/eos/atlas/atlascerngroupdisk/det-rpc/data/DESDM_MCP/data17_13TeV.00327265.physics_Main.merge.DESDM_MCP.f832_m1816';dumpSG=False;EvtMax=1000" &> log &

  • Run grid production for one data run (replace "rustem" with your grid user name):
     $ cd $TestArea/PhysicsRPC/PhysicsRPCProd
     $ lsetup panda
     $ pathena share/readDESD_MCP_PhysicsAnpRPC.py --inDS=data17_13TeV.00331875.physics_Main.merge.DESDM_MCP.f848_m1848 --outDS=user.rustem.data17_13TeV.DESDM_MCP.f848_m1848.prod_v01

PhysicsAnpRPC analysis package

  • PhysicsAnpRPC reads ntuples and produces analysis plots
  • This package is still under development

Setting up PhysicsAnpRPC code with release 20.7 for the first time

  • This section describes how to set up this code for the first time using release 20.7
  • Please run these commands only once and then exit the shell:
   $ mkdir -p ~/testarea/RPCBase20
   $ cd ~/testarea/RPCBase20
   $ git clone https://:@gitlab.cern.ch:8443/ustc/PhysicsAnpRPC.git
   $ source PhysicsAnpRPC/macros/setup/first_setup_rel20.sh 
   $ exit
  • Note: on the USTC cluster, to use "git clone" with https, first run:
   $ kinit -f -r7d -A username@CERN.CH

Running PhysicsAnpRPC code with release 20.7 for regular analysis work

  • This section describes how to run the code for regular analysis work using release 20.7
   $ cd ~/testarea/RPCBase20
   $ source setup_atlas_analysis_release.sh

  • How to recompile the code from scratch:
     $ cd $TestArea/PhysicsAnpRPC/cmt
     $ cmt bro rm -r ../x86_64-slc6*
     $ cmt bro make -j8

  • Run a command to print ntuple content:
     $ python $TestArea/PhysicsAnpRPC/macros/runPanelEff.py out.root -n 10 --print-reco-event &> log &

  • Run an analysis example to make plots:
     $ python $TestArea/PhysicsAnpRPC/macros/runPanelEff.py out.root -n 1000 -o rpc.root &> log &

Analysis workflow with PhysicsAnpRPC package

  • This package reads ntuples that contain:
    • RPC geometry information stored in m_rpc_geo_ branches
    • RPC detector hits stored in m_rpc_hit_ branches
    • Reconstructed muon information stored in m_muon_ branches
      • Extrapolated muon coordinates on matching RPC gas gaps stored in m_muon_RpcExtrapolate_ branches
      • RPC detector hits used to construct the muon track stored in m_muon_RpcHitOnTrackMS_ branches
    • This ntuple information is read by the ReadNtuple algorithm and then copied into the RecoEvent structure (RecoEvent.h); see the branch-inspection sketch after this list
  • Next, PrepRpcGeo reads the geometry information and copies it into the following structures:
    • RpcGeoStrip - local RPC strip coordinates with respect to the coordinate system that contains the strip
    • RpcGeoGap - global coordinates of the RPC gas gap and the list of strips within this gap
    • This geometry information is read only once, with the first event, and then stored in RpcGeoMan - the RPC geometry manager
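  • A quick way to check that an ntuple follows this branch naming scheme is a short PyROOT script; this is a sketch, and the tree name 'nominal' is a placeholder that should be replaced with the actual tree name in the file:
     import ROOT

     f = ROOT.TFile.Open('out.root')
     tree = f.Get('nominal')  # hypothetical tree name - check the file with f.ls()

     # Count the branches for each prefix used by this ntuple format
     for prefix in ['m_rpc_geo_', 'm_rpc_hit_', 'm_muon_']:
         names = [b.GetName() for b in tree.GetListOfBranches()
                  if b.GetName().startswith(prefix)]
         print('%s: %d branches' % (prefix, len(names)))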

RPC geometry code

  • PhysicsRPCProd contains the athena tool for accessing RPC geometry and readout information
  • This tool relies on the following ATLAS athena tools:
  • MuonDetectorManager.h - manages geometry for all muon detectors
    • Provides the list of RpcDetectorElements and RpcReadoutElements
    • RpcDetectorElement.h - describes a single RPC doublet
    • RpcReadoutElement.h - describes a single RPC module, which contains a list of gas gaps (panels)
    • RpcReadoutElement is the main class for accessing the RPC geometry of the physical detector
  • RpcIdHelper.h translates an RPC channel identifier into human-readable information
    • An ATLAS channel identifier is a 64-bit unsigned integer defined in Identifier.h
    • This tool provides functions to extract bit-wise information from the identifier (see the sketch after this list)
    • This tool does not know anything beyond how to translate the 64-bit identifier into station name, station phi, etc.
  • MuonIdHelperTool.h translates a Muon detector channel identifier into human-readable information
    • This tool is less useful for the RPC than RpcIdHelper above
  • The following functions in L1MuonGeometry.cxx read the geometry information of the entire RPC detector:
    • L1MuonGeometry::readRpcGeometry() - iterates over all configured RpcReadoutElements to access all gas gaps
    • L1MuonGeometry::processGasGap() - processes the geometry information for one gas gap
    • L1MuonGeometry::checkSurfaces() - provides additional checks of the RPC geometry to verify that readRpcGeometry() works correctly
  • Code that associates clusters (RPC or TGC) with segments made from MDT hits: MuonSegmentMakerTools/DCMathSegmentMaker
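  • To illustrate what "extract bit-wise information" means, here is a purely hypothetical Python sketch; the field positions and widths below are invented and are NOT the actual ATLAS Identifier layout, which is internal to RpcIdHelper:
     def unpack_identifier(identifier):
         """Unpack example fields from a packed 64-bit identifier.
         The bit layout here is hypothetical, for illustration only."""
         station = (identifier >> 56) & 0xFF  # 8 bits: station name index
         eta     = (identifier >> 50) & 0x3F  # 6 bits: station eta
         phi     = (identifier >> 44) & 0x3F  # 6 bits: station phi
         return station, eta, phi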

RPC code references

BME analysis code

  • The PhysicsAnpRPC package contains code to read ntuples containing BME hits from the secondary readout
    • runBMEHits.py - main Python macro (no changes should be necessary in this code)
    • PhysicsAnpRPCBMEHits.py - module to configure the ntuple-reading algorithm and plotting algorithms
    • BmeHit.h - data class to store BME hit information (read from the ntuples)
    • PrepBmeHit.cxx - algorithm to fill BmeHit objects
  • Commands to read ntuples with secondary BME hits:
   $ cd $TestArea/PhysicsNtuple/PhysicsAnpRPC
   $ python macros/runPanelEff.py /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_v18.01.20/data17_13TeV_00340368_ntuple/job_0020_out.root -o rpc.root --bme-event-map=/eos/atlas/atlascerngroupdisk/det-rpc/bme/secondary_ntuples_v17.12/RPC340368_map.txt --bme-geo=data/map_BME_athena_extended.txt -n 10000 --do-bme

AnpBatch package for managing batch jobs

  • AnpBatch contains scripts that help with managing batch jobs at CERN and USTC
  • This package is still under development
  • It is not necessary to set up an ATLAS release to use this package
  • Commands to check out this package:
   $ cd ~/testarea/
   $ git clone https://:@gitlab.cern.ch:8443/ustc/AnpBatch.git

This package takes as input a text file containing EOS or AFS paths to input files

  • This input text file can be generated with the makeFileList.py macro for grid datasets stored at CERN
  • Please first set up the rucio clients using the instructions here: RucioClientsHowTo
  • Then run these commands:
   $ cd testarea/AnpBatch/
   $ python macros/makeDatasetList.py data17_13TeV.*331875*DESDM_MCP* --rse=SCRATCH.* --protocol=root -o DESDM_MCP_run331875.txt

  • The input file has the following structure: file_path file_size
    • The file_size field is optional and can be absent (see the illustrative sample after this list)
    • Example input file:
   $ cat /afs/cern.ch/user/r/rustem/public/rpc/DESDM_MCP_run331875.txt
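  • For illustration only, such a file might look like this; the paths and sizes below are invented, not real dataset files, and the second line shows that the size may be omitted:
     root://eosatlas.cern.ch//eos/atlas/some/dataset/file.0001.root 2147483648
     root://eosatlas.cern.ch//eos/atlas/some/dataset/file.0002.root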

Running batch jobs at CERN

  • CERN uses LSF software for managing batch queues
  • There are two main macros for managing jobs:
    • subCERN.py prepares shell scripts for individual jobs and submits them to LXBATCH
    • procJob.py copies a job's input files to the local disk on the worker node and copies the output ROOT files back
  • These commands run simple test jobs for debugging purposes:
   $ source $HOME/testarea/AnpBatch/example/run_cern_test.sh
   $ source $HOME/work/batch/job-config/submit_all.sh

  • This command will run athena code to produce ntuples for RPC studies:
   $ source $HOME/testarea/AnpBatch/example/run_cern_prod.sh

  • Outstanding issues:
    • Currently acmSetup fails on batch worker nodes - need to check with Will Buttinger for advice

TODO for RPC Panel and Strip Efficiency

  • High priority tasks:
    • Make panel efficiency 1d projections overlaying 2017 runs (at least 5 combined) and 2018 runs (at least 5 combined)
    • Figure out why muons never go into some of the phi strips in BMS
    • Make 1d plots of efficiency comparison differences for each sub-detector
    • Write the note for the qualification project -- ongoing
    • Remake hit residual plots using muons with pT > 20 GeV, and also separately for positive/negative muons in positive/negative eta hemispheres
    • Move the combined eta-phi string to the z-axis title
    • Plans for 2018
  • Low priority tasks:
    • Compare the extrapolated positions from ID and MS, muon by muon
    • Analyse unbiased muons (triggered by the b-jet trigger, MET trigger, ...) and compare biased muons (triggered by the RPC trigger) with unbiased muons
    • Analyse muons that passed the 20 GeV threshold, extrapolating from the Inner Detector, and make a separate website to present this analysis
    • Efficiency vs DCS status, HV, threshold, etc.
    • ZTP candidates
    • Upload the efficiency map to COOL
    • 2d panel efficiency map with bin sizes related to the real panel sizes
    • Make bins in the efficiency map clickable
    • Give the information on inefficient panels obtained from Anp to Giulio (to validate the framework directly)
    • Determine the extrapolation sigma
    • Look at the cuts to find the reason for the ~0.02 difference (0.02 due to noise is a little too large)
    • Improve the macro - write a new module
    • Fix panel_ids being multiplied after hadding the ROOT files
  • Finished tasks:
    • 2018-03-29: Extrapolate the muon track from the Inner Detector track to suppress the bias in the phi direction -- Rustem
    • 2018-03-15: 2d efficiency map of strip squares for each panel
    • Plots for each sub-detector (BML, BOS, BOG) and sub-sub-detector (BML1, BMS2, etc.)
    • List comparing eff_anp and eff_ref values
    • Make a list of problematic panels (with large residuals)
    • Make the strip efficiency map
    • Plot the out-of-time fraction versus strip number
    • Plot the strip efficiency versus strip number
    • Hit residual per hit
    • Get rid of the warnings when running Anp
    • Make in-time hit residual plots separately for muons with cluster size 1 and panels with cluster size 2 -- Rustem/Heng
    • Check the code for errors when comparing the Anp efficiency using all hits with the reference efficiency
    • Finish the RPC hit cluster class
    • Make a ROOT TTree/text file for each run with panel efficiency, error, DCS status, mean and RMS of the geometric residual, mean time, and out-of-time fraction
    • Make a list of problematic panels (with large residuals) and check if there are cabling errors in these panels
    • Add trigger efficiency and timing plots to our framework, include our framework in the reference one, and analyse all 2017 runs on Tier0 -- Rustem

Ntuple production record

  • Ntuples and LOG location: /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_v18.01.20, PhysicsRPCProd version: v18.01.20
  • Ntuples and LOG location: /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_v18.03.12, PhysicsRPCProd version: v18.03.12
  • Ntuples and LOG location: /eos/atlas/atlascerngroupdisk/det-rpc/ntuples/ntuples_T0-prod-2018-03-29, PhysicsRPCProd version: T0-prod-2018-03-29
  • Ntuples produced by Tier0 for 2017 Runs in USTC cluster: /moose/AtlUser/liheng/NTUP_L1RPC_2017
  • Ntuples produced by Tier0 for 2018 Runs in USTC cluster: /moose/AtlUser/liheng/NTUP_L1RPC_2018