Pixel DQ Monitoring Software



PixelMonitoring - Athena reconstruction

PixelMonitoring is a set of monitoring algorithms written in C++ within the Athena framework that produces the HIST file for online/offline monitoring.
The code can be found in the SVN area. If you want to put a certain tag into the official reconstruction, you have to follow the Tier0 policy to confirm that there are no crashes and no memory leaks, and then contact the PROC contacts.

Install and run (rel 20.7 and older)

Log in to lxplus, then go to the directory where you want to install and run the package. Athena release 20.1.4.7 is used here. If you want the latest or another release, you can check the details and status of each tag from here.

setupATLAS
asetup 20.1.4.7,here
voms-proxy-init -voms atlas
GetTfCommand.py --AMI=x362
cmt co InnerDetector/InDetMonitoring/PixelMonitoring
setupWorkArea.py
cd WorkArea/cmt
cmt bro cmt config
cmt bro gmake -j 4
cd -
Reco_tf.py --conditionsTag all:CONDBR2-ES1PA-2015-05 --ignorePatterns='ToolSvc.InDetSCTRodDecoder.+ERROR.+Unknown.+offlineId.+for.+OnlineId' --ignoreErrors 'True' --autoConfiguration='everything' --maxEvents '-1' --AMITag 'x340' --postExec 'r2e:topSequence.LArNoisyROAlg.Tool.BadChanPerFEB=30' --preExec 'all:DQMonFlags.enableLumiAccess=False;DQMonFlags.doCTPMon=False;from MuonRecExample.MuonRecFlags import muonRecFlags;muonRecFlags.useLooseErrorTuning.set_Value_and_Lock(True);InDetFlags.useBeamConstraint.set_Value_and_Lock(False);' --geometryVersion all:ATLAS-R2-2015-03-01-00 --beamType 'collisions' --inputBSFile root://eosatlas.cern.ch//eos/atlas/atlastier0/rucio/data15_13TeV/express_express/00284484/data15_13TeV.00284484.express_express.merge.RAW/data15_13TeV.00284484.express_express.merge.RAW._lb0400._SFO-ALL._0001.1 --outputHISTFile myHIST.root --maxEvents=1

With release 20.7 or older, you may encounter a crash or a segmentation fault in ./runwrapper.RDOtoRDOTrigger.sh. In that case, it is recommended to use release 21 instead.

Install and run (rel 21 and newer)

setupATLAS
asetup AtlasOffline 21.0.X rel_4 here
svnco PixelMonitoring
cmake .
make -j 4
source ./x86_64-slc6-gcc49-opt/setup.sh
Reco_tf.py --conditionsTag all:CONDBR2-ES1PA-2015-05 --ignorePatterns='ToolSvc.InDetSCTRodDecoder.+ERROR.+Unknown.+offlineId.+for.+OnlineId' --ignoreErrors 'True' --autoConfiguration='everything' --maxEvents '-1' --AMITag 'x340' --postExec 'r2e:topSequence.LArNoisyROAlg.Tool.BadChanPerFEB=30' --preExec 'all:DQMonFlags.enableLumiAccess=False;DQMonFlags.doCTPMon=False;from MuonRecExample.MuonRecFlags import muonRecFlags;muonRecFlags.useLooseErrorTuning.set_Value_and_Lock(True);InDetFlags.useBeamConstraint.set_Value_and_Lock(False);' --geometryVersion all:ATLAS-R2-2015-03-01-00 --beamType 'collisions' --inputBSFile root://eosatlas.cern.ch//eos/atlas/atlastier0/rucio/data15_13TeV/express_express/00284484/data15_13TeV.00284484.express_express.merge.RAW/data15_13TeV.00284484.express_express.merge.RAW._lb0400._SFO-ALL._0001.1 --outputHISTFile myHIST.root --maxEvents=1

HIST Pixel Only

Add the option below to produce a HIST file with only the Pixel monitoring enabled:
    --preExec 'all:DQMonFlags.doCTPMon=False;DQMonFlags.doLVL1CaloMon=False;DQMonFlags.doHLTMon=False;DQMonFlags.doTRTMon=False;DQMonFlags.doMissingEtMon=False;DQMonFlags.doMuonTrackMon=False;DQMonFlags.doMuonSegmentMon=False;DQMonFlags.doMuonTrkPhysMon=False;DQMonFlags.doMuonCombinedMon=False;DQMonFlags.doLucidMon=False;DQMonFlags.doJetTagMon=False;DQMonFlags.doEgammaMon=False;DQMonFlags.doMuonRawMon=False;DQMonFlags.doTRTElectronMon=False;DQMonFlags.doLArMon=False;DQMonFlags.doTileMon=False;DQMonFlags.doCaloMon=False;DQMonFlags.doPixelMon=True;DQMonFlags.doGlobalMon=False;DQMonFlags.doInDetAlignMon=False;DQMonFlags.doInDetGlobalMon=False;rec.doFwdRegion=False;rec.doTau=False;rec.doMuon=False;rec.doEgamma=False;rec.doMuonCombined=False;rec.doCalo=False;rec.doJetMissingETTag=False;rec.doTrigger=False'
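The long --preExec string above is error-prone to edit by hand. As an illustration only, a small Python helper can assemble it from a list of flag settings; `build_preexec` is a hypothetical convenience, not an official tool, and only the flag names are taken from the option above:

```python
# Build a Reco_tf.py --preExec string from (flag, value) pairs.
# Hypothetical helper: only the DQMonFlags/rec names are from the example above.

def build_preexec(flags, extra=""):
    """Join flag settings into a single 'all:' preExec string."""
    parts = ["%s=%s" % (name, value) for name, value in flags]
    if extra:
        parts.append(extra)
    return "all:" + ";".join(parts)

# A shortened pixel-only selection (the full list is in the option above).
pixel_only = [
    ("DQMonFlags.doPixelMon", True),
    ("DQMonFlags.doCTPMon", False),
    ("DQMonFlags.doHLTMon", False),
    ("rec.doTrigger", False),
]

preexec = build_preexec(pixel_only)
# Then pass it as: Reco_tf.py ... --preExec "<preexec>"
```

Using an ordered list (rather than a dict) keeps the generated string deterministic, which makes job configurations easier to diff.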

Check before uploading

You have to check that it runs without any problems, following the Tier0 policy. Run the following tests:

Reco_tf.py --AMI q220
Reco_tf.py --AMI q221
Reco_tf.py --AMI q222
Reco_tf.py --AMI q223
Reco_tf.py --AMI q431
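Since each q-tag test must pass before the tag can be uploaded, the runs above can be chained so that work stops at the first failure. This is a sketch of that loop; `run_qtests` and its `runner` argument are hypothetical conveniences, not part of the official Tier0 tooling:

```python
import subprocess

# The q-tags listed above.
Q_TAGS = ["q220", "q221", "q222", "q223", "q431"]

def run_qtests(tags=Q_TAGS, runner=None):
    """Run 'Reco_tf.py --AMI <tag>' for each tag in order.

    Returns a list of (tag, returncode) pairs, stopping at the first
    non-zero return code so the crash can be fixed before continuing.
    The runner argument exists so the loop can be exercised without Athena.
    """
    if runner is None:
        runner = lambda tag: subprocess.call(["Reco_tf.py", "--AMI", tag])
    results = []
    for tag in tags:
        rc = runner(tag)
        results.append((tag, rc))
        if rc != 0:
            break  # stop at the first crashing q-tag
    return results
```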

If a test crashes, you have to modify your code until it no longer crashes. After these tests, check for memory leaks as follows:

localSetupPyAMI
RunTier0Tests.py

RunTier0Tests.py is the script that checks for memory leaks; the details are written here. If it fails, you have to modify your code until it succeeds.

Put into official release

Note that a tag can be added if:

  1. It fixes a serious problem in the offline software, and
  2. It compiles against the last nightly build of the release it is requested for, and
  3. It does not break any of the four q220-q223, q431 tests, and
  4. It does not have modified public header files (if it does you must investigate and list any client packages that need to be recompiled), and
  5. The request includes a link to the SVN diff in trac (wrt the current tag in the nightly)
How to get the SVN diff link to trac:
  • Navigate to the new tag in trac
  • Click "View changes...", put in current tag in "From:" field
  • Hit "View changes"
Finally, in your request you have to explain to the PROC contacts what serious problem we would have to live with, should your tag not be included.

Remaining issues

  • https://its.cern.ch/jira/browse/ATLASRECTS-3439
  • https://its.cern.ch/jira/browse/ATLASRECTS-3064
  • https://its.cern.ch/jira/browse/ATLASRECTS-3348
  • https://its.cern.ch/jira/browse/ATLASRECTS-3330

Coverity report

  • 31737, 28/10/2015 (High): Resource leak in object, in PixelMonitoring/PixelMainMon.h, function "moduleDcsDataHolder"
  • 109022, 27/04/2016 (Medium): Inferred misuse of enum, in src/Clusters.cxx, function "BookClustersMon"
  • 105913, 08/02/2016 (Medium): Identical code for different branches, in src/Errors.cxx, function "BookRODErrorMon"
  • 16972, 09/07/2014 (Medium): Uninitialized pointer field, in src/PixelMainMon.cxx, function "PixelMainMon"
  • 109021, 27/04/2016 (Medium): Copy-paste error, in src/Track.cxx, function "FillTrackMon"
  • 111555, 23/08/2016 (Medium): Dereference after null check, in src/Track.cxx, function "FillTrackMon"

PixelPostProcess - Athena reconstruction (DQHistogramMerge)

PixelPostProcess is a set of scripts written in C++ within the Athena framework that adds new histograms to the HIST file for offline monitoring. You can normalize the histograms by the total number of events and merge histograms. The SVN area is here.

Setup

setupATLAS
asetup 20.1.4.7,here
setupWorkArea.py
cmt co DataQuality/DataQualityUtils
cd WorkArea/cmt
cmt bro cmt config
cmt bro gmake clean
cmt bro gmake -j 4
cd -

Run

You need to make a text file which contains the path of the input HIST files.

For example:

cat input.txt
/afs/cern.ch/user/d/dyamaguc/public/data15_13TeV.00280500.express_express.merge.HIST.f631_h81._0001.1

Then, you can run the PostProcess:

cd WorkArea/run
DQHistogramMerge.py input.txt output.root 1
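Conceptually, this post-processing step merges the per-file histograms and then normalizes by the total number of events. The sketch below illustrates that idea in plain Python with histograms as lists of bin counts; it is purely illustrative, since the real job operates on ROOT histograms through DQHistogramMerge:

```python
# Illustrative only: real histograms are ROOT objects handled by DQHistogramMerge.

def merge_histograms(hists):
    """Sum matching bins across input histograms (all must share the binning)."""
    nbins = len(hists[0])
    assert all(len(h) == nbins for h in hists), "inconsistent binning"
    return [sum(h[i] for h in hists) for i in range(nbins)]

def normalize_by_events(hist, n_events):
    """Divide each bin by the total number of events, as the PostProcess step does."""
    return [x / float(n_events) for x in hist]

merged = merge_histograms([[1, 2, 3], [3, 2, 1]])      # -> [4, 4, 4]
normalized = normalize_by_events(merged, n_events=8)   # -> [0.5, 0.5, 0.5]
```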

Request to official release

To request that the latest tag be included in the official release, send an email to Peter Onyisi <ponyisi@utexas.edu> and Iurii Ilchenko <Yuriy.Ilchenko@cern.ch>.

WebDisplay

The WebDisplay is the web interface used to view the histograms made in the Tier0 (Athena) reconstruction. The website is here. It is used for Data Quality activities. The procedure to update the configurations is described here.

Setup

setupATLAS
asetup 20.1.4.7,here
cmt co DataQuality/DataQualityConfigurations
cd DataQuality/DataQualityConfigurations/cmt
cmt make
cd ../config

Edit

The configuration files for the Pixel detector are Pixel/***.config.

cd Pixel
han-config-check.sh collisions_run.config
cd ../
merge_some_han_configs.sh Pixel
merge_all_han_configs.sh
han-config-gen.exe collisions_run.config
han.exe cosmics_run.hcfg data15_cos.00259921.physics_IDCosmic.merge.HIST.f567_h16._0001.1 run_259921
DQWebDisplay.py data15_cos.00259921.physics_IDCosmic.merge.HIST.f567_h16._0001.1 TestDisplay 1

See the web display for test

You can see the output from here.

SummaryScript

aaa

PixelCalibAlg

https://twiki.cern.ch/twiki/bin/viewauth/AtlasComputing/RecExCommonAutoConfiguration

https://svnweb.cern.ch/trac/atlasoff/browser/InnerDetector/InDetExample/InDetRecExample?rev=38374&order=name

https://svnweb.cern.ch/trac/atlasoff/browser/InnerDetector/InDetConditions/PixelCoralClientUtils/tags/PixelCoralClientUtils-00-05-06/PixelCoralClientUtils/SpecialPixelMap.hh?rev=654935

https://svnweb.cern.ch/trac/atlasoff/browser/InnerDetector/InDetConditions/PixelConditionsServices/tags/PixelConditionsServices-00-24-12/PixelConditionsServices/ISpecialPixelMapSvc.h?rev=654935

Database file

#!/bin/env python
# Taken from InnerDetector/InDetRecTools/TRT_ElectronPidTools/DatabaseTools/WritePyCoolAll.py

import sys
from PyCool import cool

def main():
    dbFile = "pixeldead.db"
    dbName = "OFLP200"
    # folderName = "/PIXEL/DCS/HV"
    folderName = "/PIXEL/PixMapOverlay"
    tag = "PixMapOverlay-SIM-RUN12-000-07"

    fieldNames = ["moduleID", "ModuleSpecialPixelMap_Clob"]
    # fieldTypes = [cool.StorageType.Float]
    fieldTypes = [cool.StorageType.Int32, cool.StorageType.String4k]

    # remove the old db file so that we can write the new one
    try:
        import os
        os.remove(dbFile)
    except:
        pass

    # get database service and open database
    dbSvc = cool.DatabaseSvcFactory.databaseService()

    # database accessed via physical name
    dbString = "sqlite://;schema=%s;dbname=%s" % (dbFile, dbName)
    try:
        db = dbSvc.createDatabase(dbString)
    except Exception, e:
        print 'Problem creating database', e
        sys.exit(-1)
    print "Created database", dbString

    # setup folder
    spec = cool.RecordSpecification()
    spec.extend(fieldNames[0], fieldTypes[0])
    spec.extend(fieldNames[1], fieldTypes[1])

    # folder meta-data - note for Athena this has a special meaning
    desc = '<timeStamp>run-lumi</timeStamp><addrHeader><address_header service_type="71" clid="1238547719" /></addrHeader><typeName> CondAttrListCollection </typeName>'

    # create the folder - multiversion
    # last argument is createParents - if true, automatically creates parent folders if needed
    # note this will not work if the database already exists - delete mycool.db first
    folderSpec = cool.FolderSpecification(cool.FolderVersioning.MULTI_VERSION, spec)
    folder = db.createFolder(folderName, folderSpec, desc, True)

    # now fill in some data - create a record and fill it
    data = cool.Record(spec)

    # # get a reference to the blob
    # blob = data[fieldNames[3]]
    # # set the size (in bytes)
    # blob.resize(len(electronBlob))
    # for i in xrange(0, len(electronBlob)):
    #     blob[i] = electronBlob[i]

    for channel in range(0, 2048):
        data[fieldNames[0]] = channel
        data[fieldNames[1]] = "0 0"
        folder.storeObject(0, cool.ValidityKeyMax, data, channel, tag)

    db.closeDatabase()

if __name__ == "__main__":
    main()

-- DaikiYamaguchi - 2016-01-04

Topic revision: r14 - 2017-04-25 - EunchongKim