-- YujiYamazaki - 2018-08-21

  • EOS tutorial
     Log in to lxplus or an lxatut machine and type the command "eos":
     % eos
     You will be on the EOS console. The commands to check or change directories are then the same as in a shell, e.g. "ls", "cd":
     % cd /eos/atlas/atlasdatadisk/data12_8TeV/HIST
     You will see the directories used in the Tier-0 processing. Check the HIST samples in the subdirectories.
     To quit the EOS console:
     % .q
     To download a file, exit the EOS console and use the command "xrdcp":
     % xrdcp root://eosatlas.cern.ch//eos/atlas/atlasdatadisk/data12_8TeV/HIST/x191_m1108/data12_8TeV.00200863.express_express.merge.HIST.x191_m1108/data12_8TeV.00200863.express_express.merge.HIST.x191_m1108._0001.1 ./
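The xrdcp source URL is simply the xrootd redirector prefix joined to the absolute EOS path, which is why a double slash appears after the hostname. A minimal sketch using the path from this tutorial (`eosatlas.cern.ch` is the ATLAS EOS instance):

```shell
#!/bin/sh
# Build an xrootd URL for xrdcp from an absolute EOS path.
# "root://<host>/" + "/eos/..." gives the characteristic double slash.
EOS_HOST="eosatlas.cern.ch"
EOS_PATH="/eos/atlas/atlasdatadisk/data12_8TeV/HIST"
URL="root://${EOS_HOST}/${EOS_PATH}"
echo "$URL"
# xrdcp "$URL/<file>" ./    # run on lxplus to actually download
```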

  • How to use Castor
     nsls, rfcp => the eos command should now be used instead
     Setting "export STAGE_SVCCLASS=atlcal" avoids "permission denied" errors.
     Tier0 offline DQ histogram: /castor/cern.ch/grid/atlas/tzero/prod1/perm/data10_7TeV/physics_MuonwBeam/runnumber
     CAF offline DQ histogram: 

  • AMI dataset query example

     Initialising the DQ2 client on login.icepp.jp:
     % gridsetup
     % voms-proxy-init -voms atlas
     % dq2setup

  • dq2-get can also be run on a whole dataset, not only on individual files

  • You can also do things like: svn ls $SVNROOT/Trig/TrigMonitoring/TrigMuonMonitoring

  • Summary of how to upload to the DQ display

     in a scratch directory:
     % mkdir data11_7TeV.00187196.express_express.merge.HIST.x142_m936
     % cd data11_7TeV.00187196.express_express.merge.HIST.x142_m936
     % export STAGE_SVCCLASS=atlcal
     % rfcp /castor/cern.ch/grid/atlas/tzero/prod1/perm/data11_7TeV/express_express/00187196/data11_7TeV.00187196.express_express.merge.HIST.x142_m936/data11_7TeV.00187196.express_express.merge.HIST.x142_m936._0001.1 .
     in your athena directory:
     % asetup AtlasProduction,here
     % cmt co DataQuality/DataQualityConfigurations
     % cd DataQuality/DataQualityConfigurations
     % cd python
     % emacs TestDisplay.py
     -> change the variable "mydqcfgpath" to point to your own area, e.g.
     mydqcfgpath = "/home/yuji/at/a/"
     % cd ../cmt
     % make
     # move all non-muon-HLT han config files to a backup directory
     % cd ../config
     % mkdir ../config.backup
     % mv HLT ../
     % mv ../HLT/HLTmuon ../
     % mv ./* ../config.backup
     % mv ../HLT/* ../config.backup
     % mv ../HLT . 
     % mv ../HLTmuon HLT
     # merge all han configuration files below the current directory and compile
     % merge_all_han_configs.sh
     % han-config-gen.exe collisions_run.config
     # load DQ histogram using compiled configuration file
     % ln -s ${Your_Scratch_Directory}/data11_7TeV.00187196.express_express.merge.HIST.x142_m936/data11_7TeV.00187196.express_express.merge.HIST.x142_m936._0001.1 ./data.00187196.TRMUO.root
     % DQWebDisplay.py data.00187196.TRMUO.root TestDisplay 1
     # "1" means iteration (pass) 1. The filename must follow the format data.<runnumber>.<monitor_category>.root (e.g. TRMUO in the symlink above).
     # It may crash at the end while trying to process the 20-LB monitoring if other HLT monitoring is included, but as long as all the TRMUO directories appear on the web page it is fine.
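The symlink name fed to DQWebDisplay.py can be derived from the Tier-0 HIST filename, since the run number is its second dot-separated field. A sketch of that derivation (hypothetical helper, not part of the DQ tools; the TRMUO category is taken from the example above):

```shell
#!/bin/sh
# Derive the DQWebDisplay input name data.<runnumber>.<category>.root
# from a Tier-0 HIST filename.
HIST="data11_7TeV.00187196.express_express.merge.HIST.x142_m936._0001.1"
RUN=$(echo "$HIST" | cut -d. -f2)   # second field is the run number, 00187196
LINK="data.${RUN}.TRMUO.root"
echo "$LINK"
# ln -s "${Your_Scratch_Directory}/.../${HIST}" "./${LINK}"
```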

  • Grid job example for DQ jobs on lxatut

     %  asetup AtlasProduction,here
     %  source /afs/cern.ch/atlas/offline/external/GRID/DA/panda-client/latest/etc/panda/panda_setup.sh
     %  pathena "--nGBPerJob=MAX" --trf "ESDtoESD_trf.py inputESDFile=%IN outputDQMonitorFile=%OUT.MonitorESD.root" --inDS data12_8TeV.00203256.physics_Muons.recon.ESD.f444 --outDS user.yuji.run203256.physics_Muons.out
     % dq2-get user.yuji.run203256.physics_Muons.out/   (the trailing slash has to be added at the end of the dataset name)
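Since dq2-get requires the trailing slash on the dataset name, it can be appended defensively before the call. A minimal sketch (dataset name taken from the example above):

```shell
#!/bin/sh
# Append the trailing slash that dq2-get expects, if it is missing.
DS="user.yuji.run203256.physics_Muons.out"
case "$DS" in
  */) : ;;              # already ends in a slash, nothing to do
  *)  DS="${DS}/" ;;    # append it
esac
echo "$DS"
# dq2-get "$DS"         # run after the grid/DQ2 setup above
```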