Details on how to run the UCHbb analysis code.

For a list of available datasets, see UCDataSets.

To Set Up the Code

The following instructions are meant for the UCT3, but they should be applicable with minor modifications to other systems.

  • 1) log into the T3.
  • 2) create a clean working directory, go there.
  • 3) copy the setup script:

svn cat svn+ssh://svn.cern.ch/reps/atlasinst/Institutes/UChicago/UCHbb/UCProd/trunk/scripts/setupUCProd.sh > setupUCProd.sh

  • 4) run it:

source setupUCProd.sh

This will ask you to authenticate using kinit and will check out the UCProd package. (If your CERN username is different from your UCT3 username, run source setupUCProd.sh [username].)

  • 5) You can then set up the trunk with:

    cd ../
    mkdir UCHbb
    cp UCProd/scripts/setupCodeTrunk.sh UCHbb
    cd UCHbb
    source setupCodeTrunk.sh
    

This will check out and compile ProofAna, UCHbbCore, and the necessary utils. From here, separate ProofAna packages can be checked out from the ProofAna directory using svn.
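
For example, checking out an additional ProofAna package might look like the following. This is only a sketch: SomeProofAnaPackage and its SVN path are placeholders modeled on the UCProd path above, not real package names; substitute the actual repository location of the package you need.

cd ProofAna
svn co svn+ssh://svn.cern.ch/reps/atlasinst/Institutes/UChicago/UCHbb/SomeProofAnaPackage/trunk SomeProofAnaPackage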

Clearing Datasets

ProofAna caches info about the datasets, so if the locations change or the files are different, you'll need to clear the datasets and rerun. To do this, use:
root -l -b -q '../scripts/clearDatasets.C("lite://","mc12_8TeV.182298.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m600.AFII.p1575.COMMON")'
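
If several datasets need clearing, a small shell loop works too. This is a sketch with placeholder dataset names; the "lite://" argument is the same as in the example above.

for DS in \
    mc12_8TeV.182298.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m600.AFII.p1575.COMMON \
    another.dataset.name.COMMON ; do
  root -l -b -q '../scripts/clearDatasets.C("lite://","'"${DS}"'")'
done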

Make .txt input files from the grid dataset name

After the skimmed physics samples are transferred to the T2, we need to create input .txt files so we can run over them. To do this, we need to know the file paths on the T2. makeUCFileList.py is a script that takes the grid dataset name and outputs a .txt file listing the needed input files. To run it, do:
python ../scripts/makeUCFileList.py -d  skimmed.data.set.name/ -o outputFile.txt 
for example:
python ../scripts/makeUCFileList.py -d  user.johnda.Nov2013.mc12_8TeV.182298.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m600.AFII.p1575.COMMON/ -o ../filelists/mc12_8TeV.182298.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m600.AFII.p1575.COMMON.txt

For files on FAX, use:

python ../scripts/makeUCFileList.py --mode fax -d user.johnda.May6.mc12_8TeV.182298.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m600.AFII.p1575.COMMON/ -o ../filelists/mc12_8TeV.182298.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m600.AFII.p1575.COMMON_new.txt 

To Run a Single Skimming Job on the Grid

  • Make sure you have the latest version of UCHbbCore or an appropriate tag
  • Create (and commit) a config file in UCHbbCore/filelists that points to the grid dataset name you want to run on. This config should have at least two lines: TREENAME and name_XXXX, where XXXX is usually the MC number or the data period and stream (see the sketch at the end of this section).

e.g.:

TREENAME=physics
name_161827=user.johnda.mc12_8TeV.161827.Pythia8_AU2CTEQ6L1_ZH125_llbb.merge.AOD.e1812_a188_a171_r3549_tid01204039_00.v2013_10_12_14_48_02/

(NOTE: It's important that this file ends with a newline.)

  • Make the ProofAna packages using "make root" instead of the normal "make". This includes the needed .root files in the tarball that will be sent to the grid worker nodes.

  • Launch the jobs using the "grid" option: root -q -b 'runSkim.C("grid","SomeIdentifier","DataSetName")' The first argument says to use the grid. The second is a name that will be included in the output file to identify the production. The third is the name of the config file (minus .config) that was created above.

You also might need to remove the following line if you do not have write access to MWT2: --destSE=MWT2_UC_LOCALGROUPDISK
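
For reference, here is a sketch of creating and committing a new config file as described above. The file name MyNewSample.config and the commit message are placeholders; the config lines are the ones from the example. Writing the file with a heredoc like this also guarantees the trailing newline.

cd UCHbbCore/filelists
cat > MyNewSample.config <<EOF
TREENAME=physics
name_161827=user.johnda.mc12_8TeV.161827.Pythia8_AU2CTEQ6L1_ZH125_llbb.merge.AOD.e1812_a188_a171_r3549_tid01204039_00.v2013_10_12_14_48_02/
EOF
svn add MyNewSample.config
svn commit -m "Add config for MyNewSample" MyNewSample.config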

To run on condor

Set up user permissions to access the T3/T2 disks: source ../scripts/setupCondorEnv.sh

If this is your first time running ProofAna with condor, make the following directories to hold the condor outputs:

mkdir ProofAna/run/log

mkdir ProofAna/condor

You'll also need to change the line +AccountingGroup = "group_uct3.johnda" in the file scripts/proofAna.condor to use your username. You might also need to remove $ENV(X509_USER_PROXY) from the line:

Transfer_Input_Files=../condor/proofana-condor.tar, $ENV(X509_USER_PROXY), $ENV(JOBTAR)
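
For reference, a sketch of what those two lines in scripts/proofAna.condor might look like after editing, assuming your UCT3 username is yourusername (a placeholder) and that the proxy entry needs to be removed:

+AccountingGroup = "group_uct3.yourusername"
Transfer_Input_Files=../condor/proofana-condor.tar, $ENV(JOBTAR)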

Run the ProofAna package using the "condor" option: root -q -b 'runSkim.C("condor","Oct17","UnskimmedNTUP_COMMON_WH125_lnubb.COMMON")'

General Info

External Packages via RootCore -- RootCore is a tool that makes managing required ATLAS packages easier. This tool only has to be called for new checkouts of UCHbbCore, to fetch the required packages, or any time new dependencies are added.

More information: https://twiki.cern.ch/twiki/bin/viewauth/AtlasComputing/RootCore

The following commands should be run from inside the ProofAna directory.
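
As a rough sketch, a typical old-style standalone RootCore workflow looked something like the commands below. This assumes RootCore is checked out under the ProofAna directory; the exact script names and steps depend on the RootCore version in use.

source RootCore/scripts/setup.sh        # set ROOTCOREDIR and related variables
$ROOTCOREDIR/scripts/find_packages.sh   # locate the checked-out packages
$ROOTCOREDIR/scripts/compile.sh         # build them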

Instructions for setting up a production

Before defining a production, it's best to tag versions of ProofAna and UCHbbCore.

  • Define a production python file in the production/ directory of UCProd. Add and commit this to svn.

Then something like python scripts/runProd.py --prodName [Production Name] --sampleList [Comma,Separated,List,Of,Sample,Names] should set up and compile everything and produce a submission script that you can just source. The last time I did this was in February, so I assume you will want to make a new UCHbbCore tag, but hopefully that's all that needs to be done.

Instructions for launching a pre-defined production

  • 1) log into the T3.
  • 2) create a clean directory, go there.
  • 3) copy the setup script: > cp /home/johnda/Analysis/UCProd/scripts/setupUCProd.sh .
  • 4) run it: > source setupUCProd.sh (This will ask you to authenticate using kinit. If your CERN username is different from your UCT3 username, let me know.)
  • 5) launch the production > python scripts/runProd.py   --prodName Nov2013   --sampleList String,Given,Below  --submit

Example from last production:

For the first round let's just do the JetMetStream and a few MC samples. We can extend this to the Electron and Muon streams later.

It's probably best if we work out the kinks with the "super users" and then extend the production to everyone for the next iteration.

I've broken up the input datasets into groups of roughly similar input sizes.

Lauren: JetMetPeriodB_p1562,TTbar_Lep_McAtNlo.p1575
Yasu: JetMetPeriodC_p1562,JetMetPeriodD_p1562
David: JetMetPeriodE_p1562,JetMetPeriodG_p1562
Karrol: JetMetPeriodH_p1562,JetMetPeriodI_p1562
John: JetMetPeriodJ_p1562,JetMetPeriodL_p1562
Zihao: RSG_hh_bbbb_AFII_p1575,TTbar_Lep_PowhegPythia.p1575,TTbar_MT1725_allHad_PowhegPythia.p1575

So for instance Zihao would run steps 1) - 4) and then execute:

> python scripts/runProd.py --prodName Nov2013 --sampleList RSG_hh_bbbb_AFII_p1575,TTbar_Lep_PowhegPythia.p1575,TTbar_MT1725_allHad_PowhegPythia.p1575 --submit

I think this should all go fairly smoothly, but there are bound to be problems I haven't foreseen. Let me know if you have questions or run into problems with the setup above.

As always, feel free to criticize (or improve) the submission setup.

-- JohnAlison - 11 Dec 2013
