How to Skim/Slim a D3PD

The following is based on the Twiki:

1. Prepare the workarea and checkout the files

mkdir -p $HOME/SKIM
asetup AtlasPhysics,,here
cmt co -r NTUPtoNTUPCore-00-00-06 PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPCore
cmt co -r NTUPtoNTUPExample-00-00-07 PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample
2. Copy the skimming python code

Copy the file in the directory:

3. Compile
cd $HOME/SKIM/PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPCore/cmt
cd $HOME/SKIM/PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/cmt

Have a look in the file to see which variables will be kept and what the lepton-level filtering is: at least 2 leptons with pT above 5 GeV.

4. Run some tests

Test LOCALLY with: inputNTUP_SMWZFile=/afs/ outputNTUP_MYSKIMNTUPFile=myTestNtup.root

Note that the trf calls NTUPtoNTUPCore/share/ which in turn calls NTUPtoNTUPExample/share/MySkimNTUP_prodJobOFragment


source /afs/

pathena --trf " inputNTUP_SMWZFile=%IN outputNTUP_MYSKIMNTUPFile=%OUT.mySkimNtup.root" --inDS=mc12_8TeV.129477.PowhegPythia8_AU2CT10_WZ_Wm11Z11_mll0p250d0_2LeptonFilter5.merge.NTUP_SMWZ.e1300_s1469_s1470_r3542_r3549_p1328/ --outDS=user.kbachas.mc12_8TeV.129477.PowhegPythia8_AU2CT10_WZ_Wm11Z11_mll0p250d0.e1300_s1469_s1470_r3542_r3549_p1328/ --nFiles=2 --nFilesPerJob=1

Moving/sharing your SFrame setup

If you want to copy your SFrame setup to share it with your colleagues, follow these instructions. This is useful to avoid repeating numerous hacks in AnalysisBase and other inconsistencies with the SVN trunk version. You can simply create a tarball of your testarea/ElectroweakBosons folder and share that file.

1. Copy tarball and decompress:

   tar -xzvf sframe.tgz
   cd ElectroweakBosons

2. Recursively remove .svn folders; optionally also remove .__afs* folders (they are leftovers from the AFS client, sometimes big in size)

   find . -type d -name .svn -prune -exec rm -rf {} +
   find . -type d -name '.__afs*' -prune -exec rm -rf {} +
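As a sanity check, the cleanup can be rehearsed on a scratch tree first (a sketch; the /tmp paths and file names below are made up):

```shell
# Build a throwaway tree containing a fake .svn leftover (illustrative paths).
mkdir -p /tmp/cleanup-demo/pkg/.svn /tmp/cleanup-demo/pkg/src
touch /tmp/cleanup-demo/pkg/.svn/entries /tmp/cleanup-demo/pkg/src/Analysis.cxx

# Same idiom as above: recursively delete every .svn directory.
find /tmp/cleanup-demo -type d -name .svn -prune -exec rm -rf {} +

# The sources survive while the .svn metadata is gone.
ls /tmp/cleanup-demo/pkg
```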

3. Setup environment

   source scripts/

4. Clean-up

        cd RootCore/RootCore
        cd ../..
        source scripts/
        cd RootCore
        cd ..
        make clean
        make distclean

5. Compile everything


How to run Proof-on-Demand (PoD)

To install pod (only first time):

tar -xzvf PoD-3.10-Source.tar.gz
source scripts/  (from SFrame)
cd PoD-3.10-Source
mkdir build
cd build
cmake -C ../BuildSetup.cmake ..
make -j4 install
source scripts/ (from SFrame)

Change in $HOME/.PoD/PoD.cfg:


Create $HOME/.PoD/

echo "Setting user environment for workers ..."
source /afs/
export LD_LIBRARY_PATH=/afs/

To run PoD:

source scripts/
source scripts/ 
pod-server start

In your xml config file, use:


Create a worker cluster on lxbatch using N cores, and check whether the workers are ready to use:

pod-submit -r lsf -q 1nd -n N
pod-info -n

Now submit your job with 'sframe_main ...'.

Avoid compilation on PoD nodes NEW

It is EXTREMELY time-saving to tweak a handful of scripts so as to avoid compilation on each and every worker node.

The procedure is described in the presentation by Max Bellomo, named tutorial.pdf (see attachments)

Tip: After making all the changes described above, under your main SFrame directory, issue:

make clean; make distclean; make; cd AnalysisXY; make clean; make distclean; make;

and then restart the PoD server (if it was running) in order to propagate all the changes.

How to run SFrame on the GRID NEW

Update your grid/ folder so that it uses the existing scripts: one to build the job on the grid, one to run SFrame on the grid, and one to submit the job. Then follow the next steps.

1) Create a configuration file for your grid job, say grid_analysiszz.xml, by merging your standard analysis configuration file with CycleConfig.xml (see the attached file for an example). Remove unnecessary entities and make sure these lines are present in your merged file:

<!ENTITY grid SYSTEM "input_grid.xml">
(This file, input_grid.xml, is created automatically when running on the grid and will include the correct input file for each subjob.) Make sure that all configuration files are in your ElectroweakBosons path and are defined properly in your configuration file. For example, these will work on the grid:
<Item Name="JetAFIICalibconfigFile"  Value=":exp:$EWPATH/RootCore/ApplyJetCalibration/data/CalibrationConfigs/Rel17_JES_AFII.config"/>
<Item Name="JESconfigFile"           Value="JES_2012/Moriond2013/InsituJES2012_20NP_ByCategory.config"/>
while this will NOT work:
<Item Name="PileupDataFileName"      Value="/afs/"/>
Therefore, you should create a folder in your ElectroweakBosons path, say ExtRootFiles, and copy all the needed files there. Make sure you include these files when submitting your job (see below).

2) In both scripts, remove any gcc or ROOT configuration and add at the beginning:

localSetupROOT --skipConfirm

Then add a statement to call your new configuration file:

    elif [ "$3" == "AnalysisZZ" ] ; then
        echo "Doing ZZ Analysis ..."
        sframe_main grid/grid_analysiszz.xml

3) Edit the submission script to submit your job to the grid:

VERSION="yourVersion"   ---> Just a string to be inserted in the output container name
CONTAINER=""   ---> Add "/" if you are running on a dataset container (the usual case)
RELEASE= ---> Doesn't matter, will be configured later
CMTCONFIG=x86_64-slc5-gcc43-opt  ---> Same here...
# "kbachas"
# "mbellomo"
# "vkousk"
# "mschott"
"iliadis"           ---> Comment out all other users and add your Grid name here

"finalTest.txt"   ---> Define the input here: inside the grid directory, place the relevant .txt file with the datasets to run on.
prun --exec "./grid/ %IN $datamctag $SKIM" \
--rootVer=5.34/11 --cmtConfig=x86_64-slc5-gcc43-opt \
--inDS="${samples[$i]}${CONTAINER}" --nFiles 1 \
--outDS="user.${PUSER}.${outsample}" \
--outputs="AnalysisManager.data12_8TeV.ZZ.root" \
--nFilesPerJob 2 --mergeOutput \
--excludeFile=\*/obj,\*/src/\*Dict\*,\*/lib,RootCore/\*/StandAlone/\*,RootCore/\*/obj/\*,RootCore/\*/bin/\*,RootCore/RootCore/scripts/,RootCore/RootCore/lib/\*,RootCore/RootCore/include/\*,RootCore/RootCore/python/\*,AnalysisZmumu,AnalysisWmunu,AnalysisWW,AnalysisWZ,AnalysisHWW,AnalysisWZorHbb,AnalysisWjets,doc,AnalysisZZ/config/mc12_p1328\*,AnalysisZZ/config/data12_p1328\*,patches,PoD\* \
--extFile=RootCore/RootCore.par,RootCore/MuonEfficiencyCorrections/share/\*,RootCore/MuonMomentumCorrections/share/\*,\*.txt,ExtRootFiles,ExtRootFiles/\*,ExtRootFiles/pileup/\*.root,ExtRootFiles/muontrigger/\*.root,ExtRootFiles/electronSFcorrections/\*.root,RootCore/\*/\*/\*.root

For a successful grid job, make sure in the prun command that:

  • the output of your analysis code has the same name as declared in --outputs
  • the --excludeFile option does not contain any files that are needed by your job (but exclude all files not relevant to your analysis so that the submission tarball is lighter)
  • the --extFile option contains all files that are needed by your job (see also the related comments above).
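To get a feel for what --excludeFile does, the same effect can be mimicked with plain tar exclude patterns (an illustration only: prun builds the submission tarball itself, and the paths below are made up):

```shell
# Illustrative layout: object files that should not travel to the grid.
mkdir -p /tmp/prun-demo/RootCore/pkg/obj /tmp/prun-demo/grid
touch /tmp/prun-demo/RootCore/pkg/obj/junk.o /tmp/prun-demo/grid/run.sh

# Pack the area while dropping every directory named 'obj',
# roughly what --excludeFile=\*/obj achieves for the prun tarball.
tar -C /tmp/prun-demo -czf /tmp/prun-demo.tgz --exclude='obj' .

# Inspect the result: grid/run.sh is in, obj/junk.o is out.
tar -tzf /tmp/prun-demo.tgz
```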

4) Submit the job from ElectroweakBosons:

source ./scripts/
source ./grid/

NOTE: Use --nFiles=1 to run a test on a single file; remove it to run on the full sample.

How to install and run POWHEG-BOX/ZZ

Setup Athena:


Download trunk of POWHEG-BOX, optionally remove unwanted sub-packages to reduce the size of the BOX:

svn co --username anonymous --password anonymous svn://
rm -rf Dijet hvq gg_* tt* ST_* HJ* VBF_*  Zj* Z2jet Z_ew-BMNNPV W*

Change directory:


Edit Makefile, find and setup the following variables:


Compile and run:

make pwhg_main
cp -r test test1
cd test1

The input options are given in powheg.input (find an example inside the test directory), and the output is a Les Houches Event file, "pwgevents.lhe". TIP: In powheg.input, set the PDF for the colliding protons to the set of your choice. For CT10 use:

lhans1  10800
lhans2  10800

Also, set the colliding beam energies to 4000 GeV (default is 3500 GeV). Factorization and renormalization scales are also set in this file.
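Putting the above together, the relevant powheg.input fragment would look like this (a sketch; ebeam1/ebeam2 and the scale factors are standard POWHEG-BOX keywords, but check the example powheg.input in the test directory for the exact set your process uses):

```
lhans1   10800     ! PDF set for beam 1 (LHAPDF id 10800 = CT10)
lhans2   10800     ! PDF set for beam 2
ebeam1   4000d0    ! beam 1 energy in GeV (default 3500)
ebeam2   4000d0    ! beam 2 energy in GeV
facscfact  1d0     ! factorization scale factor
renscfact  1d0     ! renormalization scale factor
```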

To shower events with Pythia in Athena, do the following. Prepare the Les Houches file for Athena scripts:

cp pwgevents.lhe
tar -czvf user.inomidis.Powheg_CT10.126938.000001.pwgevents._1.tar.gz

Set up this variable and get the job options (this hack is needed for 17.2.X.Y; the usual scripts don't work):

cp /cvmfs/ .

Run the transformation script: ecmEnergy=8000 runNumber=126938 firstEvent=1 maxEvents=-1 randomSeed=1234 outputEVNTFile=Powheg.pool.root inputGeneratorFile=user.inomidis.Powheg_CT10.126938.000001.pwgevents._1.tar.gz postExec='ServiceMgr.MessageSvc.enableSuppression=False'

The output is Powheg.pool.root which contains the McEventCollection GEN_EVENT (same as GEN_AOD). Analyze with Athena classes, or convert to D3PD and analyze with ROOT.

How to install MCFM

MCFM is a program designed to calculate the cross sections of various processes at hadron-hadron colliders. Its documentation can be found here. MCFM depends on two external programs: CERNLIB and LHAPDF.

The following guide uses $HOME/.local for the installation of the necessary libraries and of the PDF sets. Add the following lines to your shell configuration:

export PATH="$PATH:$HOME/.local/bin"


Restart your shell and create the directory $HOME/.local .

How to install CernLib

CERNLIB is a (deprecated) collection of libraries and modules offered by CERN's central computing services.

In order to install CernLib, change directory to $HOME/.local and retrieve the necessary files:

Extracting all three of them will create a directory named 2006b. After extraction, the gzipped files can be safely deleted.
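The extract-then-delete step can be scripted; a minimal sketch (the loop is generic, runs in $HOME/.local where the three archives were downloaded, and deletes an archive only if its extraction succeeded):

```shell
# Unpack every tarball in $HOME/.local, deleting each archive only
# after a successful extraction (&& guards against deleting on failure).
mkdir -p "$HOME/.local" && cd "$HOME/.local"
for f in *.tar.gz; do
    [ -e "$f" ] || continue   # glob did not match: no archives present
    tar -xzf "$f" && rm -f "$f"
done
```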

How to install LHAPDF

LHAPDF provides a unified and easy-to-use interface to modern PDF sets. It is designed to work not only with individual PDF sets but also with the more recent multiple "error" sets.

Its installation will take place in the same directory.

If the installation occurs without errors, LHAPDF is ready to be used. You can use the command lhapdf-getdata to download PDF sets. Either save them under a folder PDFsets in $HOME/.local and then create a symbolic link to MCFM/Bin/PDFsets, or download them directly into MCFM/Bin/PDFsets:
  • mkdir $HOME/.local/PDFsets && cd !$
  • lhapdf-getdata CT10
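The symlink variant of the first option can be sketched like this (illustrative /tmp paths stand in for $HOME/.local and your MCFM checkout):

```shell
# One shared PDFsets folder, linked into MCFM/Bin (paths are illustrative).
PREFIX=/tmp/pdfsets-demo
mkdir -p "$PREFIX/.local/PDFsets" "$PREFIX/MCFM/Bin"
ln -sfn "$PREFIX/.local/PDFsets" "$PREFIX/MCFM/Bin/PDFsets"

# MCFM now resolves Bin/PDFsets to the shared folder:
readlink "$PREFIX/MCFM/Bin/PDFsets"
```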

Install MCFM

It is now time to install MCFM. First, download and extract the source:

Now run ./Install. Once it has finished, open the makefile and edit CERNLIB and PDFLIB as follows:

CERNLIB = /afs/




Save & exit the file and compile:

  • make

EWUnfolding NEW

This section describes how to get the EWUnfolding code from SVN, compile it, and use it. It has been tested on lxplus5, but not on lxplus6.

Check out code:

svn co svn+ssh:// EWUnfolding/
(branch-00-00-01 allows histogram input instead of branch input in the Unfolding code, drastically reducing the size of the SFrame output ROOT file)

Compile it:

In the line below, gcc34 IS NOT A TYPO! All the symlinks under /afs/ point to gcc34!

# gcc 4.3.5
source /afs/

# ROOT 5.34.03
cd /afs/
source bin/
cd ~

cd EWUnfolding/branches/EWUnfolding-00-00-01-branch/Code/

# Compile RooUnfold:
cd RooUnfold

# Set up RootCore:
cd external/RootCore

# Compile BootstrapGenerator using RootCore:
cd ..
source RootCore/scripts/
RootCore/scripts/  #do not source
RootCore/scripts/        #do not source

# Compile EWUnfold:
cd ..

source external/RootCore/scripts/

(it would be handy to copy-paste all of the above and put it in a script, e.g. "")

To run, simply issue:

./EWUnfoldBase config/unfolding_steering_file.xml

Documentation: in true ATLAS style, nearly none! There are two talks, one by Matthias Schott:

and one by Adrian Lewis:

Tips'n'Tricks NEW

Sane lxplus usage

We're supposed to submit heavy CPU/memory jobs to the lxbatch system and NOT run them on lxplus. However, sometimes it's virtually impossible to avoid running a job that brings an lxplus node to its knees. To avoid warnings from the IT department, use the nice command right before your own command. For instance, for an SFrame job:

nice -n 10 sframe_main config/your_steering_file.xml

This changes your "niceness" from 0 (the default) to 10. Niceness ranges from -20 (most favorable) to 19 (least favorable) and acts as an "advisor" to the scheduler running on lxplus. Only the root user can lower niceness.
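With GNU coreutils, nice run without a command prints the current niceness, which gives a quick way to confirm the effect:

```shell
# Current niceness (typically 0 in a fresh shell):
nice

# The same query launched under 'nice -n 10': the child inherits
# a niceness 10 higher than the parent's.
nice -n 10 nice
```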

PoD - How many workers?

More workers mean less running time but also more merging time, so a balance is needed: what you gain in running time you lose in merging. Personal experience: 60-80 workers are more than enough!

Analysis on lepton ntuples


Setup ZZ->4l Analysis code

For SM ZZ analysis, see

Contact: Vasiliki Kouskoura, Ioannis Nomidis.

For Higgs/inclusive four-leptons analysis, see

Contact: Dinos Bachas, Ioannis Nomidis.

Matrix-element weights with MadWeight NEW

Running the MadWeight tutorial

Useful links

Topic attachments
  • HowTo_MCFM.txt (4.8 K, 2013-08-13, VasilikiKouskoura): HowTo for MCFM (details for ZZ)
  • grid_analysiszjpsi.xml (4.4 K, 2013-10-15, DimitrisIliadis)
  • grid_analysiszz.xml (9.3 K, 2014-01-29, IoannisNomidis)
  • Unix shell script (8.1 K, 2013-10-15, DimitrisIliadis)
  • Text file (23.3 K, 2013-03-21, KonstantinosBachas)
  • tutorial.pdf (457.7 K, 2013-10-14, DimitrisIliadis): How to avoid PoD compilation (slides 28-33)
Topic revision: r23 - 2014-10-06 - NicolaOrlando