Detector related effects:

Analysis Stuff

There are two types of corrections available: pT leakage corrections and pileup corrections. It's recommended to apply both types.

For the pileup corrections, there are currently two correction methods available:

- average correction based on the number of primary vertices (nPV) in the event
- event-by-event correction based on the ambient energy density (ED)

Apply only one of these, not both. The nPV correction is used by most electron analyses, while the ED correction has already been used in most photon analyses for some time. So I'd generally recommend the nPV correction for electrons and the ED correction for photons.

To apply the leakage and nPV corrections (electrons), you'd use the method: GetPtNPVCorrectedIsolation()

To apply the leakage and ED corrections (photons), you'd use the method: GetPtEDCorrectedIsolation()


Tag meaning:

MC:
  • MC10a = r2215_r2260_p545/
  • MC10b = r2302_r2300_p574/
DATA: previous processings are still available under tags p503 and p580; they will be obsoleted once all 2011 data are available with p591.

A brief summary of reco- and digi- tags potentially relevant for analysis using MC10 samples is:

tag     pile-up configuration                                   default sample fraction
_r1652  no pile-up                                              10% of total sample statistics
_r1659  in-time pile-up, $\langle n_{MB} \rangle = 2$, 900 ns   20% of total sample statistics
_r1831  bunch-train pile-up configuration                       100% of total sample statistics
The sample with the bunch-train pile-up configuration (_r1831) is presumably the most interesting for analysis, and the D3PDs are produced with this tag by default.

The first round of MC10 AOD merging has the tag _r1700. In this tag, high-pT pile-up events are present at potentially observable rates, as documented on the TopRecoBugs wiki pages.

The currently running round of MC10 AOD merging has the tag _r2040. In this tag, the problematic high-pT pile-up events mentioned above are filtered out.

There are two options for avoiding events with the problematic high-pT pile-up overlay in your analysis:

  • use tag _r1700 + Wouter's tool to remove the problematic events,
  • upgrade to tag _r2040 where problematic events were filtered out during central production already.
The official D3PDs were produced using _r1700 and will be updated to _r2040 in the next production round at the end of February.

For the details and potential updates please consult the AtlasProduction Group TWiki pages.

Variable meanings:

  • EtconeXX : the standard variable, with no corrections applied
  • EtconeXX_pt_corrected : the leakage correction applied
  • EtconeXX_ED_corrected : the ambient-energy-density pile-up correction applied
  • EtconeXX_corrected : both the leakage and ambient-energy-density pile-up corrections applied
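Schematically, the fully corrected variable is just the raw one minus the two corrections; a toy illustration with made-up numbers (these are not real calibration constants):

```shell
# Toy numbers only: EtconeXX_corrected = EtconeXX - leakage corr. - ED pile-up corr.
raw=5.0; leak=0.7; ed=1.3
awk -v raw="$raw" -v leak="$leak" -v ed="$ed" \
    'BEGIN { printf "corrected = %.1f\n", raw - leak - ed }'   # prints: corrected = 3.0
```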

physics effects:

Samples (find samples here)

D3PD content


Analysis project:


  • 2011:


2011 5fb-1 analysis

Binning stuff



Data Sample

  • user.ftian.SKIMED2.data11_7TeV.periodD.NoGRL.physics_Egamma.NTUP_SMWZ.p605/
  • user.ftian.SKIMED2.data11_7TeV.periodE.NoGRL.physics_Egamma.NTUP_SMWZ.p605/
  • user.ftian.SKIMED2.data11_7TeV.periodF.NoGRL.physics_Egamma.NTUP_SMWZ.p605/
  • user.ftian.SKIMED2.data11_7TeV.periodG.NoGRL.physics_Egamma.NTUP_SMWZ.p605/
  • user.ftian.SKIMED2.data11_7TeV.periodH.NoGRL.physics_Egamma.NTUP_SMWZ.p605/


Lumi Calculation:


Energy Scale correction

For 2011 data: Energy scale corrections derived from 2010 data have been applied at cell level in 16.6 reconstruction release. Until further notice, NO correction should be applied to 2011 data by default.


Generators for wgamma

  • Madgraph:
    • Use MG_ME_V4.5.0
    • Madgraph provides LO cross section
    • Madgraph does NOT have fragmentation photon
  • Baur Wgamma NLO:
    • Baur Wgamma NLO provides LO and NLO cross sections
    • Baur Wgamma NLO has fragmentation photon
    • Baur Wgamma NLO does NOT have FSR photon
  • MCFM:
    • Use MCFM 6.0
    • MCFM provides LO and NLO cross sections
    • MCFM has fragmentation photon

MC generators for ATLAS:


MC Sample

  • BKG list on xrootd (old):
    • wtaunu: mc10_7TeV.107700.AlpgenJimmyWtaunuNp0_pt20.merge.NTUP_SMWZ.e600_s933_s946_r2302_r2300_p574/
    • ttbar: mc10_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_SMWZ.e598_s933_s946_r2302_r2300_p574/
    • zee: mc10_7TeV.107650.AlpgenJimmyZeeNp0_pt20.merge.NTUP_SMWZ.e737_s933_s946_r2302_r2300_p574/
    • w+jet: mc10_7TeV.107681.AlpgenJimmyWenuNp1_pt20.merge.NTUP_SMWZ.e600_s933_s946_r2302_r2300_p574/


Energy Smearing

related groups' twiki

My notes:



ROOT related

root > tree.Scan(....); > scan.log

(At the interactive ROOT prompt, appending "; > file" after a command redirects its printout to that file.)



E.g. to check the size, usage, etc. of the filesystem holding the current directory: df -h .

  • Sum luminosities (e.g. lumifile.txt has one number per line; what is their sum?): awk '{s+=$1} END {print s}' lumifile.txt
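As a quick sanity check, the awk one-liner above can be tried on a throwaway file (the filename and values here are just illustrative):

```shell
# Create a small test file with one number per line (illustrative values).
printf '1.1\n2.2\n3.3\n' > lumi.txt

# Sum the first column of every line and print the total at the end.
awk '{s+=$1} END {print s}' lumi.txt   # prints 6.6
```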

PyRoot related

  • set up PyRoot at lxplus5:
    • source /afs/cern.ch/sw/lcg/contrib/gcc/4.3/x86_64-slc5-gcc43-opt/setup.sh
    • export ROOTSYS=/afs/cern.ch/sw/lcg/app/releases/ROOT/5.26.00/x86_64-slc5-gcc43-opt/root
    • export PATH=/afs/cern.ch/sw/lcg/external/Python/2.5.4p2/x86_64-slc5-gcc43-opt/bin:$ROOTSYS/bin:$PATH
    • export LD_LIBRARY_PATH=$ROOTSYS/lib:/afs/cern.ch/sw/lcg/external/Python/2.5.4p2/x86_64-slc5-gcc43-opt/lib:$LD_LIBRARY_PATH


  • check out:
    • export SVNROOT=svn+ssh://svn.cern.ch/reps/atlasgrp (or atlasoff for DataQuality stuff)
    • svn co $SVNROOT/Institutes/Goettingen/KLFitter/trunk/examples/top_ljets top

You can change the second line according to your package path.

  • import:
    • export SVNAP=svn+ssh://svn.cern.ch/reps/apenson
    • svn import . $SVNAP/Analysis/myAnalysis -m "initial import"
  • set new tag
    • export SVNAP=svn+ssh://svn.cern.ch/reps/apenson
    • svn cp $SVNAP/Analysis/myAnalysis/trunk -r 1805 $SVNAP/Analysis/myAnalysis/tags/myAnalysis-00-02 -m "message"


  • easy way to check files on xrootd: ll /xrootdfs/xrootd/
  • Run jobs:
    • cd arc_d3pd
    • arcond -allyes
  • arc_nevis_ls : show dirs and files. eg: arc_nevis_ls /data/xrootd/
  • copy files to xrootd:
    • make a directory first on xrootd: /data/users/common/xrootmkdir.sh /data/xrootd/...
    • then add data to that directory : /data/users/common/xrootadd.sh your-files-directory /data/xrootd/...
  • delete files on xrootd:
    • open a clean shell
    • setupATLAS
    • localSetupGcc --gccVersion=gcc432_x86_64_slc5
    • localSetupROOT --rootVersion=5.26.00-slc5-gcc4.3
    • export LD_PRELOAD=$ROOTSYS/lib/libXrdPosixPreload.so
    • source /data/users/common/setupxrdposix.sh
    • rm root://xenia.nevis.columbia.edu:1094//data/xrootd/$datasetName/$fileName
  • condor_q : check jobs
  • condor_status -submitters: check jobs
  • arc_add : merge root files from Analysis.root to Analysis_all.root
  • arc_clean: clean unnecessary submission scripts
  • condor_rm ID : cancel jobs while they are running; ID is the job number. To remove all of your jobs, use the same command with your user name in place of the job ID.
  • To mkdir,add/remove files from condor: use the scripts under /data/users/common
  • specify the input data in arcond.conf; only one input line is accepted, so to run over multiple datasets (PeriodA, PeriodB, ...), create multiple folders, each containing an arcond.conf with different input data
  • Remember to use "Analysis.root" as your output file name, so that arc_add can be used to combine files.
  • access files on xrootd directly by adding "root://xenia.nevis.columbia.edu:1094/" before /data/xrootd/...:
e.g. root root://xenia.nevis.columbia.edu:1094//data/xrootd/mc/NTUP_SMWZ/mc10b/mc10_7TeV.107681.AlpgenJimmyWenuNp1_pt20.merge.NTUP_SMWZ.e600_s933_s946_r2302_r2300_p574/NTUP.00065.root
  • Grid FTP (new way of downloading data to xrootd):
xenia: NEVIS_GRIDFTP is now in the Tiers of ATLAS. This means you can get a dataset by going to http://panda.cern.ch/server/pandamon/query?mode=ddm_req and selecting NEVIS_GRIDFTP in the drop-down menu, or on the command line with: dq2-register-subscription-container YOURDATASET/ NEVIS_GRIDFTP. The dataset goes straight to the /atlas area in xrootd (there are now two xrootd areas, both visible at /xrootdfs/xrootd (old) and /xrootdfs/atlas).
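The per-period folder trick mentioned above can be scripted; a minimal sketch, assuming arcond.conf takes its input dataset on a single line (the `input_data` key and the paths below are hypothetical placeholders, not necessarily the real arcond.conf keyword):

```shell
# One input line per arcond.conf, so make one folder per period.
for period in periodD periodE periodF periodG periodH; do
  mkdir -p "run_${period}"
  # Write a per-period config; 'input_data' is a placeholder key here,
  # not necessarily the real arcond.conf keyword.
  cat > "run_${period}/arcond.conf" <<EOF
input_data = /data/xrootd/data11/${period}
EOF
done
ls run_period*/arcond.conf
```

Then run arcond separately from each folder.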

My Analysis Package

  • Rules of the package
    • if you want to apply A && (B || C), then do A && B || C
    • if you want to apply (A && B) || C, then do C || A && B, but NOT A && B || C
  • What to do when the data files are updated (or similar changes)
    • use TChain::MakeClass to produce the basic files such as egamma.h & egamma.cxx, replace the existing ones, and add the following two lines to egamma.h:
      • #include <vector>
      • using namespace std;
    • update goodrunlist file which specifies the .xml files
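The two-line edit to egamma.h can be automated; a sketch using GNU sed, assuming a freshly generated egamma.h in the current directory:

```shell
# Insert the include and using-declaration after the first line of egamma.h.
# (Assumes egamma.h exists here, freshly generated by MakeClass.)
sed -i '1a\
#include <vector>\
using namespace std;' egamma.h

head -n 3 egamma.h
```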

something you might need to know

Publication Templates

Egamma Meeting

  • 04/13/2011: https://indico.cern.ch/getFile.py/access?contribId=0&resId=0&materialId=slides&confId=116201
    • Notes:
      • No OQ maps (unless new problems appear). The quality of the egamma object is checked through the Object Quality (OQ) Flag
      • Do not apply the 2010 EnergyRescalerTool corrections to 2011 data (average corrections applied already at cell level in 16.6).
      • Use ‘nominal’ isEM menu for both electrons and photons.
        • i.e. loose, medium and tight bitmasks
        • No need anymore for *_WithTrackMatch for electrons nor special tight tune for photons
        • This is true until further notice (see ongoing electron IsEM reoptimisation next slide)

Plot style of ATLAS

Note Pad

  • the "LAr cleaning" is calculated with the egamma cluster, whereas the jet cleaning is calculated with a topo cluster


Summer schools

Interesting talks:

Thesis related:

Important talks

-- FengTian - 22-Nov-2010

Topic revision: r128 - 2012-12-01 - FengTian