Instructions for git installations and patches

Taus :

Electron MVA ID :

Another Link for Electron MVA ID :

important path : /cmshome/khurana/CMSSW_6_1_2/src/HiggsAnalysis/CombinedLimit/FRCards/

Electron to tau fake rate measurement using the 2012 dataset


The goal is to measure the electron -> hadronic tau fake rate using the Single Electron dataset, compare it with the MC truth information, and perform an MC closure test.

Procedure :

We use the Tag & Probe method, where a good electron is the tag and a jet passing the loose tau selection is the probe. We measure the probability of the probe (a loosely selected jet) passing the anti-electron discriminators.
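As a sanity check on the definition, the fake rate is just the passing fraction of probes; a minimal sketch with made-up counts (the numbers below are purely illustrative, not a measurement):

```shell
# Fake rate = N(probes passing the anti-electron discriminator) / N(all probes).
# The counts here are hypothetical, for illustration only.
n_pass=120
n_fail=9880
fake_rate=$(awk -v p="$n_pass" -v f="$n_fail" 'BEGIN { printf "%.4f", p / (p + f) }')
echo "fake rate = $fake_rate"
```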

Dataset used :


JSON used :


Instructions to Run the code

The analysis to measure e->tau fake rate is divided into three steps :

1. Running EDAnalyzer on datasets and saving necessary information after applying some preselection.

2. Running postanalyzer : to apply cuts and save histograms needed for fitting (for analytic, template & combine)

3. Fitting: this is further divided into three approaches:

a. combine

b. template

c. analytic

Each of these steps is described in detail in the following sections:

EDAnalyzer :

The following packages need to be checked out in your area for this Tag & Probe code to work:

cvs co -r V00-03-13      CommonTools/ParticleFlow
cvs co -r V00-04-00      CondFormats/EgammaObjects                        
cvs co -r V06-05-06-03   DataFormats/PatCandidates                        
cvs co -r V00-00-30-00 -d   EGamma/EGammaAnalysisTools       UserCode/EGamma/EGammaAnalysisTools                 
cvs co -r HCP2012_V03-02 EgammaAnalysis/ElectronTools                     
cvs co -r edm-30March2012 HiggsAnalysis/HiggsToWW2Leptons                  
cvs co -r V08-09-50      PhysicsTools/PatAlgos                            
cvs co -r V03-02-00      PhysicsTools/TagAndProbe                         
cvs co -r V09-00-01      RecoEgamma/EgammaTools                           
cvs co -r V15-01-11      RecoParticleFlow/PFProducer                      
cvs co -r V01-04-10      RecoTauTag/Configuration                         
cvs co -r V01-04-23      RecoTauTag/RecoTau                               
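After checking out the packages, the usual CMSSW build step applies (a sketch, assuming a standard CMSSW_6_1_2 area as in the path at the top of this page):

```shell
# Build the checked-out packages; run from the src directory of your CMSSW area.
cd CMSSW_6_1_2/src
cmsenv
scram b -j 4
```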

The code that makes the tags and probes (with some preselection applied) and saves a tree with the relevant branches is located in my public area:

cp -r /afs/ khurana

Once you have this, look in EToTauFakeRate/test/ for the configuration that will save the output root files.

A crab.cfg and a multicrab configuration are provided to submit CRAB jobs on the relevant samples, together with some helper shell scripts to automate CRAB submission, resubmission and retrieval of the tasks. Once all jobs are done, make sure you do not have any root file more than once, using the script provided.
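The duplicate check can be sketched as follows; the filename pattern (the usual CRAB2 convention name_&lt;job&gt;_&lt;retry&gt;_&lt;hash&gt;.root) and the function name are assumptions, not the actual script:

```shell
# Print job numbers that appear in more than one output file.
# Assumes CRAB2-style names like tree_12_1_aBc.root (job 12, retry 1).
find_duplicate_jobs() {
    sed -n 's/.*_\([0-9][0-9]*\)_[0-9][0-9]*_[A-Za-z0-9][A-Za-z0-9]*\.root$/\1/p' \
        | sort -n | uniq -d
}
ls *.root 2>/dev/null | find_duplicate_jobs
```

Any job number printed means that job's output was retrieved more than once and one copy should be removed.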

Once this step is done you are ready for step two, i.e. applying the selection and saving the histograms.


The code to select events for each sample is located in

cp -r /afs/ PostAnalyzer

fitter_tree.C is the main file where all cuts are applied.

Uncomment or comment the following line to run the code for the closure test or otherwise, respectively:
  if(!mcTrue) continue;

This will also save the up- and down-shifted histograms for the electron energy scale.

You can use the shell script named "compile" to compile the code via: source compile fitter_tree.C. This produces run.exe, which runs the code.

For an individual sample, run the code like:

./run.exe pathofinputfile  pathofoutputfile
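To run over several samples in one go, a simple wrapper loop can help. The sample names and directory layout below are placeholders; the echo prints each command so you can inspect it first, and can be dropped to actually execute:

```shell
# Loop run.exe over a list of samples; names and paths are hypothetical examples.
run_all_samples() {
    for sample in DYJetsToLL TTJets data_SingleElectron; do
        echo ./run.exe "input/${sample}.root" "output/${sample}_hists.root"
    done
}
run_all_samples
```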

Once you have run the code for all data and MC you are ready to do the next step i.e. fitting

Fitting using combine

To measure the fake rate we use the postfit normalisations of the pass and fail regions. To get the postfit normalisations we use combine's maximum-likelihood method to fit the passing & failing probes simultaneously. This consists of preparing rootfiles and datacards that combine can use as input to fit the histograms. Once the datacards & rootfiles are prepared with the proper normalisation, we combine the datacards for the pass & fail regions and then run the combine tool.
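The card-combination and fit steps use the standard HiggsAnalysis/CombinedLimit command-line tools; a minimal sketch, where the datacard file names are hypothetical:

```shell
# Combine the pass and fail datacards into one and run a maximum-likelihood fit.
# File names are placeholders; mlfit.root then holds the postfit normalisations.
combineCards.py pass=datacard_pass.txt fail=datacard_fail.txt > datacard_comb.txt
combine -M MaxLikelihoodFit datacard_comb.txt
```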

We repeat this process first for data and then for pseudo-data (i.e. taking the sum of all backgrounds as the data and performing the fit) for the closure test.

The scripts that do these jobs will take care of most of the work: preparing the cards and rootfiles, merging the cards, running the maximum-likelihood fit, and extracting the numbers from mlfit.root, both for data and for the closure test. The only thing to change is the path to the FRCards (see the path at the top of this page).

MC Sample used with cross section used in analysis

Pile up reweighting

Selection Criteria

Tag Selection

Probe Selection

Anti electron discriminators used

Signal Model

Background Model

script used for simultaneous fit

Fit results

Fake rate values for various anti-electron discriminator

Results & Summary

Other links

CMS Web Based Monitoring : for trigger help

Config Browser : For trigger

EGamma HLT

EGamma ID

EGamma Cut Based ID

Topic revision: r6 - 2014-01-13 - RamanKhurana