Tau Tagging on a PFJet

This page documents the hadronic tau-jet reconstruction and identification starting from a PFJet.

Introduction

The τ lepton, with a mass of 1.777 GeV, is the only lepton heavy enough to decay into hadrons. As depicted in the pie chart, in about one third of the cases τ leptons decay leptonically to a muon (τμ) or an electron (τe) with two neutrinos, and are reconstructed and identified with the usual techniques for muons and electrons. In the remaining cases, τ leptons decay hadronically, to a combination of charged and neutral mesons and a τ neutrino. Hadronically decaying τ leptons, denoted τh, are reconstructed and identified with the hadrons-plus-strips (HPS) algorithm, which was developed for use in LHC Run 1. The key challenge for this algorithm is to distinguish genuine τh from quark and gluon jets, which are copiously produced in QCD multijet processes and can be misidentified as τh. The main handle for reducing these jet→τh misidentification backgrounds is the fact that the particles produced in τh decays have lower multiplicity, deposit their energy in a narrower region than a quark or gluon jet, and are typically isolated with respect to other particles in the event. In some physics analyses, the misidentification of electrons or muons as τh candidates can constitute a sizeable background as well. The HPS algorithm therefore provides various discriminators, such as isolation and discriminators against electrons and muons, to identify genuine hadronically decaying taus.

TauDecayPiChart.png

Tau Reconstruction and Identification Software

The hadronic tau-jet candidates are created from ParticleFlow jets using the hadrons-plus-strips (HPS) algorithm. The tau reconstruction code can be found in:

The details of the HPS algorithm can be found at:

Creation

To rerun the tau sequences, load the tau steering sequence in your cfg file: process.load("RecoTauTag.Configuration.RecoPFTauTag_cff")

On RECO level, reco::PFTau objects are used to store all hadronic tau-jet candidates. The result of each discriminator is stored in a reco::PFTauDiscriminator.

The initial PFTau collection first contains one tau candidate for each reconstructed PFJet (clustered with the anti-kt algorithm with a distance parameter of 0.5). The jet direction is set to the direction of the leading (highest-pT) charged hadron in the jet, which must be within ∆R = 0.1 of the jet axis. At this stage the signal and isolation cones as well as the decay mode are defined. Subsequently the HPS algorithm creates PFTauDiscriminators which can be used to select a collection of PFTaus.
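The leading-charged-hadron matching described above can be sketched in plain Python (this only illustrates the geometry; in CMSSW it is done internally by the tau reconstruction, and the particle dictionaries used here are hypothetical):

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance dR = sqrt(dEta^2 + dPhi^2), with dPhi wrapped to [-pi, pi]."""
    dphi = (phi1 - phi2 + math.pi) % (2.0 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)

def leading_charged_hadron(jet_eta, jet_phi, charged_hadrons, max_dr=0.1):
    """Return the highest-pT charged hadron within max_dr of the jet axis, or None."""
    in_cone = [h for h in charged_hadrons
               if delta_r(jet_eta, jet_phi, h["eta"], h["phi"]) < max_dr]
    return max(in_cone, key=lambda h: h["pt"]) if in_cone else None
```

If no charged hadron lies within ∆R = 0.1 of the jet axis, the jet cannot seed a tau candidate in the way described above.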

hadrTauJetIsol.png

Usage

In your event loop (e.g. the analyze(const edm::Event& iEvent, const edm::EventSetup&) method of your analyzer) do:

// The tokens should be data members of your analyzer, initialized in its
// constructor via the consumes interface, e.g.:
//   pfTauToken_(consumes<reco::PFTauCollection>(iConfig.getParameter<edm::InputTag>("pfTauSrc"))),
//   discriminatorToken_(consumes<reco::PFTauDiscriminator>(iConfig.getParameter<edm::InputTag>("discriminatorSrc")))

// read the PFTau collection
    edm::Handle<reco::PFTauCollection> pfTaus;
    iEvent.getByToken(pfTauToken_, pfTaus);

// read one PFTauDiscriminator
    edm::Handle<reco::PFTauDiscriminator> discriminator;
    iEvent.getByToken(discriminatorToken_, discriminator);

// loop over tau candidates
    for ( unsigned iTau = 0; iTau < pfTaus->size(); ++iTau ) {
        reco::PFTauRef tauCandidate(pfTaus, iTau);
        // check if the tau candidate has passed the discriminator
        if ( (*discriminator)[tauCandidate] > 0.5 ) {
            // do something with your candidate
        }
    }

Discriminators

The PFTauDiscriminators are used to store the results of the various tau tagging algorithms and to select jets that likely originate from a hadronic tau decay. By definition they are real numbers between 0 and 1, where higher values indicate a more positive outcome of the discriminator. Note that most discriminators are in fact binary, taking only the values 0 (did not pass) and 1 (passed).

The names of the PFTauDiscriminator collections follow the <algorithm-prefix> + <discriminator-name> convention (e.g. the AgainstElectron collection can be accessed via hpsPFTauDiscriminationAgainstElectron).
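The naming convention amounts to a simple concatenation, sketched below with a hypothetical helper (only the prefix and the convention itself are taken from the text; the helper is not part of CMSSW):

```python
# HPS algorithm prefix used by the discriminator collections on this page.
HPS_PREFIX = "hpsPFTauDiscrimination"

def discriminator_collection(name, prefix=HPS_PREFIX):
    """Build the full PFTauDiscriminator collection name from the convention
    <algorithm-prefix> + <discriminator-name>."""
    return prefix + name
```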

Legacy Tau ID (Run I)

The <algorithm-prefix> is hpsPFTauDiscrimination

Name binary? Description
ByLooseElectronRejection yes electron pion MVA discriminator < 0.6
ByMediumElectronRejection yes electron pion MVA discriminator < -0.1 and not 1.4442 < η < 1.566 (both positive & negative eta values)
ByTightElectronRejection yes electron pion MVA discriminator < -0.1 and not 1.4442 < η < 1.566 (both positive & negative eta values) and Brem pattern cuts (see AN-10-387)
ByMVA3Loose/Medium/Tight/VTightElectronRejection yes anti-electron MVA discriminator with improved training (see talk)
ByMVA3VTightElectronRejection yes anti-electron MVA discriminator with improved training, same efficiency as the "HCP 2012 working point" (see talk)
ByLooseMuonRejection yes Tau Lead Track not matched to chamber hits
ByMediumMuonRejection yes Tau Lead Track not matched to global/tracker muon
ByTightMuonRejection yes Tau Lead Track not matched to global/tracker muon and large enough energy deposit in ECAL + HCAL
ByLooseMuonRejection2 yes Same as AgainstMuonLoose
ByMediumMuonRejection2 yes Loose2 && no DT, CSC or RPC Hits in last 2 Stations
ByTightMuonRejection2 yes Medium2 && large enough energy deposit in ECAL + HCAL in 1 prong + 0 strip decay mode (Σ(ECAL+HCAL) > 0.2 * pT)
ByDecayModeFinding yes You will always want to use this (see AN-10-82 )
ByVLooseIsolation yes isolation cone of 0.3 , no PF Charged Candidates with pT > 1.5 GeV/c and no PF Gamma candidates with ET > 2.0 GeV
ByVLooseCombinedIsolationDBSumPtCorr yes isolation cone of 0.3 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 3 GeV
ByLooseCombinedIsolationDBSumPtCorr yes isolation cone of 0.5 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 2 GeV
ByMediumCombinedIsolationDBSumPtCorr yes isolation cone of 0.5 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 1 GeV
ByTightCombinedIsolationDBSumPtCorr yes isolation cone of 0.5 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 0.8 GeV
ByLooseCombinedIsolationDBSumPtCorr3Hits yes same as ByLooseCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on track of isolation candidates
ByMediumCombinedIsolationDBSumPtCorr3Hits yes same as ByMediumCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on track of isolation candidates
ByTightCombinedIsolationDBSumPtCorr3Hits yes same as ByTightCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on track of isolation candidates
ByLooseIsolationMVA yes BDT based selection using isolation in rings around tau direction and shower shape variables
ByMediumIsolationMVA yes BDT based selection using isolation in rings around tau direction and shower shape variables
ByTightIsolationMVA yes BDT based selection using isolation in rings around tau direction and shower shape variables
ByIsolationMVAraw no output of BDT based selection using isolation in rings around tau direction and shower shape variables
ByLooseIsolationMVA2 yes same as ByLooseIsolationMVA with new training and improved performance
ByMediumIsolationMVA2 yes same as ByMediumIsolationMVA with new training and improved performance
ByTightIsolationMVA2 yes same as ByTightIsolationMVA with new training and improved performance
ByIsolationMVA2raw no output of "MVA2" BDT discriminator

Isolation discriminators usage

Loosening the quality cuts on the tracks of isolation candidates means that more of them can enter the isolation pT sum. This results in a slightly decreased tau identification efficiency but a considerably decreased jet fake rate. More details in this talk.

The combined isolation is recommended for all taus, while the MVA-based isolation can provide better performance for low-pT (< 100 GeV) taus. Since RecoTauTag/RecoTau V01-04-23 (V01-04-23-4XX-00 for 4XX analyses) a new training for the MVA isolation is available; the corresponding discriminators have "IsolationMVA2" in their name.
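The "Delta Beta corrected sum pT" used by the CombinedIsolationDBSumPtCorr discriminators above can be sketched as follows. The functional form (charged sum plus a pileup-corrected photon sum) is the standard Δβ correction; the factor of 0.46 used here is an illustrative assumption, as the exact value is set inside the HPS implementation:

```python
def db_corrected_isolation(sum_charged_pv, sum_gamma, sum_charged_pu, db_factor=0.46):
    """Delta-beta corrected isolation pT sum (GeV).

    sum_charged_pv : pT sum of charged candidates from the primary vertex
    sum_gamma      : pT sum of photon candidates in the isolation cone
    sum_charged_pu : pT sum of charged candidates from pileup vertices,
                     used to estimate (and subtract) the neutral pileup
    """
    return sum_charged_pv + max(0.0, sum_gamma - db_factor * sum_charged_pu)
```

A candidate then passes e.g. the Loose combined-isolation working point if this sum is below 2 GeV.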

"MVA3" anti-electron discriminators

The new MVA training provides working points with a decreased electron fake rate while keeping the same efficiency as the previous training. All MVA3 discriminators veto tau candidates in the crack region between barrel and endcap, so they are NOT RECOMMENDED if you are using tau ID as a TAU VETO in your analysis.

"AgainstMuon2" discriminators

A drop in the efficiency of the anti-muon discriminator has been observed at high pT. The fix is available in the "Muon2" discriminators. More details in talk 1 and talk 2.

The decay mode of a tau candidate can be inferred from the number of charged hadrons (A) and photon strips (B) in its signal cone:

int A = tau.signalPFChargedHadrCands().size();
int B = tau.signalPFGammaCands().size();

Mode A B
one prong 1 0
one prong + pi0 1 >0
three prong 3 0
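The table above can be written as a small classifier (plain Python for illustration; in CMSSW the decay mode is stored on the tau itself via reco::PFTau::decayMode):

```python
def decay_mode(n_charged, n_strips):
    """Map the signal-cone counts A (charged hadrons) and B (photon strips)
    to the decay modes listed in the table above."""
    if n_charged == 1 and n_strips == 0:
        return "one prong"
    if n_charged == 1 and n_strips > 0:
        return "one prong + pi0"
    if n_charged == 3 and n_strips == 0:
        return "three prong"
    return "other"
```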

Tau ID 2014 (preparation for Run II)

The <algorithm-prefix> is hpsPFTauDiscrimination

Name binary? Description
ByLooseElectronRejection yes electron pion MVA discriminator < 0.6
ByMediumElectronRejection yes electron pion MVA discriminator < -0.1 and not 1.4442 < η < 1.566 (both positive & negative eta values)
ByTightElectronRejection yes electron pion MVA discriminator < -0.1 and not 1.4442 < η < 1.566 (both positive & negative eta values) and Brem pattern cuts (see AN-10-387)
ByMVA5(Loose/Medium/Tight/VTight)ElectronRejection yes anti-electron MVA discriminator with new training
ByLooseMuonRejection yes Tau Lead Track not matched to chamber hits
ByMediumMuonRejection yes Tau Lead Track not matched to global/tracker muon
ByTightMuonRejection yes Tau Lead Track not matched to global/tracker muon and energy deposit in ECAL + HCAL exceeding 0.2 times Lead Track momentum
ByLooseMuonRejection3 yes Tau Lead Track not matched to more than one segment in muon system, energy deposit in ECAL + HCAL at least 0.2 times Lead Track momentum
ByTightMuonRejection3 yes Tau Lead Track not matched to more than one segment or hits in the outermost two stations of the muon system, energy deposit in ECAL + HCAL at least 0.2 times Lead Track momentum
ByMVA(Loose/Medium/Tight)MuonRejection yes BDT based anti-muon discriminator
ByMVArawMuonRejection no raw MVA output of BDT based anti-muon discriminator
ByDecayModeFinding yes You will always want to use this (see AN-10-82 )
ByVLooseIsolation yes isolation cone of 0.3 , no PF Charged Candidates with pT > 1.5 GeV/c and no PF Gamma candidates with ET > 2.0 GeV
ByVLooseCombinedIsolationDBSumPtCorr yes isolation cone of 0.3 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 3 GeV
ByLooseCombinedIsolationDBSumPtCorr yes isolation cone of 0.5 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 2 GeV
ByMediumCombinedIsolationDBSumPtCorr yes isolation cone of 0.5 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 1 GeV
ByTightCombinedIsolationDBSumPtCorr yes isolation cone of 0.5 , Delta Beta corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 0.8 GeV
ByLooseCombinedIsolationDBSumPtCorr3Hits yes same as ByLooseCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on track of isolation candidates
ByMediumCombinedIsolationDBSumPtCorr3Hits yes same as ByMediumCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on track of isolation candidates
ByTightCombinedIsolationDBSumPtCorr3Hits yes same as ByTightCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on track of isolation candidates
By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3oldDMwoLT yes BDT based tau ID discriminator based on isolation Pt sums, trained on 1-prong and 3-prong tau candidates
ByIsolationMVA3oldDMwoLTraw no raw MVA output of BDT based tau ID discriminator based on isolation Pt sums, trained on 1-prong and 3-prong tau candidates
By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3oldDMwLT yes BDT based tau ID discriminator based on isolation Pt sums plus tau lifetime information, trained on 1-prong and 3-prong tau candidates
ByIsolationMVA3oldDMwLTraw no raw MVA output of BDT based tau ID discriminator based on isolation Pt sums plus tau lifetime information, trained on 1-prong and 3-prong tau candidates
By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3newDMwoLT yes BDT based tau ID discriminator based on isolation Pt sums, trained on 1-prong, "2-prong" and 3-prong tau candidates
ByIsolationMVA3newDMwoLTraw no raw MVA output of BDT based tau ID discriminator based on isolation Pt sums, trained on 1-prong, "2-prong" and 3-prong tau candidates
By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3newDMwLT yes BDT based tau ID discriminator based on isolation Pt sums plus tau lifetime information, trained on 1-prong, "2-prong" and 3-prong tau candidates
ByIsolationMVA3newDMwLTraw no raw MVA output of BDT based tau ID discriminator based on isolation Pt sums plus tau lifetime information, trained on 1-prong, "2-prong" and 3-prong tau candidates

How to get the latest TauID on MiniAOD event content

While it is not possible to fully rebuild taus from jets with the MiniAOD event content, it is possible to recompute the BDT output of both the isolation and the anti-electron discriminators for new trainings made available from CMSSW 8_0_X onwards. For releases from 8_1_X onwards the necessary infrastructure is included in the official CMSSW releases; for 8_0_X one has to merge the developments from the following cms-tau-pog branch: cms-tau-pog/CMSSW_8_0_X_tau-pog_miniAOD-backport-tauID . In your CMSSW_8_0_X/src/ area do:

git cms-merge-topic -u cms-tau-pog:CMSSW_8_0_X_tau-pog_miniAOD-backport-tauID

The -u is necessary to prevent git from checking out all packages that depend on any of the packages touched in this branch, which would lead to a very long compilation. Then compile everything.

In order to be able to access the latest BDT output of the isolation discriminators and save it in your ntuples, you need to add a sequence to your python config file and some code to your analyzer itself. You can find an example analyzer and the corresponding config file in RecoTauTag/RecoTau/test . Below, a code example for including the new training with old decay modes is shown. The procedure is the same for the training with new decay modes; please refer to the TauIDRecommendation13TeV TWiki for the lines that need to be changed.

from RecoTauTag.RecoTau.TauDiscriminatorTools import noPrediscriminants
process.load('RecoTauTag.Configuration.loadRecoTauTagMVAsFromPrepDB_cfi')
from RecoTauTag.RecoTau.PATTauDiscriminationByMVAIsolationRun2_cff import *

process.rerunDiscriminationByIsolationMVArun2v1raw = patDiscriminationByIsolationMVArun2v1raw.clone(
   PATTauProducer = cms.InputTag('slimmedTaus'),
   Prediscriminants = noPrediscriminants,
   loadMVAfromDB = cms.bool(True),
   mvaName = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1"), # name of the training you want to use
   mvaOpt = cms.string("DBoldDMwLT"), # option you want to use for your training (i.e., which variables are used to compute the BDT score)
   requireDecayMode = cms.bool(True),
   verbosity = cms.int32(0)
)

process.rerunDiscriminationByIsolationMVArun2v1VLoose = patDiscriminationByIsolationMVArun2v1VLoose.clone(
   PATTauProducer = cms.InputTag('slimmedTaus'),    
   Prediscriminants = noPrediscriminants,
   toMultiplex = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw'),
   key = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw:category'),
   loadMVAfromDB = cms.bool(True),
   mvaOutput_normalization = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_mvaOutput_normalization"), # normalization of the training you want to use
   mapping = cms.VPSet(
      cms.PSet(
         category = cms.uint32(0),
         cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff90"), # this is the name of the working point you want to use
         variable = cms.string("pt"),
      )
   )
)

# here we produce all the other working points for the training
process.rerunDiscriminationByIsolationMVArun2v1Loose = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Loose.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff80")
process.rerunDiscriminationByIsolationMVArun2v1Medium = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Medium.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff70")
process.rerunDiscriminationByIsolationMVArun2v1Tight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Tight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff60")
process.rerunDiscriminationByIsolationMVArun2v1VTight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff50")
process.rerunDiscriminationByIsolationMVArun2v1VVTight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VVTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff40")
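The six working-point clones above follow a regular pattern (working-point name paired with a target efficiency), so they could equally be produced in a loop. The sketch below is plain Python; the cms.* calls are commented out because they require the CMSSW environment, and the efficiency targets are taken from the snippet above:

```python
# Working point -> target signal efficiency, as used in the WPEff<NN> payload names.
WP_EFFICIENCY = {"VLoose": 90, "Loose": 80, "Medium": 70,
                 "Tight": 60, "VTight": 50, "VVTight": 40}
TRAINING = "RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1"

def wp_cut_name(training, wp):
    """DB payload name of the working-point cut, e.g. ..._WPEff80 for Loose."""
    return "%s_WPEff%d" % (training, WP_EFFICIENCY[wp])

# In the config itself one would then write (VLoose is already defined above):
# for wp in ("Loose", "Medium", "Tight", "VTight", "VVTight"):
#     clone = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
#     clone.mapping[0].cut = cms.string(wp_cut_name(TRAINING, wp))
#     setattr(process, "rerunDiscriminationByIsolationMVArun2v1" + wp, clone)
```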

# this sequence has to be included in your cms.Path() before your analyzer which accesses the new variables is called.
process.rerunMvaIsolation2SeqRun2 = cms.Sequence(
   process.rerunDiscriminationByIsolationMVArun2v1raw
   *process.rerunDiscriminationByIsolationMVArun2v1VLoose
   *process.rerunDiscriminationByIsolationMVArun2v1Loose
   *process.rerunDiscriminationByIsolationMVArun2v1Medium
   *process.rerunDiscriminationByIsolationMVArun2v1Tight
   *process.rerunDiscriminationByIsolationMVArun2v1VTight
   *process.rerunDiscriminationByIsolationMVArun2v1VVTight
)

# embed new id's into new tau collection
embedID = cms.EDProducer("PATTauIDEmbedder",
   src = cms.InputTag('slimmedTaus'),
   tauIDSources = cms.PSet(
      byIsolationMVArun2v1DBoldDMwLTrawNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw'),
      byVLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VLoose'),
      byLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Loose'),
      byMediumIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Medium'),
      byTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Tight'),
      byVTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VTight'),
      byVVTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VVTight'),
      . . . (other discriminators like anti-electron),
      ),
   )
setattr(process, "NewTauIDsEmbedded", embedID)

process.p = cms.Path(
   . . . (other processes)
   *process.rerunMvaIsolation2SeqRun2
   *getattr(process, "NewTauIDsEmbedded")
   . . . (for example your ntuple creation process)
)

The python configuration to be added to include new trainings of the anti-electron discriminator is shown below.

process.load('RecoTauTag.Configuration.loadRecoTauTagMVAsFromPrepDB_cfi')
from RecoTauTag.RecoTau.PATTauDiscriminationAgainstElectronMVA6_cfi import *

process.rerunDiscriminationAgainstElectronMVA6 = patTauDiscriminationAgainstElectronMVA6.clone(
   PATTauProducer = cms.InputTag('slimmedTaus'),
   Prediscriminants = noPrediscriminants,
   #Prediscriminants = requireLeadTrack,
   loadMVAfromDB = cms.bool(True),
   returnMVA = cms.bool(True),
   method = cms.string("BDTG"),
   mvaName_NoEleMatch_woGwoGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_woGwoGSF_BL"),
   mvaName_NoEleMatch_wGwoGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_wGwoGSF_BL"),
   mvaName_woGwGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_woGwGSF_BL"),
   mvaName_wGwGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_wGwGSF_BL"),
   mvaName_NoEleMatch_woGwoGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_woGwoGSF_EC"),
   mvaName_NoEleMatch_wGwoGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_wGwoGSF_EC"),
   mvaName_woGwGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_woGwGSF_EC"),
   mvaName_wGwGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_wGwGSF_EC"),
   minMVANoEleMatchWOgWOgsfBL = cms.double(0.0),
   minMVANoEleMatchWgWOgsfBL  = cms.double(0.0),
   minMVAWOgWgsfBL            = cms.double(0.0),
   minMVAWgWgsfBL             = cms.double(0.0),
   minMVANoEleMatchWOgWOgsfEC = cms.double(0.0),
   minMVANoEleMatchWgWOgsfEC  = cms.double(0.0),
   minMVAWOgWgsfEC            = cms.double(0.0),
   minMVAWgWgsfEC             = cms.double(0.0),
   srcElectrons = cms.InputTag('slimmedElectrons')
)
# embed new id's into new tau collection
embedID = cms.EDProducer("PATTauIDEmbedder",
   src = cms.InputTag('slimmedTaus'),
   tauIDSources = cms.PSet(
      . . . (other new discriminators like isolation),
      againstElectronMVA6RawNew = cms.InputTag('rerunDiscriminationAgainstElectronMVA6')
      ),
   )
setattr(process, "NewTauIDsEmbedded", embedID)

process.p = cms.Path(
   . . . (other processes)
   *process.rerunDiscriminationAgainstElectronMVA6
   *getattr(process, "NewTauIDsEmbedded")
   . . . (for example your ntuple creation process)
)

Please be mindful that if you want to include both the new isolation and the new anti-electron discriminators, things like the PATTauIDEmbedder need only be run once. Just reorder/change the example python configuration snippets accordingly.

Once the new discriminators are embedded into the new tau collection (in this example called "NewTauIDsEmbedded"), one can simply retrieve them via the pat::Tau::tauID function in a loop over the collection:

float newIsolationMVArawValue = tau->tauID("byIsolationMVArun2v1DBoldDMwLTrawNew");
float newAntiElectronMVArawValue = tau->tauID("againstElectronMVA6RawNew");

Rerunning of the TauID on AOD event content

The tau collections contained in simulated samples are often outdated. The most recent tau ID is in general contained in the release itself, so to benefit from all new features and bugfixes you should re-run the PFTau sequence on RECO/AOD. Because the most up-to-date software is almost always contained in the production releases, there is no need to merge in code from other repositories.

To re-run the tau sequence, you need to add the following few lines to your config file:

process.load("RecoTauTag.Configuration.RecoPFTauTag_cff") #loading the configuration

from PhysicsTools.PatAlgos.tools.tauTools import * # to get the most up-to-date discriminators when using patTaus
switchToPFTauHPS(process) # this line and the one above can be ignored when running on reco::PFTauCollection directly
...
process.p = cms.Path(   #adding to the path
....
        process.PFTau*
....
)

Warning: when running in un-scheduled mode it is enough to add process.load("RecoTauTag.Configuration.RecoPFTauTag_cff") to the config file.

Important links for detailed information

More detailed information can be found at: twiki 1 and twiki 2.

Review status

Reviewer/Editor and Date Comments
Simone Gennai - 12 Sep 2007 Create the page with minimal informations
Evan Friis - 21 Sep 2008 Update the page with guide to using in CMSSW 2_1_X
Evan Friis - 20 Nov 2008 Update tags
Evan Friis - 27 Jan 2009 Remove outdated information
Nitish Dhingra - 7 Sep 2017 completely rewritten for Run2

Responsible: NitishDhingra
-- NitishDhingra - 2017-09-07
