```python
process.load("RecoTauTag.Configuration.RecoPFTauTag_cff") # loading the configuration
from PhysicsTools.PatAlgos.tools.tauTools import * # to get the most up-to-date discriminators when using patTaus
switchToPFTauHPS(process) # this line and the one above can be ignored when running on reco::PFTauCollection directly
...
process.p = cms.Path(
    # adding to the path
    ....
    process.PFTau*
    ....
)
```
process.load("RecoTauTag.Configuration.RecoPFTauTag_cff")
to the config file.
cms-tau-pog/CMSSW_8_0_X_tau-pog_tauIDOnMiniAOD-legacy-backport-81X. In your CMSSW_8_0_X/src/ area do:
```shell
git cms-merge-topic -u cms-tau-pog:CMSSW_8_0_X_tau-pog_tauIDOnMiniAOD-legacy-backport-81X   # compliant with CMSSW_8_0_26
git cms-merge-topic -u cms-tau-pog:CMSSW_8_0_X_tau-pog_tauIDOnMiniAOD-legacy-backport-81Xv2 # compliant with CMSSW_8_0_29
```

Note that the backport branch of choice depends on the CMSSW version that you are using. The -u option is necessary to prevent git from checking out all packages that depend on any of the ones touched in this branch, since that would lead to a very long compilation. Then compile everything.
In order to be able to access the latest and greatest BDT output for the isolation discriminators and save it in your ntuples, you need to add a sequence to your python config file and some code to your analyzer. An example is shown below.

```python
from RecoTauTag.RecoTau.TauDiscriminatorTools import noPrediscriminants
process.load('RecoTauTag.Configuration.loadRecoTauTagMVAsFromPrepDB_cfi')
from RecoTauTag.RecoTau.PATTauDiscriminationByMVAIsolationRun2_cff import *

process.rerunDiscriminationByIsolationMVArun2v1raw = patDiscriminationByIsolationMVArun2v1raw.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    loadMVAfromDB = cms.bool(True),
    mvaName = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1"), # name of the training you want to use
    mvaOpt = cms.string("DBoldDMwLT"), # option you want to use for your training (i.e., which variables are used to compute the BDT score)
    requireDecayMode = cms.bool(True),
    verbosity = cms.int32(0)
)

process.rerunDiscriminationByIsolationMVArun2v1VLoose = patDiscriminationByIsolationMVArun2v1VLoose.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    toMultiplex = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw'),
    key = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw:category'),
    loadMVAfromDB = cms.bool(True),
    mvaOutput_normalization = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_mvaOutput_normalization"), # normalization of the training you want to use
    mapping = cms.VPSet(
        cms.PSet(
            category = cms.uint32(0),
            cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff90"), # this is the name of the working point you want to use
            variable = cms.string("pt"),
        )
    )
)

# here we produce all the other working points for the training
process.rerunDiscriminationByIsolationMVArun2v1Loose = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Loose.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff80")
process.rerunDiscriminationByIsolationMVArun2v1Medium = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Medium.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff70")
process.rerunDiscriminationByIsolationMVArun2v1Tight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Tight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff60")
process.rerunDiscriminationByIsolationMVArun2v1VTight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff50")
process.rerunDiscriminationByIsolationMVArun2v1VVTight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VVTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_WPEff40")

# this sequence has to be included in your cms.Path() before the analyzer that accesses the new variables is called
process.rerunMvaIsolation2SeqRun2 = cms.Sequence(
    process.rerunDiscriminationByIsolationMVArun2v1raw
    * process.rerunDiscriminationByIsolationMVArun2v1VLoose
    * process.rerunDiscriminationByIsolationMVArun2v1Loose
    * process.rerunDiscriminationByIsolationMVArun2v1Medium
    * process.rerunDiscriminationByIsolationMVArun2v1Tight
    * process.rerunDiscriminationByIsolationMVArun2v1VTight
    * process.rerunDiscriminationByIsolationMVArun2v1VVTight
)

# embed new IDs into a new tau collection
embedID = cms.EDProducer("PATTauIDEmbedder",
    src = cms.InputTag('slimmedTaus'),
    tauIDSources = cms.PSet(
        byIsolationMVArun2v1DBoldDMwLTrawNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw'),
        byVLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VLoose'),
        byLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Loose'),
        byMediumIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Medium'),
        byTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Tight'),
        byVTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VTight'),
        byVVTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VVTight'),
        # ... (other discriminators like anti-electron)
    ),
)
setattr(process, "NewTauIDsEmbedded", embedID)

process.p = cms.Path(
    # ... (other processes)
    process.rerunMvaIsolation2SeqRun2
    * getattr(process, "NewTauIDsEmbedded")
    # ... (for example your ntuple creation process)
)
```
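The per-working-point clones above are repetitive; with the same module and payload names as in the example, they could equivalently be generated in a loop. This is only a sketch, not part of the official recipe:

```python
# Sketch: derive the remaining working points from the VLoose module in a
# loop instead of repeated clone() calls. Names follow the example above.
wp_cuts = {
    "Loose":   "WPEff80",
    "Medium":  "WPEff70",
    "Tight":   "WPEff60",
    "VTight":  "WPEff50",
    "VVTight": "WPEff40",
}
for wp, eff in wp_cuts.items():
    module = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
    module.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2016v1_" + eff)
    setattr(process, "rerunDiscriminationByIsolationMVArun2v1" + wp, module)
```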
```python
process.load('RecoTauTag.Configuration.loadRecoTauTagMVAsFromPrepDB_cfi')
from RecoTauTag.RecoTau.PATTauDiscriminationAgainstElectronMVA6_cfi import *

process.rerunDiscriminationAgainstElectronMVA6 = patTauDiscriminationAgainstElectronMVA6.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    #Prediscriminants = requireLeadTrack,
    loadMVAfromDB = cms.bool(True),
    returnMVA = cms.bool(True),
    method = cms.string("BDTG"),
    mvaName_NoEleMatch_woGwoGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_woGwoGSF_BL"),
    mvaName_NoEleMatch_wGwoGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_wGwoGSF_BL"),
    mvaName_woGwGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_woGwGSF_BL"),
    mvaName_wGwGSF_BL = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_wGwGSF_BL"),
    mvaName_NoEleMatch_woGwoGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_woGwoGSF_EC"),
    mvaName_NoEleMatch_wGwoGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_NoEleMatch_wGwoGSF_EC"),
    mvaName_woGwGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_woGwGSF_EC"),
    mvaName_wGwGSF_EC = cms.string("RecoTauTag_antiElectronMVA6v1_gbr_wGwGSF_EC"),
    minMVANoEleMatchWOgWOgsfBL = cms.double(0.0),
    minMVANoEleMatchWgWOgsfBL = cms.double(0.0),
    minMVAWOgWgsfBL = cms.double(0.0),
    minMVAWgWgsfBL = cms.double(0.0),
    minMVANoEleMatchWOgWOgsfEC = cms.double(0.0),
    minMVANoEleMatchWgWOgsfEC = cms.double(0.0),
    minMVAWOgWgsfEC = cms.double(0.0),
    minMVAWgWgsfEC = cms.double(0.0),
    srcElectrons = cms.InputTag('slimmedElectrons'),
    usePhiAtEcalEntranceExtrapolation = cms.bool(True)
)

# embed new IDs into a new tau collection
embedID = cms.EDProducer("PATTauIDEmbedder",
    src = cms.InputTag('slimmedTaus'),
    tauIDSources = cms.PSet(
        # ... (other new discriminators like isolation)
        againstElectronMVA6RawNew = cms.InputTag('rerunDiscriminationAgainstElectronMVA6')
    ),
)
setattr(process, "NewTauIDsEmbedded", embedID)

process.p = cms.Path(
    # ... (other processes)
    process.rerunDiscriminationAgainstElectronMVA6
    * getattr(process, "NewTauIDsEmbedded")
    # ... (for example your ntuple creation process)
)
```
pat::Tau::tauID
function in a loop over the collection like so:
```cpp
float newIsolationMVArawValue = tau->tauID("byIsolationMVArun2v1DBoldDMwLTrawNew");
float newAntiElectronMVArawValue = tau->tauID("againstElectronMVA6RawNew");
```
```python
from <your path>.runTauIdMVA import *
na = TauIDEmbedder(process, cms, # pass your process object
    debug = True,
    toKeep = ["2017v2"] # pick the one you need: ["2017v1", "2017v2", "newDM2017v2", "dR0p32017v2", "2016v1", "newDM2016v1"]
)
na.runTauID()
```

3. Define handles to access discriminators in your analysis module:
```python
byIsolationMVArun2017v2DBoldDMwLTraw2017 = cms.string('byIsolationMVArun2017v2DBoldDMwLTraw2017'),
byVVLooseIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byVVLooseIsolationMVArun2017v2DBoldDMwLT2017'),
byVLooseIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byVLooseIsolationMVArun2017v2DBoldDMwLT2017'),
byLooseIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byLooseIsolationMVArun2017v2DBoldDMwLT2017'),
byMediumIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byMediumIsolationMVArun2017v2DBoldDMwLT2017'),
byTightIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byTightIsolationMVArun2017v2DBoldDMwLT2017'),
byVTightIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byVTightIsolationMVArun2017v2DBoldDMwLT2017'),
byVVTightIsolationMVArun2017v2DBoldDMwLT2017 = cms.string('byVVTightIsolationMVArun2017v2DBoldDMwLT2017')
```

4. Add the rerunning of the tau reconstruction sequence with the wanted MVA to your path:
```python
process.p = cms.Path(
    <your processes not using new MVAs>
    * process.rerunMvaIsolationSequence
    * process.NewTauIDsEmbedded # *getattr(process, "NewTauIDsEmbedded")
    <rest of your processes>
)
```

For those who prefer to stick to the 2016 manner of tau MVA inclusion, a code example for including the recent old decay mode 2017v1 and dR=0.3 2017v2 trainings with embedding into a new pat::Tau collection is shown below. To access the 2017v2 training one has to replace "v1" by "v2". Please refer to the TauIDRecommendation13TeV TWiki for the lines that need to be changed.
```python
from RecoTauTag.RecoTau.TauDiscriminatorTools import noPrediscriminants
process.load('RecoTauTag.Configuration.loadRecoTauTagMVAsFromPrepDB_cfi')
from RecoTauTag.RecoTau.PATTauDiscriminationByMVAIsolationRun2_cff import *

def loadMVA_WPs_run2_2017(process):
    for training, gbrForestName in tauIdDiscrMVA_trainings_run2_2017.items():
        process.loadRecoTauTagMVAsFromPrepDB.toGet.append(
            cms.PSet(
                record = cms.string('GBRWrapperRcd'),
                tag = cms.string("RecoTauTag_%s%s" % (gbrForestName, tauIdDiscrMVA_2017_version)),
                label = cms.untracked.string("RecoTauTag_%s%s" % (gbrForestName, tauIdDiscrMVA_2017_version))
            )
        )
        for WP in tauIdDiscrMVA_WPs_run2_2017[training].keys():
            process.loadRecoTauTagMVAsFromPrepDB.toGet.append(
                cms.PSet(
                    record = cms.string('PhysicsTGraphPayloadRcd'),
                    tag = cms.string("RecoTauTag_%s%s_WP%s" % (gbrForestName, tauIdDiscrMVA_2017_version, WP)),
                    label = cms.untracked.string("RecoTauTag_%s%s_WP%s" % (gbrForestName, tauIdDiscrMVA_2017_version, WP))
                )
            )
        process.loadRecoTauTagMVAsFromPrepDB.toGet.append(
            cms.PSet(
                record = cms.string('PhysicsTFormulaPayloadRcd'),
                tag = cms.string("RecoTauTag_%s%s_mvaOutput_normalization" % (gbrForestName, tauIdDiscrMVA_2017_version)),
                label = cms.untracked.string("RecoTauTag_%s%s_mvaOutput_normalization" % (gbrForestName, tauIdDiscrMVA_2017_version))
            )
        )

# 2017 v1
tauIdDiscrMVA_trainings_run2_2017 = {
    'tauIdMVAIsoDBoldDMwLT2017' : "tauIdMVAIsoDBoldDMwLT2017",
}
tauIdDiscrMVA_WPs_run2_2017 = {
    'tauIdMVAIsoDBoldDMwLT2017' : {
        'Eff95' : "DBoldDMwLTEff95",
        'Eff90' : "DBoldDMwLTEff90",
        'Eff80' : "DBoldDMwLTEff80",
        'Eff70' : "DBoldDMwLTEff70",
        'Eff60' : "DBoldDMwLTEff60",
        'Eff50' : "DBoldDMwLTEff50",
        'Eff40' : "DBoldDMwLTEff40"
    }
}
tauIdDiscrMVA_2017_version = "v1"
loadMVA_WPs_run2_2017(process)

process.rerunDiscriminationByIsolationMVArun2v1raw = patDiscriminationByIsolationMVArun2v1raw.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    loadMVAfromDB = cms.bool(True),
    mvaName = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1"), # name of the training you want to use: training with 2017 MC_v1 for oldDM
    mvaOpt = cms.string("DBoldDMwLTwGJ"), # option you want to use for your training (i.e., which variables are used to compute the BDT score)
    requireDecayMode = cms.bool(True),
    verbosity = cms.int32(0)
)

process.rerunDiscriminationByIsolationMVArun2v1VLoose = patDiscriminationByIsolationMVArun2v1VLoose.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    toMultiplex = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw'),
    key = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw:category'),
    loadMVAfromDB = cms.bool(True),
    mvaOutput_normalization = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_mvaOutput_normalization"), # normalization of the training you want to use
    mapping = cms.VPSet(
        cms.PSet(
            category = cms.uint32(0),
            cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff90"), # this is the name of the working point you want to use
            variable = cms.string("pt"),
        )
    )
)

# here we produce all the other working points for the training
process.rerunDiscriminationByIsolationMVArun2v1VVLoose = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VVLoose.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff95")
process.rerunDiscriminationByIsolationMVArun2v1Loose = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Loose.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff80")
process.rerunDiscriminationByIsolationMVArun2v1Medium = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Medium.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff70")
process.rerunDiscriminationByIsolationMVArun2v1Tight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1Tight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff60")
process.rerunDiscriminationByIsolationMVArun2v1VTight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff50")
process.rerunDiscriminationByIsolationMVArun2v1VVTight = process.rerunDiscriminationByIsolationMVArun2v1VLoose.clone()
process.rerunDiscriminationByIsolationMVArun2v1VVTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMwLT2017v1_WPEff40")

# this sequence has to be included in your cms.Path() before the analyzer that accesses the new variables is called
process.rerunMvaIsolation2SeqRun2 = cms.Sequence(
    process.rerunDiscriminationByIsolationMVArun2v1raw
    * process.rerunDiscriminationByIsolationMVArun2v1VLoose
    * process.rerunDiscriminationByIsolationMVArun2v1VVLoose
    * process.rerunDiscriminationByIsolationMVArun2v1Loose
    * process.rerunDiscriminationByIsolationMVArun2v1Medium
    * process.rerunDiscriminationByIsolationMVArun2v1Tight
    * process.rerunDiscriminationByIsolationMVArun2v1VTight
    * process.rerunDiscriminationByIsolationMVArun2v1VVTight
)

# 2017v2 dR=0.3
tauIdDiscrMVA_2017_version = "v2"
tauIdDiscrMVA_trainings_run2_2017 = {
    'tauIdMVAIsoDBoldDMdR0p3wLT2017' : "tauIdMVAIsoDBoldDMdR0p3wLT2017",
}
tauIdDiscrMVA_WPs_run2_2017 = {
    'tauIdMVAIsoDBoldDMdR0p3wLT2017' : {
        'Eff95' : "DBoldDMdR0p3wLTEff95",
        'Eff90' : "DBoldDMdR0p3wLTEff90",
        'Eff80' : "DBoldDMdR0p3wLTEff80",
        'Eff70' : "DBoldDMdR0p3wLTEff70",
        'Eff60' : "DBoldDMdR0p3wLTEff60",
        'Eff50' : "DBoldDMdR0p3wLTEff50",
        'Eff40' : "DBoldDMdR0p3wLTEff40"
    }
}
loadMVA_WPs_run2_2017(process)

process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2raw = patDiscriminationByIsolationMVArun2v1raw.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    loadMVAfromDB = cms.bool(True),
    mvaName = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2"),
    mvaOpt = cms.string("DBoldDMwLTwGJ"),
    requireDecayMode = cms.bool(True),
    verbosity = cms.int32(0)
)

process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose = patDiscriminationByIsolationMVArun2v1VLoose.clone(
    PATTauProducer = cms.InputTag('slimmedTaus'),
    Prediscriminants = noPrediscriminants,
    toMultiplex = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2raw'),
    key = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2raw:category'),
    loadMVAfromDB = cms.bool(True),
    mvaOutput_normalization = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_mvaOutput_normalization"),
    mapping = cms.VPSet(
        cms.PSet(
            category = cms.uint32(0),
            cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff90"),
            variable = cms.string("pt"),
        )
    ),
    verbosity = cms.int32(0)
)

process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVLoose = process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose.clone()
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVLoose.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff95")
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Loose = process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose.clone()
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Loose.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff80")
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Medium = process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose.clone()
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Medium.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff70")
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Tight = process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose.clone()
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Tight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff60")
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VTight = process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose.clone()
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff50")
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVTight = process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose.clone()
process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVTight.mapping[0].cut = cms.string("RecoTauTag_tauIdMVAIsoDBoldDMdR0p3wLT2017v2_WPEff40")

process.rerunMvaIsolationSequence += cms.Sequence(
    process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2raw
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVLoose
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Loose
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Medium
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Tight
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VTight
    * process.rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVTight
)

# embed new IDs into a new tau collection
embedID = cms.EDProducer("PATTauIDEmbedder",
    src = cms.InputTag('slimmedTaus'),
    tauIDSources = cms.PSet(
        byIsolationMVArun2v1DBoldDMwLTrawNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1raw'),
        byVLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VLoose'),
        byVVLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VVLoose'),
        byLooseIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Loose'),
        byMediumIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Medium'),
        byTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1Tight'),
        byVTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VTight'),
        byVVTightIsolationMVArun2v1DBoldDMwLTNew = cms.InputTag('rerunDiscriminationByIsolationMVArun2v1VVTight'),
        byIsolationMVArun2017v2DBoldDMdR0p3wLTraw2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2raw'),
        byVVLooseIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVLoose'),
        byVLooseIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VLoose'),
        byLooseIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Loose'),
        byMediumIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Medium'),
        byTightIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2Tight'),
        byVTightIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VTight'),
        byVVTightIsolationMVArun2017v2DBoldDMdR0p3wLT2017 = cms.InputTag('rerunDiscriminationByIsolationOldDMdR0p3MVArun2017v2VVTight')
    ),
)
setattr(process, "NewTauIDsEmbedded", embedID)

# inclusion in the process
process.p += process.rerunMvaIsolation2SeqRun2
process.p += getattr(process, "NewTauIDsEmbedded")
# then you can continue with your ntuple creation process, for example
```
pat::Tau::tauID
function in a loop over the collection like so:
```cpp
float newIsolationMVArawValue = tau->tauID("byIsolationMVArun2v1DBoldDMwLTrawNew");
```
| Tau-Id | Name in pat::Tau accessed via tau.tauID(name) | Notes |
|---|---|---|
| DeepTau vs jets | byDeepTau2017v1VSjetraw | Raw DNN score |
| | by[WP]DeepTau2017v1VSjet | WP = VVVLoose, VVLoose, VLoose, Loose, Medium, Tight, VTight, VVTight |
| DeepTau vs e | byDeepTau2017v1VSeraw | Raw DNN score |
| | by[WP]DeepTau2017v1VSe | WP = VVVLoose, VVLoose, VLoose, Loose, Medium, Tight, VTight, VVTight |
| DeepTau vs μ | byDeepTau2017v1VSmuraw | Raw DNN score |
| | by[WP]DeepTau2017v1VSmu | WP = VVVLoose, VVLoose, VLoose, Loose, Medium, Tight, VTight, VVTight |
| DPFTau vs all (v0) | byDpfTau2016v0VSallraw | Raw DNN score, mostly against jets, some power against e |
| | byTightDpfTau2016v0VSall | Tight WP, the only one defined |
| DPFTau vs all (v1) | byDpfTau2016v1VSallraw | Raw DNN score, mostly against jets, some power against e |
| | byTightDpfTau2016v1VSall | Tight WP, dummy cuts |
```python
[...]
updatedTauName = "slimmedTausNewID" # name of the pat::Tau collection with new tau-Ids
import RecoTauTag.RecoTau.tools.runTauIdMVA as tauIdConfig
tauIdEmbedder = tauIdConfig.TauIDEmbedder(process, cms,
    debug = False,
    updatedTauName = updatedTauName,
    toKeep = ["2017v2", "dR0p32017v2", "newDM2017v2", # classic MVAIso tau-Ids
              "deepTau2017v1",  # deepTau tau-Ids
              "DPFTau_2016_v0", # D[eep]PF[low] tau-Id
             ])
tauIdEmbedder.runTauID()

# Path and EndPath definitions
process.p = cms.Path(
    process.rerunMvaIsolationSequence *
    getattr(process, updatedTauName)
)
[...]
```

Note 1: Do not forget to either store the updatedTauName collection in the output file or read it in your ntuplizer.
rm -rf RecoTauTag/TrainingFiles/data
```shell
# Get DNN training files
git clone https://github.com/cms-tau-pog/RecoTauTag-TrainingFiles -b master RecoTauTag/TrainingFiles/data
# Execute the CMSSW job:
cmsRun -j FrameworkJobReport.xml -p PSet.py
```

and it has to be added to the CRAB configuration:
config.JobType.scriptExe = 'myscript.sh'
```shell
# Setup CMSSW area
cmsrel CMSSW_10_2_16_UL
cd CMSSW_10_2_16_UL/src/
cmsenv
# Update DeepTau code and store DeepTauIDs in nanoAOD by a checkout from the Tau POG repository
# (note the "-u" option preventing checkout of unnecessary stuff)
git cms-merge-topic -u cms-tau-pog:CMSSW_10_2_X_tau-pog_DeepTau2017v2p1_nanoAOD
# No need to check out training files for CMSSW>=10_2_16/10_2_16_UL
# Compile
scram b -j 4
```

2. (for miniAOD users) Add the new DeepTauIDs to your CMSSW python configuration using the runTauIdMVA.py tool like this:
```python
[...]
updatedTauName = "slimmedTausNewID" # name of the pat::Tau collection with new tau-Ids
import RecoTauTag.RecoTau.tools.runTauIdMVA as tauIdConfig
tauIdEmbedder = tauIdConfig.TauIDEmbedder(process, cms,
    debug = False,
    updatedTauName = updatedTauName,
    toKeep = ["deepTau2017v2p1"] # deepTau tau-Ids
)
tauIdEmbedder.runTauID()

# Path and EndPath definitions
process.p = cms.Path(
    process.rerunMvaIsolationSequence *
    getattr(process, updatedTauName)
)
[...]
```

Note 1: Do not forget to either store the updatedTauName collection in the output file or read it in your ntuplizer.
```shell
cmsDriver.py NanoAODv5_deepTauID --filein file:miniAOD2017v2.root --fileout file:nanoAODv5_deepTauID.root --mc --eventcontent NANOAODSIM --datatier NANOAODSIM --conditions auto:phase1_2017_realistic --step NANO --nThreads 2 --era Run2_2017 --no_exec -n 1000
```

The following DeepTau IDs are accessible with this recipe:
| Tau-Id | Name in pat::Tau accessed via tau.tauID(name) | Name in NanoAOD | Notes |
|---|---|---|---|
| DeepTau vs μ | byDeepTau2017v2p1VSmuraw | rawDeepTau2017v2p1VSmu | Raw DNN score |
| DeepTau vs e | byDeepTau2017v2p1VSeraw | rawDeepTau2017v2p1VSe | Raw DNN score |
| DeepTau vs jets | byDeepTau2017v2p1VSjetraw | rawDeepTau2017v2p1VSjet | Raw DNN score |
| | by[WP]DeepTau2017v2p1VSjet | idDeepTau2017v2p1VSjet | WP = VVVLoose, VVLoose, VLoose, Loose, Medium, Tight, VTight, VVTight |
| | by[WP]DeepTau2017v2p1VSe | idDeepTau2017v2p1VSe | WP = VVVLoose, VVLoose, VLoose, Loose, Medium, Tight, VTight, VVTight |
| | by[WP]DeepTau2017v2p1VSmu | idDeepTau2017v2p1VSmu | WP = VLoose, Loose, Medium, Tight |
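In NanoAOD the per-working-point decisions are packed into the id* branches as a bitmask (to our understanding, bit n is set when the tau passes the n-th working point, counting from the loosest; check the NanoAOD branch documentation of your production before relying on this). A small hedged Python sketch of decoding it:

```python
# Hedged sketch: decoding a NanoAOD DeepTau WP bitmask.
# Assumption: bit n is set if the tau passes the n-th WP, ordered from
# loosest (VVVLoose = bit 0) to tightest (VVTight = bit 7).
WPS = ["VVVLoose", "VVLoose", "VLoose", "Loose", "Medium", "Tight", "VTight", "VVTight"]

def passes_wp(id_bitmask, wp):
    """True if the given idDeepTau* bitmask includes the requested WP bit."""
    return bool(id_bitmask & (1 << WPS.index(wp)))

# a tau passing everything up to Medium would carry bits 0..4 -> 0b11111 = 31
print(passes_wp(31, "Medium"), passes_wp(31, "Tight"))  # True False
```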
```shell
cmsDriver.py miniAOD-prod -s PAT --eventcontent MINIAODSIM --runUnscheduled --mc --filein /store/relval/CMSSW_7_6_0_pre7/RelValZTT_13/GEN-SIM-RECO/76X_mcRun2_asymptotic_v5-v1/00000/0EF62A6C-4371-E511-8AAE-0025905A6118.root --conditions auto:run2_mc -n 100 --no_exec
```

As long as you just read RelVals, the global tag is not very important since you do not read any conditions, so you can safely use the automatic one, i.e. auto:run2_mc. By running the configuration file, you can get the MiniAOD with the latest and greatest tauID.
```shell
cmsrel CMSSW_6_1_2_SLHC4_patch2
cd CMSSW_6_1_2_SLHC4_patch2/src
cmsenv
git cms-merge-topic -u cms-tau-pog:CMSSW_6_1_2_SLHC4_patch2_tauLegacy2012
```

Afterwards, you should rerun the tau sequence:
process.load("RecoTauTag.Configuration.RecoPFTauTag_cff")
to your cfg
process.PFTau
to your path
process.load("RecoTauTag.Configuration.RecoPFTauTag_cff")

and add
process.recoTauClassicHPSSequence
to your path.
The configuration has been tested to work with PF2PAT and the latest PAT recommendations as listed in SWGuidePATReleaseNotes52X. More details about this recommendation are available in the talk.

```python
from PhysicsTools.PatAlgos.tools.tauTools import *
switchToPFTauHPS(process)
```

Changes wrt. the recommendation presented in this talk:
```shell
cmsrel CMSSW_5_3_12
cd CMSSW_5_3_12/src
cmsenv
```

A nutshell recipe for samples reconstructed with CMSSW>=5_3_12: do not forget to rerun recoTauClassicHPSSequence (see above), as the tau ID stored in AODs produced by CMSSW < 5_3_12 is outdated.
```shell
cmsrel CMSSW_5_3_11_patch6
cd CMSSW_5_3_11_patch6/src
cmsenv
git cms-addpkg RecoTauTag/RecoTau
git cms-merge-topic -u cms-tau-pog:CMSSW_5_3_X
```

If you wish to add new PAT features, please follow the recipes in the PAT SW guide.
```shell
cmsrel CMSSW_5_3_9
cd CMSSW_5_3_9/src
cmsenv
cvs co -r V01-04-25 RecoTauTag/RecoTau # HCP + new discriminants
cvs co -r V01-04-13 RecoTauTag/Configuration
```

If you are using some 52X release, you also need to do
cvs co -r V00-04-00 CondFormats/EgammaObjects # not necessary if CMSSW >= 5_3_0
```shell
export CVSROOT=":ext:<cern-user-account>@lxplus5.cern.ch:/afs/cern.ch/user/c/cvscmssw/public/CMSSW" # substitute <cern-user-account> by your lxplus username
cvs co -r CMSSW_5_2_4 DataFormats/TauReco # yes, this is correct
cvs co -r CMSSW_5_2_4 RecoTauTag/TauTagTools
cvs co -r V01-04-25-4XX RecoTauTag/RecoTau # Legacy tau ID adapted for 4xx releases
cvs co -r V01-04-12-4XX RecoTauTag/Configuration
cvs co -r V00-04-00 CondFormats/EgammaObjects
cvs co PhysicsTools/IsolationAlgos
# You need to recompile PAT packages which depend on DataFormats/TauReco
cvs co -r V08-07-53 PhysicsTools/PatAlgos # or higher. See https://twiki.cern.ch/twiki/bin/view/CMSPublic/SWGuidePATRecipes
cvs co DataFormats/PatCandidates
scram b -j 9
```

Git based recipe: It is possible to obtain the recipe above also using git:
```shell
cmsrel CMSSW_4_4_5_patch5 # or higher
cd CMSSW_4_4_5_patch5/src
cmsenv
git cms-addpkg RecoTauTag/RecoTau
git cms-merge-topic -u cms-tau-pog:CMSSW_4_4_X
```

If everything compiles, you should rerun the PFTau sequence. Afterwards, add
process.load("RecoTauTag.Configuration.RecoPFTauTag_cff")
to your cfg
process.PFTau
to your path
```shell
cmsenv
git cms-merge-topic -u cms-tau-pog:CMSSW_7_1_X_taus
```
```shell
cmsenv
git cms-merge-topic -u cms-tau-pog:CMSSW_7_0_X_taus
```
```shell
cmsenv
git cms-merge-topic -u cms-tau-pog:CMSSW_6_2_X_HighPt
```
```shell
cmsrel CMSSW_5_3_13_patch3 # or any CMSSW_5_3_x release >= 5_3_13
cd CMSSW_5_3_13_patch3/src
cmsenv
# git cms-merge-topic -u cms-tau-pog:CMSSW_5_3_X_boostedTaus_2013Dec17 # old recommendation
git cms-merge-topic -u cms-tau-pog:CMSSW_5_3_X_tauID2014 # new recommendation with the same performance and new features (e.g. it is working in PFBRECO)
```

These commands will provide you with the head of the CMSSW_5_3_X_tauID2014 branch and merge it with your release (and whatever else you have in the working area).
process.load("RecoTauTag.Configuration.RecoPFTauTag_cff")
to your cfg
process.PFTau
to your path
```python
from PhysicsTools.PatAlgos.tools.tauTools import *
switchToPFTauHPS(process)
```
```python
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring( ... ),
    dropDescendantsOfDroppedBranches = cms.untracked.bool(False),
    inputCommands = cms.untracked.vstring(
        'keep *',
        'drop recoPFTaus_*_*_*'
    )
)
```

Developer's version
```shell
cmsrel CMSSW_5_3_15 # or any CMSSW_5_3_x release >= 5_3_15
cd CMSSW_5_3_15/src
cmsenv
git cms-merge-topic -u cms-tau-pog:CMSSW_5_3_X_tauID2014 # branch with the latest version of the tau ID
```
```shell
git remote add tau-pog git@github.com:cms-tau-pog/cmssw.git # add new remote
git fetch tau-pog # read from remote (get the list of branches and tags)
git checkout -t remotes/tau-pog/CMSSW_5_3_X_tauID2014 # make a new local branch called CMSSW_5_3_X_tauID2014 that pulls from and pushes to ("tracks") the CMSSW_5_3_X_tauID2014 branch in the cms-tau-pog repository
```

When you are happy with local developments (using git add/rm and git commit), the command git push will directly update the CMSSW_5_3_X_tauID2014 branch in the cms-tau-pog repository.
```shell
git checkout -b TauHighPtTestFeature cms-tau-pog/CMSSW_5_3_X_tauID2014 # make a local branch TauHighPtTestFeature that is a copy of the remote branch CMSSW_5_3_X_tauID2014
```

Do local developments and commit to the local git repository (using git add/rm and git commit) and then, if you want, you can push the whole branch to your fork of cmssw:

```shell
git push -u my-cmssw TauHighPtTestFeature
```

If you want to have your changes included in the "official" cms-tau-pog repository, use a pull request.
analyze(const edm::Event& iEvent)
method of your analyzer) do
```cpp
// read PFTau collection (set tauSrc_ to an edm::InputTag before)
edm::Handle<reco::PFTauCollection> taus;
iEvent.getByLabel(tauSrc_, taus);

// read one PFTauDiscriminator (set discriminatorSrc_ to an edm::InputTag before)
edm::Handle<reco::PFTauDiscriminator> discriminator;
iEvent.getByLabel(discriminatorSrc_, discriminator);

// loop over taus
for ( unsigned iTau = 0; iTau < taus->size(); ++iTau ) {
  reco::PFTauRef tauCandidate(taus, iTau);
  // check if tau candidate has passed discriminator
  if ( (*discriminator)[tauCandidate] > 0.5 ) {
    // do something with your candidate
  }
}
```
The collections follow the <algorithm-prefix> + <discriminator name> convention (e.g. the AgainstElectron collection can be accessed via hpsPFTauDiscriminationAgainstElectron). The <algorithm-prefix> is hpsPFTauDiscrimination.
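As a plain-Python illustration of this naming convention (not CMSSW code; the helper name is ours):

```python
# Build the full discriminator collection label from the convention above:
# <algorithm-prefix> + <discriminator name>, with the HPS prefix.
ALGO_PREFIX = "hpsPFTauDiscrimination"

def discriminator_collection(disc_name):
    """Return the event-content label for a given discriminator name."""
    return ALGO_PREFIX + disc_name

print(discriminator_collection("AgainstElectron"))
# hpsPFTauDiscriminationAgainstElectron
```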
| Name | binary? | Description |
|---|---|---|
| ByLooseElectronRejection | yes | electron-pion MVA discriminator < 0.6 |
| ByMediumElectronRejection | yes | electron-pion MVA discriminator < -0.1 and not 1.4442 < \|η\| < 1.566 |
| ByTightElectronRejection | yes | electron-pion MVA discriminator < -0.1 and not 1.4442 < \|η\| < 1.566 and Brem pattern cuts (see AN-10-387) |
| ByMVA3Loose/Medium/Tight/VTightElectronRejection | yes | anti-electron MVA discriminator with improved training |
| ByMVA3VTightElectronRejection | yes | anti-electron MVA discriminator with improved training (same efficiency as the "HCP 2012 working point") |
| ByLooseMuonRejection | yes | tau lead track not matched to chamber hits |
| ByMediumMuonRejection | yes | tau lead track not matched to global/tracker muon |
| ByTightMuonRejection | yes | tau lead track not matched to global/tracker muon and large enough energy deposit in ECAL + HCAL |
| ByLooseMuonRejection2 | yes | same as AgainstMuonLoose |
| ByMediumMuonRejection2 | yes | Loose2 && no DT, CSC or RPC hits in the last 2 stations |
| ByTightMuonRejection2 | yes | Medium2 && large enough energy deposit in ECAL + HCAL in the 1-prong + 0-strip decay mode (Σ(ECAL+HCAL) > 0.2 * pT) |
| ByDecayModeFinding | yes | you will always want to use this (see AN-10-82) |
| ByVLooseIsolation | yes | isolation cone of 0.3, no PF charged candidates with pT > 1.5 GeV/c and no PF gamma candidates with ET > 2.0 GeV |
| ByVLooseCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.3, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 3 GeV |
| ByLooseCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.5, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 2 GeV |
| ByMediumCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.5, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 1 GeV |
| ByTightCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.5, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 0.8 GeV |
| ByLooseCombinedIsolationDBSumPtCorr3Hits | yes | same as ByLooseCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on the track of isolation candidates |
| ByMediumCombinedIsolationDBSumPtCorr3Hits | yes | same as ByMediumCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on the track of isolation candidates |
| ByTightCombinedIsolationDBSumPtCorr3Hits | yes | same as ByTightCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on the track of isolation candidates |
| ByLooseIsolationMVA | yes | BDT-based selection using isolation in rings around the tau direction and shower shape variables |
| ByMediumIsolationMVA | yes | BDT-based selection using isolation in rings around the tau direction and shower shape variables |
| ByTightIsolationMVA | yes | BDT-based selection using isolation in rings around the tau direction and shower shape variables |
| ByIsolationMVAraw | no | output of the BDT-based selection using isolation in rings around the tau direction and shower shape variables |
| ByLooseIsolationMVA2 | yes | same as ByLooseIsolationMVA with new training and improved performance |
| ByMediumIsolationMVA2 | yes | same as ByMediumIsolationMVA with new training and improved performance |
| ByTightIsolationMVA2 | yes | same as ByTightIsolationMVA with new training and improved performance |
| ByIsolationMVA2raw | no | output of the "MVA2" BDT discriminator |
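The delta-beta-corrected combined isolation working points in the table above differ only in the maximum allowed isolation sum pT (3, 2, 1 and 0.8 GeV). A minimal sketch of that ordering (the dict and helper are illustrative, not part of any CMSSW API; thresholds are taken from the table):

```python
# Maximum allowed sum pT (GeV) of PF charged + PF gamma isolation candidates
# for the combined isolation (DBSumPtCorr) working points from the table above.
# Note that VLoose uses an isolation cone of 0.3, the others a cone of 0.5.
COMBINED_ISO_WP_MAX_SUMPT = {
    "VLoose": 3.0,
    "Loose":  2.0,
    "Medium": 1.0,
    "Tight":  0.8,
}

def passed_combined_iso_wps(iso_sum_pt):
    """Return the working points a tau with the given isolation sum pT passes."""
    return [wp for wp, cut in COMBINED_ISO_WP_MAX_SUMPT.items() if iso_sum_pt < cut]

print(passed_combined_iso_wps(1.5))  # ['VLoose', 'Loose']
```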
The `<algorithm-prefix>` is `hpsPFTauDiscrimination`.
| Name | binary? | Description |
|---|---|---|
| ByLooseElectronRejection | yes | electron-pion MVA discriminator < 0.6 |
| ByMediumElectronRejection | yes | electron-pion MVA discriminator < -0.1 and not 1.4442 < \|η\| < 1.566 |
| ByTightElectronRejection | yes | electron-pion MVA discriminator < -0.1 and not 1.4442 < \|η\| < 1.566 and Brem pattern cuts (see AN-10-387) |
| ByMVA5(Loose/Medium/Tight/VTight)ElectronRejection | yes | anti-electron MVA discriminator with new training |
| ByLooseMuonRejection | yes | tau lead track not matched to chamber hits |
| ByMediumMuonRejection | yes | tau lead track not matched to global/tracker muon |
| ByTightMuonRejection | yes | tau lead track not matched to global/tracker muon and energy deposit in ECAL + HCAL exceeding 0.2 times the lead track momentum |
| ByLooseMuonRejection3 | yes | tau lead track not matched to more than one segment in the muon system, energy deposit in ECAL + HCAL at least 0.2 times the lead track momentum |
| ByTightMuonRejection3 | yes | tau lead track not matched to more than one segment or hits in the outermost two stations of the muon system, energy deposit in ECAL + HCAL at least 0.2 times the lead track momentum |
| ByMVA(Loose/Medium/Tight)MuonRejection | yes | BDT-based anti-muon discriminator |
| ByMVArawMuonRejection | no | raw MVA output of the BDT-based anti-muon discriminator |
| ByDecayModeFinding | yes | you will always want to use this (see AN-10-82) |
| ByVLooseIsolation | yes | isolation cone of 0.3, no PF charged candidates with pT > 1.5 GeV/c and no PF gamma candidates with ET > 2.0 GeV |
| ByVLooseCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.3, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 3 GeV |
| ByLooseCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.5, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 2 GeV |
| ByMediumCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.5, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 1 GeV |
| ByTightCombinedIsolationDBSumPtCorr | yes | isolation cone of 0.5, Delta-Beta-corrected sum pT of PF charged and PF gamma isolation candidates (pT > 0.5 GeV) less than 0.8 GeV |
| ByLooseCombinedIsolationDBSumPtCorr3Hits | yes | same as ByLooseCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on the track of isolation candidates |
| ByMediumCombinedIsolationDBSumPtCorr3Hits | yes | same as ByMediumCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on the track of isolation candidates |
| ByTightCombinedIsolationDBSumPtCorr3Hits | yes | same as ByTightCombinedIsolationDBSumPtCorr but requiring 3 hits (instead of 8) on the track of isolation candidates |
| By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3oldDMwoLT | yes | BDT-based tau ID discriminator based on isolation pT sums, trained on 1-prong and 3-prong tau candidates |
| ByIsolationMVA3oldDMwoLTraw | no | raw MVA output of the BDT-based tau ID discriminator based on isolation pT sums, trained on 1-prong and 3-prong tau candidates |
| By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3oldDMwLT | yes | BDT-based tau ID discriminator based on isolation pT sums plus tau lifetime information, trained on 1-prong and 3-prong tau candidates |
| ByIsolationMVA3oldDMwLTraw | no | raw MVA output of the BDT-based tau ID discriminator based on isolation pT sums plus tau lifetime information, trained on 1-prong and 3-prong tau candidates |
| By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3newDMwoLT | yes | BDT-based tau ID discriminator based on isolation pT sums, trained on 1-prong, "2-prong" and 3-prong tau candidates |
| ByIsolationMVA3newDMwoLTraw | no | raw MVA output of the BDT-based tau ID discriminator based on isolation pT sums, trained on 1-prong, "2-prong" and 3-prong tau candidates |
| By(VLoose/Loose/Medium/Tight/VTight/VVTight)IsolationMVA3newDMwLT | yes | BDT-based tau ID discriminator based on isolation pT sums plus tau lifetime information, trained on 1-prong, "2-prong" and 3-prong tau candidates |
| ByIsolationMVA3newDMwLTraw | no | raw MVA output of the BDT-based tau ID discriminator based on isolation pT sums plus tau lifetime information, trained on 1-prong, "2-prong" and 3-prong tau candidates |
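The MVA3 isolation rows above follow a regular pattern: each of the four trainings (oldDM/newDM, with/without lifetime information) provides six working points plus one raw output. A small sketch that enumerates the full set of names from that pattern (the helper is illustrative, not a CMSSW API):

```python
# Enumerate the MVA3 isolation discriminator names from the pattern
# By<WP>IsolationMVA3<training> plus ByIsolationMVA3<training>raw,
# as listed in the table above.
WORKING_POINTS = ["VLoose", "Loose", "Medium", "Tight", "VTight", "VVTight"]
TRAININGS = ["oldDMwoLT", "oldDMwLT", "newDMwoLT", "newDMwLT"]

def mva3_discriminators():
    names = []
    for training in TRAININGS:
        for wp in WORKING_POINTS:
            names.append("By%sIsolationMVA3%s" % (wp, training))
        names.append("ByIsolationMVA3%sraw" % training)  # raw BDT output
    return names

print(len(mva3_discriminators()))  # 4 trainings x (6 working points + 1 raw) = 28
```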