This is a frozen copy of the SWGuideGlobalHLT twiki page, kept as a reference for older CMSSW releases and menus
Combining all groups of HLT triggers in a Global Table
The HLT runs off raw data files - specifically, off the single FEDRawDataCollection data structure; it can NOT be run off RECO or AOD files, which do not keep this data structure. The HLT algorithms perform (regional) reconstruction as needed, and make trigger decisions. The list of HLT path names is reported on the TriggerTables page.
How to run the production HLT within CMSSW
This part of the wiki describes the instructions on how to run the HLT contained in the CMSSW release you are using; no access to ConfDB is required.
This mode is typically used for large-scale Monte-Carlo event production.
Preparing a working area
The generic procedure to run the HLT in your favorite CMSSW release, for example CMSSW_5_3_23 (run-1 legacy release), or CMSSW_7_3_1 (release towards run-2), is as follows (written here for the most recent CMSSW_7_3_X release):
cmsrel CMSSW_7_3_2_patch1
cd CMSSW_7_3_2_patch1/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
./cmsDriver.csh # execute the shell script which creates various ready-to-run cfg files
Using frozen Run-1 or Run-2 trigger menus
CMSSW releases typically contain several HLT menus inside HLTrigger/Configuration: at a minimum a menu for proton-proton collisions (called GRun), a menu for lead-lead heavy-ion collisions (called HIon), and a menu for proton-lead collisions (called PIon). Depending on the release, these menus are either for the Run-1 or the Run-2 data-taking period. For Run-1 studies, you should use the Run-1 legacy release CMSSW_5_3_26. For Run-2 studies, you need to use the most recent CMSSW release, currently CMSSW_7_3_2.
A few checkpoint pp HLT menus used during LHC Run-1 are frozen and put into recent 53X, 62X, 70X, 71X, 72X, 73X CMSSW releases (in addition to the GRun, HIon and PIon menus discussed above). In the Run-1 legacy release CMSSW 53X, you'll find the 2011 5E33 menu and the 2012 menus 5E33v4, 7E33v2, 7E33v3, 7E33v4 and 8E33v2. In the later CMSSW releases 62X, 70X, 71X, 72X, used for the preparation of Run-2, we only keep the most recent 2012 pp menu, 8E33v2 (aka 2013) or 2014 (including half-rate paths), as a frozen menu. From CMSSW 73X onward, we have dropped all Run-1 menus.
Ready-to-run cmsRun cfg files such as RelVal_HLT_GRun.py are produced using cmsDriver: simply execute the shell script ../test/cmsDriver.csh to create various workflow cfg files. (The cmsDriver parameter syntax to select a specific HLT menu is -s HLT:GRun; using -s HLT only defaults to the GRun menu. These frozen menus are also accessible via the cmsDriver command line; see the script cmsDriver.sh in HLTrigger/Configuration/test for examples.)
For example, edit the pp config file RelVal_HLT_GRun.py to use your favorite raw-data input file and run it with cmsRun:
cmsRun RelVal_HLT_GRun.py
To create a proper raw-data input file from a MC GEN-SIM file, in case the raw data file is no longer available somewhere, use cmsRun RelVal_DigiL1Raw_XXXX.py. Edit this config file to specify your Monte Carlo input file (e.g., from CASTOR, EOS, or locally on your PC).
For experts: The script "./getHLT.csh" creates the self-contained python cfg files ../test/OnLine_HLT_XXXX_cfg.py, as well as the python cff files ../python/HLT_XXX_cff.py for use with cmsDriver, using ConfDB. The self-contained cfg files ../test/OnLine_HLT_XXXX_cfg.py must yield identical HLT results to the cmsDriver-generated cfg files RelVal_HLT_XXXX.py. This tests whether the online and cmsDriver configurations match properly, as the former are used for real data and the latter for MC simulation.
Below you'll find the instructions for specific CMSSW releases.
Check for bug fixes for the L1 trigger here:
SWGuideL1TriggerTags
PostLS1 Run-2 workflows in CMSSW_6_2_5+ used to produce TSG samples
Note: you need to use release CMSSW_6_2_5 or later; earlier 6_2_X releases contain HLT bugs.
The Fall-13 TSG production (prepid TSG-Fall13dr-00001) has been made using CMSSW_6_2_5 with globaltag POSTLS162_V2::All. See here for the production setup.
1) GEN-SIM step:
cmsDriver.py TTbar_Tauola_13TeV_cfi --conditions POSTLS162_V2::All -n 10 --eventcontent FEVTDEBUG --relval 9000,100 -s GEN,SIM --datatier GEN-SIM --customise SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1 --geometry Extended2015 --magField 38T_PostLS1 --no_exec --fileout file:step1.root
2) DIGI,L1,DIGI2RAW,HLT step:
Simple case: if no VALIDATION step is needed subsequently:
cmsDriver.py step2 --conditions POSTLS162_V2::All -n 10 --eventcontent FEVTDEBUGHLT -s DIGI,L1,DIGI2RAW,HLT --datatier GEN-SIM-DIGI-RAW --customise SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1 --geometry Extended2015 --magField 38T_PostLS1 --no_exec --filein file:step1.root --fileout file:step2.root
Extended case: if a subsequent step with VALIDATION is needed (you need to use DIGI:pdigi_valid as the DIGI step):
cmsDriver.py step2 --conditions POSTLS162_V2::All -n 10 --eventcontent FEVTDEBUGHLT -s DIGI:pdigi_valid,L1,DIGI2RAW,HLT --datatier GEN-SIM-DIGI-RAW --customise SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1 --geometry Extended2015 --magField 38T_PostLS1 --no_exec --filein file:step1.root --fileout file:step2.root
3) RAW2DIGI,L1Reco,RECO,EI,VALIDATION,DQM step:
cmsDriver.py step3 --conditions POSTLS162_V2::All -n 10 --eventcontent FEVTDEBUGHLT,DQM -s RAW2DIGI,L1Reco,RECO,EI,VALIDATION,DQM --datatier GEN-SIM-RECO,DQM --customise SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1 --geometry Extended2015 --magField 38T_PostLS1 --no_exec --filein file:step2.root --fileout file:step3.root
4) HARVESTING step:
cmsDriver.py step4 -s HARVESTING:validationHarvesting+dqmHarvesting --customise SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1 --geometry Extended2015 --mc --magField 38T_PostLS1 -n 100 --no_exec --filein file:step3_inDQM.root --fileout file:step4.root --conditions POSTLS162_V2::All
The customisation for the HLT process can also be added by hand in case you use your own cfg file; simply add at the end:
from SLHCUpgradeSimulations.Configuration.postLS1Customs import *
process = customise_HLT( process )
Look at the imported file for the customisations of other steps in the cmsDriver workflows.
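For instance, if your own cfg file covers more than the HLT step, the full post-LS1 customisation used with cmsDriver elsewhere on this page can presumably be applied by hand in the same way (a sketch, assuming customisePostLS1 is the function exposed by that module, as the --customise option in the cmsDriver commands above suggests):
from SLHCUpgradeSimulations.Configuration.postLS1Customs import *
process = customisePostLS1( process )  # applies the post-LS1 customisations to all steps present in the process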
Note that the customisation of the HLT menu itself is no longer needed in 72X/73X, as there the HLT menu is the Run-2 (development) menu; it is only needed if you use cmsDriver.py to run the HLT, instead of using hltGetConfiguration to make a standalone cfg file.
Running the HLT with CMSSW_7_2_X and 7_3_X on the Fall'13 (6.2.x) TSG samples
The original global tags used to produce the Fall'13 TSG samples (produced with CMSSW_6_2_5 as explained above), were:
- for 25 ns samples: POSTLS162_V2::All
- for 50 ns samples: POSTLS162_V1::All
To run the HLT with CMSSW_7_2_X on the Fall'13 TSG samples (produced with CMSSW_6_2_5 as explained above), one should use the following global tags to use conditions that better match those used in the production of the MC samples:
Nota Bene: Since CondDB v2 is the default now for CMSSW >= 7.2.0, the suffix "::All" must be dropped from the name of the GT.
Note that post-LS1 samples produced with releases older than 7.2.0 do not pack the CSC digis in the raw data, but include them directly from the simulation step.
When using CMSSW 7.2.X or newer to process such samples, one needs to adapt the HLT configuration to read the simulated CSC digis, adding at the end of the HLT configuration:
process.hltCsc2DRecHits.wireDigiTag = cms.InputTag("simMuonCSCDigis","MuonCSCWireDigi")
process.hltCsc2DRecHits.stripDigiTag = cms.InputTag("simMuonCSCDigis","MuonCSCStripDigi")
Caveat: these instructions are incomplete.
- missing details about running the 2015 L1 emulator/menu/prescales/masks: STEAM will provide L1 skim files!
- missing details about repacking CSC and L1 digis when processing real data (a.k.a. L1REPACK for data)
Running the HLT with CMSSW_7_2_X and 7_3_X on the Spring'14 (7.0.x) TSG samples
The original global tags used to produce the Spring'14 TSG samples (produced with CMSSW_7_0_X), were:
- for 25 ns samples: POSTLS170_V5::All
- for 50 ns samples: POSTLS170_V6::All
To run the HLT with CMSSW_7_2_X on the Spring'14 TSG samples, one should use the Phys'14 global tags:
Here is the comparison between POSTLS170_V7 and PHYS14_25_V1, except for the compatibility tags that have been added or removed:
| Record | POSTLS170_V7 | PHYS14_25_V1 |
| Geometry and detectors | | |
| CSCRecoDigiParametersRcd | CSCRECODIGI_Geometry_50YV3 | CSCRECODIGI_Geometry_2015_71YV3 |
| EcalTBWeightsRcd | EcalTBWeights_EBEE_BeamComm09_MC | EcalTBWeights_3p5_time_mc |
| GeometryFileRcd#Extended | XMLFILE_Geometry_62YV6_Extended_mc | XMLFILE_Geometry_2015_72YV5_Extended2015_mc |
| GeometryFileRcd#Ideal | XMLFILE_Geometry_62YV6_Ideal_mc | XMLFILE_Geometry_2015_71YV5_Ideal_mc |
| PGeometricDetExtraRcd | TKExtra_Geometry_62YV6 | TKExtra_Geometry_2015_71YV3 |
| RPCClusterSizeRcd | RPCClusterSize_Upscope_mc | RPCClusterSize_2012D_mc |
| L1 | | |
| L1MuDTEtaPatternLutRcd | L1MuDTEtaPatternLut_091022_v1 | L1MuDTEtaPatternLut_V120924_v1 |
| L1MuDTExtLutRcd | L1MuDTExtLut_091022_v1 | L1MuDTExtLut_V120924_v1 |
| L1MuDTPhiLutRcd | L1MuDTPhiLut_091022_v1 | L1MuDTPhiLut_V120924_v1 |
| L1MuDTPtaLutRcd | L1MuDTPtaLut_091022_v1 | L1MuDTPtaLut_V120924_v1 |
| L1MuDTQualPatternLutRcd | L1MuDTQualPatternLut_091022_v1 | L1MuDTQualPatternLut_V120924_v1 |
| L1RPCBxOrConfigRcd | L1RPCBxOrConfig_LHC8_mc | L1RPCBxOrConfig_LHC9_mc |
| L1RPCConeDefinitionRcd | L1RPCConeDefinition_LHC8_mc | L1RPCConeDefinition_LHC9_mc |
| L1RPCConfigRcd | L1RPCConfig_LHC8_mc | L1RPCConfig_LHC9_mc |
| L1RPCHsbConfigRcd | L1RPCHsbConfig_LHC8_mc | L1RPCHsbConfig_LHC9_mc |
| Offline JECs | | |
| JetCorrectionsRecord#AK5Calo | JetCorrectorParametersCollection_Fall12_V5_MC_AK5Calo | JetCorrectorParametersCollection_CSA14_V4_MC_AK5Calo |
| JetCorrectionsRecord#AK5PF | JetCorrectorParametersCollection_Fall12_V5_MC_AK5PF | JetCorrectorParametersCollection_CSA14_V4_MC_AK5PF |
| JetCorrectionsRecord#AK5PFchs | JetCorrectorParametersCollection_Fall12_V5_MC_AK5PFchs | JetCorrectorParametersCollection_CSA14_V4_MC_AK5PFchs |
| JetCorrectionsRecord#AK7Calo | JetCorrectorParametersCollection_Fall12_V5_MC_AK7Calo | JetCorrectorParametersCollection_CSA14_V4_MC_AK7Calo |
| JetCorrectionsRecord#AK7PF | JetCorrectorParametersCollection_Fall12_V5_MC_AK7PF | JetCorrectorParametersCollection_CSA14_V4_MC_AK7PF |
| JetCorrectionsRecord#AK7PFchs | JetCorrectorParametersCollection_Fall12_V5_MC_AK7PFchs | JetCorrectorParametersCollection_CSA14_V4_MC_AK7PFchs |
| Other | | |
| BTagTrackProbability2DRcd | TrackProbabilityCalibration_2D_2012_MC | TrackProbabilityCalibration_2D_MC53X_v3 |
| BTagTrackProbability3DRcd | TrackProbabilityCalibration_3D_2012_mc | TrackProbabilityCalibration_3D_MC53X_2011_v1 |
| BTauGenericMVAJetTagComputerRcd | MVAJetTags_v2_mc | MVAComputerContainer_53X_JetTags_v3 |
For running the HLT, we do not care about the Offline JECs and Other conditions, and we do want the latest L1 conditions.
For what concerns the Geometry and detectors payloads:
- according to AlCa, CSCRecoDigiParametersRcd should have the same content, just with a different name;
- according to E/Gamma, we should use the updated EcalTBWeightsRcd payload, for the ECAL timing reconstruction;
- according to AlCa, the GeometryFileRcd, PGeometricDetExtraRcd, and RPCClusterSizeRcd are only used for the SIM and DIGI steps, so they will not affect running from RAW.
Nota Bene: Since CondDB v2 is the default now for CMSSW >= 7.2.0, the suffix "::All" must be dropped from the name of the GT.
Note that post-LS1 samples produced with releases older than 7.2.0 do not pack the CSC digis in the raw data, but include them directly from the simulation step.
When using CMSSW 7.2.X or newer to process such samples, one needs to adapt the HLT configuration to read the simulated CSC digis, adding at the end of the HLT configuration:
process.hltCsc2DRecHits.wireDigiTag = cms.InputTag("simMuonCSCDigis","MuonCSCWireDigi")
process.hltCsc2DRecHits.stripDigiTag = cms.InputTag("simMuonCSCDigis","MuonCSCStripDigi")
Caveat: these instructions are incomplete.
- missing details about running the 2015 L1 emulator/menu/prescales/masks: STEAM will provide L1 skim files!
- missing details about repacking CSC and L1 digis when processing real data (a.k.a. L1REPACK for data)
Running the HLT with CMSSW_7_0_X and 7_1_X on the Fall'13 (6.2.x) TSG samples
The original global tags used to produce the Fall'13 TSG samples (produced with CMSSW_6_2_5 as explained above), were:
- for 25 ns samples: POSTLS162_V2::All
- for 50 ns samples: POSTLS162_V1::All
To run the HLT with CMSSW_7_0_X on the Fall'13 TSG samples (produced with CMSSW_6_2_5 as explained above), one should use the following global tags to use conditions that better match those used in the production of the MC samples:
- for 25 ns samples: POSTLS170_V3::All
- for 50 ns samples: POSTLS170_V4::All
To run the HLT with CMSSW_7_1_0_pre8 and later on the Fall'13 TSG samples (produced with CMSSW_6_2_5 as explained above), one should use the following global tags to use conditions that better match those used in the production of the MC samples:
- for 25 ns samples: PRE_LS171_V5A::All
- for 50 ns samples: PRE_LS171_V6A::All
Caveat: these global tags are probably affected by a problem in HCAL Barrel.
While we wait for STEAM to check the impact of this problem, you can force your job to load the correct payload adding this snippet at the end of the HLT configuration:
process.GlobalTag.toGet.append(
    cms.PSet(
        record  = cms.string( 'HcalRecoParamsRcd' ),
        label   = cms.untracked.string( '' ),
        connect = cms.untracked.string( 'frontier://FrontierProd/CMS_COND_44X_HCAL' ),
        tag     = cms.string( 'HcalRecoParams_v8.0_mc' )
    )
)
See also
here.
The customisation of the
HLT process
is needed in CMSSW 6_2_X, 7_0_X, and 7_1_X:
from SLHCUpgradeSimulations.Configuration.postLS1Customs import *
process = customise_HLT( process )
Running the HLT with CMSSW_7_0_X and 7_1_X on the Spring'14 (7.0.x) TSG samples
To run the HLT with CMSSW_7_0_X on the Spring'14 TSG samples, one should use the same global tags used in the production of the MC samples:
- for 25 ns samples: POSTLS170_V5::All (or the updated POSTLS170_V7::All)
- for 50 ns samples: POSTLS170_V6::All
Nota bene: there is also a special global tag, with "good" detector conditions (same as those for 25 ns), but for 50 ns:
POSTLS170_V6A::All
To run the HLT with CMSSW_7_1_0_pre8 and later on the Spring'14 TSG samples one should use the following global tags to use conditions that better match those used in the production of the MC samples:
- for 25 ns samples: PRE_LS171V9A::All
- for 50 ns samples: PRE_LS171V10A::All
See also
here.
The customisation of the
HLT process
is needed in CMSSW 6_2_X, 7_0_X, and 7_1_X:
from SLHCUpgradeSimulations.Configuration.postLS1Customs import *
process = customise_HLT( process )
Running the HLT with CMSSW_7_0_X on the 5.3.x TSG samples
To run the HLT with CMSSW_7_0_X on the Spring'14 TSG samples, one should use the same global tags used in the production of the MC samples:
- for 25 ns and 50 ns samples: START70_V6A::All
The 5.3.x samples use the Run 1 geometry, as such no customisation of the
HLT process
is needed.
Trigger development for Run-2
This part of the wiki describes how to develop new triggers and test them. In case you want your development integrated into the CMS trigger, pay attention to the workflow achieving this goal, documented in detail here.
Specifically, note that the current CMS development release is 74X, while trigger development uses the stable platform of 73X releases. Thus, for any new C++ code needed for your development, you need to provide a pull request for 74X and a pull request for 73X. For added parameters and functionality, you must use either a fillDescriptions(.) method or existsAs<.> to take care of older python configs not yet containing the new parameters. The new parameters' default values must be set such that the old behaviour of the module is reproduced. Also, the result of the code must not depend on whether a parameter is present in the python config or not, only on its default (when missing) or explicitly given (when present) value. This allows us to integrate code developments independently of (and earlier than) updating the HLT menu (python config) itself with your changes, as the python update requires ConfDB parsing to discover the new parameters, followed by a migration of the HLT menu within ConfDB to the newly parsed code template.
Preparing a menu in ConfDB
The pp development menu for Run-2 is located in /dev/CMSSW_7_3_0/GRun, derived from the master table in /dev/CMSSW_7_3_0/HLT: have a look at the recent developments of the CMSSW_7_3_0 Run-2 menu. For the development of modified or new paths, you should start from the most recent GRun table: copy it to your user area in ConfDB and use it as a starting point for your development. You may remove paths irrelevant for your task, but make sure you keep the HLTriggerFirstPath and HLTriggerFinalPath from the master or GRun table as the first and last path in your area, the latter just before any Endpath. The HLT menu is continuously developed, both in terms of physics and of technical infrastructure such as tracking or muon reconstruction. Thus, if you take too long to develop, test and submit your request for integration, we will ask you to re-implement your development based on the most recent master/GRun table.
Preparing a CMSSW developer area
The generic procedure is similar to creating a working area. Development in 2014/2015 must by now use CMSSW_7_3_X (this is the release cycle compatible with the new 2015 L1 menu and seeds, and it includes the most recent L1T developments). You need to use CMSSW_7_3_1 or later!
cmsrel CMSSW_7_3_1_patch2
cd CMSSW_7_3_1_patch2/src
cmsenv
git cms-addpkg HLTrigger/Configuration
#
# The following git command merges our (73X) development branch:
# 1) Sets the v2 L1 menu (L1Menu_Collisions2015_25ns_v2) as the default L1 menu
# (but you still need to rerun the L1 emulator in case you process 72X or earlier files)
# 2) Dumps into HLTrigger/Configuration the most-recent tested HLT menus, as also integrated in the CMSSW_7_4_X release series
#
git cms-merge-topic cms-tsg-storm:firstHltMenuFoL1MenuCollisions201525nsV2_73X
scram build
cd HLTrigger/Configuration/test
./cmsDriver.csh # execute the shell script which creates various ready-to-run cfg files
Note: Please, do not waste time using 7_1_X/7_2_X releases or older 7_3_X (pre)-releases for run-2 development.
Dumping the latest HLT menu configuration
Dumping a self-contained cfg file for direct use with cmsRun
The GRun/HIon/PIon configurations dumped into HLTrigger/Configuration always work in the corresponding release (the HIon and PIon tables are simplistic placeholders in the Run-2 template menu). Development of these menus takes place in ConfDB. In case you live on the bleeding edge and want to try the most recent integrated table from ConfDB, dump it with:
# Running on real data (reset HLT prescales all to 1)
hltGetConfiguration /dev/CMSSW_7_3_0/GRun --full --offline --data --unprescale --process TEST --globaltag auto:run2_hlt_GRun > hlt.py
or:
# Running on Monte Carlo simulation (reset HLT prescales all to 1)
hltGetConfiguration /dev/CMSSW_7_3_0/GRun --full --offline --mc --unprescale --process TEST --globaltag auto:run2_mc_GRun > hlt.py
Besides the --data versus --mc difference, note the different GlobalTag for processing real data versus Monte Carlo simulation. In case you want to process the special 62X or 70X TSG samples, use the special GTs as described above, and add the instructions to run the new L1 as described below.
To run your own menu containing only HLT paths (and possibly additional ESModules), but using the services and event setup from the master configuration, use (for real data):
- extract the configurations (resetting all HLT prescales to 1 - needed in both steps below!)
edmConfigFromDB --cff --configName /dev/CMSSW_7_3_0/GRun --nopaths --services -PrescaleService,-EvFDaqDirector,-FastMonitoringService > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST --globaltag auto:run2_hlt_GRun > hlt.py
- add to hlt.py this line, just after process = cms.Process( "HLT" ):
process.load("setup_cff")
In case of Monte-Carlo simulation, replace --data by --mc and --globaltag auto:run2_hlt_GRun by --globaltag auto:run2_mc_GRun. In case you want to process the special 62X or 70X TSG samples, use the special GTs as described above, and add the instructions to run the new L1 as described in the following. Remember: HLTriggerFirstPath,HLTriggerFinalPath as first and last path, the latter just before any endpath, in your own area!
Adding and running the new L1 emulator and new L1 menu
You need to use CMSSW_7_3_0 or later as described above. This is only needed to process raw files made in earlier releases, such as the TSG 62X and 70X samples.
Those releases emulated the old L1, so we need to add instructions to emulate the new L1:
Append --l1-emulator 'stage1,gt' --l1Xml L1Menu_Collisions2015_25ns_v2_L1T_Scales_20141121_Imp0_0x1030.xml to the hltGetConfiguration command, like this:
hltGetConfiguration /dev/CMSSW_7_3_0/GRun --full --offline --mc --unprescale --process TEST --globaltag auto:run2_mc_GRun --l1-emulator 'stage1,gt' --l1Xml L1Menu_Collisions2015_25ns_v2_L1T_Scales_20141121_Imp0_0x1030.xml > hlt_stage1.py
In case you want to process the special 62X or 70X TSG samples, use the special GTs as described
above.
Dumping a configuration-fragment cff file for use with cmsDriver.py
If your own ConfDB area contains ONLY paths, and at most a subset of (possibly modified) setup modules, with the rest to be taken from the standard setup,
prepare the cff dump as follows:
# get standard setup
hltGetConfiguration --cff --offline /dev/CMSSW_7_3_0/GRun --paths HLTriggerFirstPath,HLTriggerFinalPath --unprescale > HLT_User_cff.py
# append user menu
hltGetConfiguration --cff --offline /your/test/menu/with/only/paths >> HLT_User_cff.py
Since your modules get appended to the cff file, they will overwrite those existing in the standard setup.
Despite the first hltGetConfiguration, you still need to have HLTriggerFirstPath,HLTriggerFinalPath as first and last path, the latter just before any endpath, in your own area!
If your area contains all setup modules required for your paths, prepare the cff dump in one go:
hltGetConfiguration --cff --offline /your/fully/self/contained/area > HLT_User_cff.py
Remember:
HLTriggerFirstPath,HLTriggerFinalPath
as first and last path, the latter just before any endpath, in your own area!
Move the resulting file HLT_User_cff.py to src/HLTrigger/Configuration/python and run scram b to compile it.
Now you can use cmsDriver.py, specifying the HLT step as --step=HLT:User. Remember to specify the correct GlobalTag and any required PostLS1 customisation on the cmsDriver.py command line, such as: --customise=SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1.
If you need to rerun L1, to get the new L1, use a GEN-SIM file as input and specify: --step DIGI,L1,DIGI2RAW,HLT:User --customise=SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1.
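Putting the pieces together, a hedged example of such a cmsDriver command (input/output file names and the event count are placeholders; the flags are the ones used in the workflows earlier on this page, and the auto:run2_mc_GRun symbolic GlobalTag from the next subsection is assumed to be resolvable by cmsDriver - otherwise pass an explicit GlobalTag):
cmsDriver.py stepHLT --step DIGI,L1,DIGI2RAW,HLT:User --conditions auto:run2_mc_GRun --mc -n 10 --eventcontent FEVTDEBUGHLT --datatier GEN-SIM-DIGI-RAW --customise=SLHCUpgradeSimulations/Configuration/postLS1Customs.customisePostLS1 --filein file:gensim.root --fileout file:stepHLT.root --no_exec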
Using an up-to-date GlobalTag
The GlobalTag currently in the menu is for online running. Offline, you should use a globaltag matching the data or MC samples being used. To first approximation, one can use a "symbolic" GlobalTag which is defined in
Configuration/AlCa/python/autoCond.py
:
For real data:
hltGetConfiguration ... --data --globaltag auto:run2_hlt_GRun ....
For Monte-Carlo simulation:
hltGetConfiguration ... --mc --globaltag auto:run2_mc_GRun ....
For PostLS1 Monte-Carlo simulation:
hltGetConfiguration ... --mc --globaltag auto:run2_mc_GRun ...
For PostLS1 Monte-Carlo simulation at 50ns:
hltGetConfiguration ... --mc --globaltag auto:upgradePLS150ns ....
If needed for your studies, you can also provide an explicit GlobalTag as parameter for
--globaltag
.
See
above for the special GTs to be used with the special TSG samples!
Caveat: please note that, in general, there is no guarantee that using a different CMSSW release will reproduce the same HLT results.
The global tag is just one of the things that may affect the results.
Testing your HLT paths
To test new paths before submitting them for integration, create an empty configuration with all the paths you are going to submit, and run the
hltIntegrationTests
script, either on real data:
hltIntegrationTests /your/test/menu/with/only/paths -s /dev/CMSSW_7_3_0/HLT -i some/input/data -x "--globaltag auto:run2_hlt_GRun"
or on MC:
hltIntegrationTests /your/test/menu/with/only/paths -s /dev/CMSSW_7_3_0/HLT -i some/input/data --mc -x "--globaltag auto:run2_mc_GRun"
Please make sure that for all paths tested at least one event gets accepted in the test (you must check the log output for that).
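One hedged way to perform this check, assuming the cmsRun output was captured in a log file (called hlt.log here) and that the framework summary is enabled in the dump, is to scan the TrigReport path summary for non-zero "Passed" counts:
grep "TrigReport" hlt.log | grep "HLT_"   # every tested path should show a non-zero entry in the Passed column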
The -s option is used to tell the script to take the setup (ESSources, ESProducers, Services) from the master table, so you don't need to include them in your area. You do need to include any new module you may need.
The -i option is used to specify an input file, in any format understood by the PoolSource. It can be a local file (file:...), a file on CASTOR identified by its LFN (/store/...), or any other supported format.
The -x "--globaltag auto:run2_hlt_GRun" or -x "--globaltag auto:run2_mc_GRun" option sets the GlobalTag for real data or Monte-Carlo simulation, respectively. For the latter, also add --mc on the hltIntegrationTests command line.
For more options, see hltIntegrationTests -h.
Submission for integration into CMSSW
Follow the instructions given
here.
frozen - preparation of the 2012/2013 HLT menues (pp and pPb collisions)
Studies for the 2012/2013 menus should be done using these instructions.
Preparing a working area
The 5e33 menu run in CMSSW 42X on the filter farm at the end of 2011 pp data taking has been ported to CMSSW 44X and 5YZ. The final HIon menu run in CMSSW 44X on the filter farm at the end of 2011 has also been ported to CMSSW 5YZ. These menus serve as the starting point for HLT development toward 2012 data taking.
Configure your new paths in your user area in ConfDB, using code template CMSSW_5_2_8. For reference, the master table is at /dev/CMSSW_5_2_6/HLT, while the GRun subtable (pp collisions) is at /dev/CMSSW_5_2_6/GRun, the HIon subtable (heavy-ion collisions) is at /dev/CMSSW_5_2_6/HIon, and the new PIon subtable (proton-ion collisions) is at /dev/CMSSW_5_2_6/PIon. Make sure you include the HLTriggerFinalPath from the master table as the last path in your area, just before any Endpath.
Since we are using CMSSW 52X+ for data taking in 2012/2013, make a developer area based on the most recent 52X (HLT) release, currently
CMSSW_5_2_9
.
export SCRAM_ARCH=slc5_amd64_gcc462 # if using bash
setenv SCRAM_ARCH slc5_amd64_gcc462 # if using [t]csh
cmsrel CMSSW_5_2_9
cd CMSSW_5_2_9/src
cmsenv
addpkg HLTrigger/Configuration
checkdeps -a
scram b -j4
hash -r # using bash
rehash # using [t]csh
Dumping the latest configuration
- The GRun and HIon configurations dumped into the HLTrigger/Configuration cvs tag listed above are kept working as much as possible. Development of these menus takes place in ConfDB.
- In case you live on the bleeding edge and want to try the most-recent integrated table from ConfDB, dump it with:
hltGetConfiguration /dev/CMSSW_5_2_6/GRun --full --offline --data --unprescale --process TEST --globaltag auto:hltonline > hlt.py
- To run your own table containing only HLT paths (and possibly additional ESModules), using the services and event setup from the master configuration, use:
- extract the configurations:
edmConfigFromDB --cff --configName /dev/CMSSW_5_2_6/GRun --nopaths > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST --globaltag auto:hltonline > hlt.py
-
- add to
hlt.py
this line, just after process = cms.Process( "HLT" )
process.load("setup_cff")
Using an up-to-date GlobalTag
The GlobalTag currently in the menu is for online running. Offline, you should use a "symbolic" GlobalTag.
For data:
hltGetConfiguration ... --globaltag auto:hltonline_GRun
For MC:
hltGetConfiguration ... --globaltag auto:startup_GRun
Caveat: please note that, in general, there is no guarantee that using a different CMSSW release will reproduce the same HLT results.
The global tag is just one of the things that may affect the results.
Testing your paths
To test new paths before submitting them for integration, create an empty configuration with all the paths you are going to submit, and run the
hltIntegrationTests
script:
hltIntegrationTests /your/test/menu/with/only/paths -s /dev/CMSSW_5_2_6/HLT -i some/input/data -x "--globaltag auto:hltonline"
Please, make sure that for
all paths tested
at least one event gets accepted in the test (you must check the log output for that).
The -s option is used to tell the script to take the setup (ESSources, ESProducers, Services) from the master table, so you don't need to include them in your area. You do need to include any new module you may need.
The -i option is used to specify an input file, in any format understood by the PoolSource. It can be a local file (file:...), a file on CASTOR identified by its LFN (/store/...), or any other supported format.
The -x "--globaltag auto:hltonline" option is required because the current HLT menu has a GlobalTag which only works with CMSSW 5.1.x; it will not be necessary once we have a 5.2.x-compliant GlobalTag.
For more options, see hltIntegrationTests -h.
Emulating the effect of the GCT 5 GeV jet seed threshold
Add this at the bottom of the output of hltGetConfiguration:
## GCT jet seed thresholds
JetFinderCentralJetSeed = 5.0
JetFinderForwardJetSeed = 5.0
## run GT emulator from RAW data (use RawToDigi_cff for MC)
process.load( 'Configuration.StandardSequences.RawToDigi_Data_cff' )
process.load( 'Configuration.StandardSequences.SimL1Emulator_cff' )
## override the GCT GT payload
process.load('L1TriggerConfig.GctConfigProducers.l1GctConfig_cfi')
process.L1GctConfigProducers.JetFinderCentralJetSeed = cms.double(JetFinderCentralJetSeed)
process.L1GctConfigProducers.JetFinderForwardJetSeed = cms.double(JetFinderForwardJetSeed)
import L1Trigger.Configuration.L1Trigger_custom
process = L1Trigger.Configuration.L1Trigger_custom.customiseL1GtEmulatorFromRaw( process )
process = L1Trigger.Configuration.L1Trigger_custom.customiseResetPrescalesAndMasks( process )
## customize the HLT to use the emulated results
import HLTrigger.Configuration.customizeHLTforL1Emulator
process = HLTrigger.Configuration.customizeHLTforL1Emulator.switchToL1Emulator( process, newGmtSetting = False, newGctSetting = True)
process = HLTrigger.Configuration.customizeHLTforL1Emulator.switchToSimGtReEmulGctDigis( process )
Running the new 2012 L1 menu
The 2012 L1 menus (v0, v1, v2, and v3) are available in the DB, and part of the corresponding static globaltags for Monte-Carlo (the one for v2 is being made).
If you run on older data and/or MC, you need to run the emulator for the new L1 decisions:
- tell
hltGetConfiguration
to rerun the L1 emulator and pass the new L1 menu
hltGetConfiguration ... --l1-emulator --l1 L1GtTriggerMenu_L1Menu_Collisions2012_v3_mc
- tell
hltIntegrationTests
to rerun the L1 emulator and pass the new L1 menu
hltIntegrationTests ... -x "--l1-emulator" -x "--l1 L1GtTriggerMenu_L1Menu_Collisions2012_v3_mc"
Running the 2011 L1 menu
The most recent 2011 L1 menu is available in the most recent globaltags. If you need to reprocess older data or MC, originally processed with an older 2011 L1 menu, then you need to run the L1 emulator to get the new L1 decisions.
- tell
hltGetConfiguration
to rerun the L1 emulator and pass the most recent L1 menu
hltGetConfiguration ... --l1-emulator --l1 L1GtTriggerMenu_L1Menu_Collisions2011_v6_mc
- tell
hltIntegrationTests
to rerun the L1 emulator, and pass the new L1 menu
hltIntegrationTests ... -x "--l1-emulator" -x "--l1 L1GtTriggerMenu_L1Menu_Collisions2011_v6_mc"
Running on MC samples
To create python dumps suitable for running on MC, use the --mc option in place of --data with hltGetConfiguration.
To run on MC, you should use the "startup" GlobalTag.
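For example, combining the options above, a possible dump of the GRun menu for running on MC (flags as used elsewhere in this section) would be:
hltGetConfiguration /dev/CMSSW_5_2_6/GRun --full --offline --mc --unprescale --process TEST --globaltag auto:startup_GRun > hlt.py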
frozen - running the 2011 HIon menu
Configure your new paths in your user area in ConfDB, using code template CMSSW_4_4_0_HLT17. For reference, the master table is at /dev/CMSSW_4_4_2/HLT while the HIon subtable is at /dev/CMSSW_4_4_2/HIon. Make sure you include the HLTriggerFinalPath from the master table as the last path in your area, just before any Endpath.
Make a developer area based on
CMSSW_4_4_2_patch9
and add these tags:
cvs co -r V04-13-25 FastSimulation/Configuration
cvs co -r V13-00-45 HLTrigger/Configuration
checkdeps -a
Dumping the latest configuration
- The current menu has the latest online global tag, so it should run on data with CMSSW 4.4.2 out of the box. See below for running on Monte Carlo samples.
- In case you live on the bleeding edge and want to try the most-recent integrated HIon table, dump it with:
hltGetConfiguration /dev/CMSSW_4_4_2/HIon --full --offline --data --unprescale --process TEST > hlt.py
- To run your own table containing only HLT paths (and possibly additional ESModules), using the services and event setup from the master configuration, use:
- extract the configurations:
edmConfigFromDB --cff --configName /dev/CMSSW_4_4_2/HIon --nopaths > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST > hlt.py
-
- add to
hlt.py
this line, just after process = cms.Process( "HLT" )
process.load("setup_cff")
Testing your paths
To test new paths before submitting them for integration, create an empty configuration with all the paths you are going to submit, and run the
hltIntegrationTests
script:
hltIntegrationTests /your/test/menu/with/only/paths -s /dev/CMSSW_4_4_2/HLT -i some/input/data
The -s option is used to tell the script to take the setup (ESSources, ESProducers, Services) from the master table, so you don't need to include them in your area. You do need to include any new module you may need.
The -i option is used to specify an input file, in any format understood by the PoolSource. It can be a local file (file:...), a file on CASTOR identified by its LFN (/store/...), or any other supported format.
For more options, see hltIntegrationTests -h.
frozen - running the 2011 "legacy 5e33 pp" menu (for MC productions in 53X)
The 2011 "5e33" menus were run with CMSSW release 4_2_X. An emulation of that menu (the last version of it) that can run with CMSSW_5_3_X was provided, in order to be able to produce MC samples with HLT in that release (for the legacy samples).
Preparing a working area
For reference, the GRun subtable (pp collisions) is at /online/collisions/2011/5e33/v3.5/HLT.
Make a developer area based on CMSSW_5_3_14:
setenv SCRAM_ARCH slc5_amd64_gcc462
cmsrel CMSSW_5_3_14
cd CMSSW_5_3_14/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
Running on MC samples
This legacy 2011 menu is only intended to run on MC productions.
To create python dumps suitable for running on MC, use the
--mc
option in place of
--data
with
hltGetConfiguration
.
The Global Tag to be used, which also includes the correct 2011 L1 menu, is START53_LV2, mapped to the auto:startup_2011 conditions.
Typical cmsDriver instructions to produce the configurations for those MC productions are (for example):
Fullsim:
cmsDriver.py --step DIGI,L1,DIGI2RAW,HLT:2011 --conditions=auto:startup_2011 --filein=/store/relval/CMSSW_5_2_7-START52_V10/RelValProdTTbar/GEN-SIM/v1/00000/1E51943C-5306-E211-A13F-0018F3D096A2.root --number=10 --mc --datatier GEN-SIM-DIGI-RAW-HLT --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --scenario=pp --processName=HLT
FastSim:
cmsDriver.py TTbar_Tauola_8TeV_cfi --step=GEN,FASTSIM,HLT:2011 --conditions=auto:startup_2011 --mc --datatier GEN-SIM-DIGI-RECO --eventcontent=FEVTDEBUGHLT --number=10 --customise=HLTrigger/Configuration/CustomConfigs.FASTSIM --scenario=pp --processName=HLT
frozen - running the 2011 "5e33" menu
Studies for the 2011 "5e33" (and higher) menu should be done using these instructions.
Preparing a working area
Configure your new paths in your user area in ConfDB, using code template CMSSW_4_2_0_HLT33. For reference, the master table is at /dev/CMSSW_4_2_0/HLT while the GRun subtable (pp collisions) is at /dev/CMSSW_4_2_0/GRun. Make sure you include the HLTriggerFinalPath from the master table as the last path in your area, just before any Endpath.
Make a developer area based on
CMSSW_4_2_9_HLT3_hltpatch3
export SCRAM_ARCH=slc5_amd64_gcc434
cmsrel CMSSW_4_2_9_HLT3_hltpatch3
cd CMSSW_4_2_9_HLT3_hltpatch3/src
cmsenv
cvs co -r V02-25-00 FastSimulation/Configuration
cvs co -r V12-03-07 HLTrigger/Configuration
checkdeps -a
scram b -j8
hash -r
setenv SCRAM_ARCH slc5_amd64_gcc434
cmsrel CMSSW_4_2_9_HLT3_hltpatch3
cd CMSSW_4_2_9_HLT3_hltpatch3/src
cmsenv
cvs co -r V02-25-00 FastSimulation/Configuration
cvs co -r V12-03-07 HLTrigger/Configuration
checkdeps -a
scram b -j8
rehash
Dumping the latest configuration
- The current menu has the latest online global tag, so it should run on data with CMSSW 4.2.x out of the box. See below for running on Monte Carlo samples.
- In case you live on the bleeding edge and want to try the most-recent integrated pp table, dump it with:
hltGetConfiguration /dev/CMSSW_4_2_0/GRun --full --offline --data --unprescale --process TEST > hlt.py
- To run your own table containing only HLT paths (and possibly additional ESModules), using the services and event setup from the master configuration, use:
- extract the configurations:
edmConfigFromDB --cff --configName /dev/CMSSW_4_2_0/GRun --nopaths > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST > hlt.py
-
- add to
hlt.py
this line, just after process = cms.Process( "HLT" )
process.load("setup_cff")
Testing your paths
To test new paths before submitting them for integration, create an empty configuration with all the paths you are going to submit, and run the
hltIntegrationTests
script:
hltIntegrationTests /your/test/menu/with/only/paths -s /dev/CMSSW_4_2_0/HLT -i some/input/data
The -s option is used to tell the script to take the setup (ESSources, ESProducers, Services) from the master table, so you don't need to include them in your area. You do need to include any new module you may need.
The -i option is used to specify an input file, in any format understood by the PoolSource. It can be a local file (file:...), a file on CASTOR identified by its LFN (/store/...), or any other supported format.
For more options, see hltIntegrationTests -h.
Running the L1 emulator
The "5e33" menu will use the L1 menu: GlobalTriggerMenu_L1Menu_Collisions2011_v6 .
To run on older data or MC, you need to (re)run the L1 GT emulator:
- tell
hltGetConfiguration
to rerun the L1 emulator, and pass the new L1 menu
hltGetConfiguration ... --l1-emulator --l1 L1GtTriggerMenu_L1Menu_Collisions2011_v6_mc
- tell
hltIntegrationTests
to rerun the L1 emulator, and pass the new L1 menu
hltIntegrationTests ... -x "--l1-emulator" -x "--l1 L1GtTriggerMenu_L1Menu_Collisions2011_v6_mc"
Running on MC samples
To create python dumps suitable for running on MC, use the
--mc
option in place of
--data
with
hltGetConfiguration
.
To run on MC, you should use the "startup" or "mc" GlobalTags, with the updated L1 menu: START42_V15A::All and MC_42_V15A::All. These global tags have the latest v6 L1 menu.
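Putting this section together, a hedged example dump for MC (an explicit GlobalTag passed to --globaltag, plus the L1 emulator options from above) could be:
hltGetConfiguration /dev/CMSSW_4_2_0/GRun --full --offline --mc --unprescale --process TEST --globaltag START42_V15A::All --l1-emulator --l1 L1GtTriggerMenu_L1Menu_Collisions2011_v6_mc > hlt.py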
frozen - running the 2011 "3e33" menu
Studies for the 2011 "3e33" (and higher) menu should be done using these instructions.
Preparing a working area
Configure your new paths in your user area in ConfDB, using code template CMSSW_4_2_0_HLT21. For reference, the master table is at /dev/CMSSW_4_2_0/HLT while the GRun subtable (pp collisions) is at /dev/CMSSW_4_2_0/GRun.
Make a developer area based on
CMSSW_4_2_7_hltpatch3
setenv SCRAM_ARCH slc5_amd64_gcc434
cmsrel CMSSW_4_2_7_hltpatch3
cd CMSSW_4_2_7_hltpatch3/src
cmsenv
cvs co -r V02-23-02 FastSimulation/Configuration
cvs co -r V11-11-03 HLTrigger/Configuration
checkdeps -a
scram b -j8
rehash
export SCRAM_ARCH=slc5_amd64_gcc434
cmsrel CMSSW_4_2_7_hltpatch3
cd CMSSW_4_2_7_hltpatch3/src
cmsenv
cvs co -r V02-23-02 FastSimulation/Configuration
cvs co -r V11-11-03 HLTrigger/Configuration
checkdeps -a
scram b -j8
hash -r
Dumping the latest configuration
- The current menu has the latest online global tag, so it should run on data with CMSSW 4.2.x out of the box. See below for running on Monte Carlo samples.
- In case you live on the bleeding edge and want to try the most-recent integrated pp table, dump it with:
hltGetConfiguration /dev/CMSSW_4_2_0/GRun --full --offline --data --unprescale --process TEST > hlt.py
- To run your own table containing only HLT paths (and possibly additional ESModules), using the services and event setup from the master configuration, use:
- extract the configurations:
edmConfigFromDB --cff --configName /dev/CMSSW_4_2_0/GRun --nopaths > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST > hlt.py
-
- add to
hlt.py
this line, just after process = cms.Process( "HLT" )
process.load("setup_cff")
Running the L1 emulator
The "3e33" menu uses a new L1 menu: GlobalTriggerMenu_L1Menu_Collisions2011_v5 .
To run on both data and MC, you need to (re)run the L1 GT emulator:
- tell
hltGetConfiguration
to rerun the L1 emulator, and pass the new L1 menu
hltGetConfiguration ... --l1-emulator --l1 L1GtTriggerMenu_L1Menu_Collisions2011_v5_mc
- tell
hltIntegrationTests
to rerun the L1 emulator, and pass the new L1 menu
hltIntegrationTests ... -x "--l1-emulator" -x "--l1 L1GtTriggerMenu_L1Menu_Collisions2011_v5_mc"
Note that running the L1 emulator is
not needed if you run over recent data (run
173236 or later).
Running on MC samples
To create python dumps suitable for running on MC, use the
--mc
option in place of
--data
with
hltGetConfiguration
.
To run on MC, you should use the "startup" or "mc" GlobalTags, with the updated L1 menu: START42_V13B::All and MC_42_V13B::All.
frozen - running the 2011 "2e33" menu
Studies for the 2011 "2e33" (and higher) menu should be done using these instructions.
Preparing a working area
Configure your new paths in your user area in ConfDB, using code template CMSSW_4_2_0_HLT18. For reference, the master table is at /dev/CMSSW_4_2_0/HLT/V666 while the GRun subtable (pp collisions) is at /dev/CMSSW_4_2_0/GRun/V182.
Make a developer area based on
CMSSW_4_2_7_hltpatch1
setenv SCRAM_ARCH slc5_amd64_gcc434
cmsrel CMSSW_4_2_7_hltpatch1
cd CMSSW_4_2_7_hltpatch1/src
cmsenv
cvs co -r V11-09-64 HLTrigger/Configuration
checkdeps -a
scram b
rehash
export SCRAM_ARCH=slc5_amd64_gcc434
cmsrel CMSSW_4_2_7_hltpatch1
cd CMSSW_4_2_7_hltpatch1/src
cmsenv
cvs co -r V11-09-64 HLTrigger/Configuration
checkdeps -a
scram b
Dumping the latest configuration
- The current menu has the latest online global tag, so it should run on data with CMSSW 4.2.x out of the box. See below for running on Monte Carlo samples.
- In case you live on the bleeding edge and want to try the most-recent integrated pp table, dump it with:
hltGetConfiguration /dev/CMSSW_4_2_0/GRun/V182 --full --offline --data --unprescale --process TEST > hlt.py
- To run your own table containing only HLT paths (and possibly additional ESModules), using the services and event setup from the master configuration, use:
- extract the configurations:
edmConfigFromDB --cff --configName /dev/CMSSW_4_2_0/GRun/V182 --nopaths > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST > hlt.py
-
- add to
hlt.py
this line, just after process = cms.Process( "HLT" )
process.load("setup_cff")
Running the L1 emulator
The "2e33" menu uses a new L1 menu: GlobalTriggerMenu_L1Menu_Collisions2011_v4 .
If you run the HLT on data taken after the July technical stop (run > 170000), you can use the data directly.
For older runs, or for MC samples, you need to (re)run the L1 GT emulator:
- tell
hltGetConfiguration
to rerun the L1 emulator, and pass the new L1 menu
hltGetConfiguration ... --l1-emulator --l1 L1GtTriggerMenu_L1Menu_Collisions2011_v4_mc
- tell
hltIntegrationTests
to rerun the L1 emulator, and pass the new L1 menu
hltIntegrationTests ... -x "--l1-emulator" -x "--l1 L1GtTriggerMenu_L1Menu_Collisions2011_v4_mc"
Running on MC samples
To create python dumps suitable for running on MC, use the
--mc
option in place of
--data
with
hltGetConfiguration
.
To run on MC, you should use the "startup" GlobalTag, which has the updated L1 menu:
START42_V13A::All
.
frozen - running the 2011 "1.4e33" menu
This is a frozen recipe for running the "1.4e33" menu.
Dumping the latest configuration
Configure your new paths in your user area in ConfDB, using code template CMSSW_4_2_0_HLT8. For reference, the master table is at /dev/CMSSW_4_2_0/HLT/V480 while the GRun subtable (pp collisions) is at /dev/CMSSW_4_2_0/GRun/V137. Use AK5 corrected jets - see the example jet paths in the master table.
- make a developer area based on CMSSW_4_2_4_hltpatch1
setenv SCRAM_ARCH slc5_amd64_gcc434
cmsrel CMSSW_4_2_4_hltpatch1
cd CMSSW_4_2_4_hltpatch1/src
cmsenv
- If you use
tcsh
, to make sure you can find the newly compiled scripts, run the command
rehash
- The current menu has the latest online global tag, so it should run on data with CMSSW 4.2.x out of the box. See below for running on Monte Carlo samples.
- In case you live on the bleeding edge and want to try the most-recent integrated pp table, dump it with:
hltGetConfiguration /dev/CMSSW_4_2_0/GRun/V137 --full --offline --data --unprescale --process TEST > hlt.py
- To run your own table containing only HLT paths (and possibly additional ESModules), using the services and event setup from the master configuration, use:
- extract the configurations:
edmConfigFromDB --cff --configName /dev/CMSSW_4_2_0/GRun/V137 --nopaths > setup_cff.py
hltGetConfiguration /your/test/menu/with/only/paths --full --offline --data --unprescale --process TEST > hlt.py
-
- add to
hlt.py
this line, just after process = cms.Process( "HLT" )
process.load("setup_cff")
Running on data
The "1e33" menu that we plan to deploy after the May 9th technical stop foresees an updated readout configuration for the HF.
The recent runs 163591, 163592 and 163593 were taken with the HF in such a configuration (and a lower-than-usual prescale on L1_SingleEG5, so it will have a higher rate than expected online). You should also change the Global Tag to GR_R_42_V12.
Running on MC samples
More details to be added.
To run on MC, you should use the corresponding GlobalTags:
START42_V11::All
or
MC_42_V11::All
.
frozen - running the 2011 "1e33" v1.x menu with CMSSW 4.2.3-patch2
This is a frozen recipe for running the "1e33" v1.x menu.
CMSSW environment
- make a developer area based on CMSSW_4_2_3_patch2
setenv SCRAM_ARCH slc5_amd64_gcc434
cmsrel CMSSW_4_2_3_patch2
cd CMSSW_4_2_3_patch2/src
cmsenv
cvs co -r V01-19-68-14 Configuration/PyReleaseValidation
cvs co -r V01-14-05-06 EventFilter/EcalRawToDigi
cvs co -r V01-22-09 FastSimulation/Configuration
cvs co -r V04-09-03 FastSimulation/HighLevelTrigger
cvs co -r V11-06-43 HLTrigger/Configuration
cvs co -r V00-08-02 HLTrigger/Egamma
cvs co -r V01-22-02 HLTrigger/HLTfilters
cvs co -r V00-02-02 HLTrigger/JetMET
cvs co -r V02-11-02 HLTrigger/Muon
cvs co -r V03-02-00 HLTrigger/btau
cvs co -r V01-21-02 HLTrigger/special
cvs co -r V01-04-06-04 RecoBTag/ImpactParameter
cvs co -r V00-13-10-00 RecoLocalCalo/EcalRecAlgos
cvs co -r V00-09-12-51 RecoLocalCalo/HcalRecProducers
cvs co -r V01-18-03-01 RecoMuon/MuonIdentification
cvs co -r V08-12-08-00 RecoTracker/MeasurementDet
checkdeps -a
scram b
HLT menu in ConfDB
Use the offline version:
hltGetConfiguration --full --offline --data /online/collisions/2011/1e33/v1.3/HLT --process TEST
Or extract it directly from a given run number:
hltGetConfiguration --full --offline --data run:165205 --process TEST
For re-running on data taken with this menu, the correct GlobalTag is already included in the menu (GR_H_V20::All).
For re-running on older data, please use GR_R_42_V14::All.
For running on MC, please use START42_V12::All ("startup" conditions) or MC_42_V12::All ("mc" conditions), which include the 4.2.x jet energy corrections.
CMSSW_7_4_X
CMSSW_7_4_0_pre6
Run-2 development menus (GRun for pp; simplistic dummy menus for HIon and PIon)
Note: the CMSSW_7_4_0_pre6 recipe posted here is frozen. For the most recent developments, use the above integration build.
setenv SCRAM_ARCH slc6_amd64_gcc491
cmsrel CMSSW_7_4_0_pre6
cd CMSSW_7_4_0_pre6/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_7_3_X
CMSSW_7_3_2_patch1
Run-2 development menus (GRun for pp; simplistic dummy menus for HIon and PIon)
Note: the CMSSW_7_3_2_patch1 recipe posted here is frozen. For the most recent developments, use the above integration build.
setenv SCRAM_ARCH slc6_amd64_gcc491
cmsrel CMSSW_7_3_2_patch1
cd CMSSW_7_3_2_patch1/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_7_2_X
CMSSW_7_2_3_patch1
Run-2 development menus (GRun for pp; simplistic dummy menus for HIon and PIon)
setenv SCRAM_ARCH slc6_amd64_gcc481
cmsrel CMSSW_7_2_3_patch1
cd CMSSW_7_2_3_patch1/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_7_1_X
CMSSW_7_1_14
2013 PIon, 2012 GRun, and 2011 HIon HLT menus
setenv SCRAM_ARCH slc6_amd64_gcc481
cmsrel CMSSW_7_1_14
cd CMSSW_7_1_14/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_7_0_X
CMSSW_7_0_9_patch3
2013 PIon, 2012 GRun, and 2011 HIon HLT menus
setenv SCRAM_ARCH slc5_amd64_gcc481
cmsrel CMSSW_7_0_9_patch3
cd CMSSW_7_0_9_patch3/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_6_2_X
CMSSW_6_2_12_patch1
2013 PIon, 2012 GRun, and 2011 HIon HLT menus
setenv SCRAM_ARCH slc5_amd64_gcc472
cmsrel CMSSW_6_2_12_patch1
cd CMSSW_6_2_12_patch1/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_5_3_X (Legacy release for Run-I data taking period - 2011/2012/2013)
CMSSW_5_3_26
2013 PIon, 2012 GRun, 2011 HIon, and 2011 resurrected HLT menus
setenv SCRAM_ARCH slc6_amd64_gcc472
cmsrel CMSSW_5_3_26
cd CMSSW_5_3_26/src
cmsenv
git cms-addpkg HLTrigger/Configuration
git cms-checkdeps -a
scram build -j 4
cd HLTrigger/Configuration/test
CMSSW_5_2_X (2012 pp / 2013 pPb running)
CMSSW_5_2_9
2013 PIon, 2012 GRun, and 2011 HIon HLT menus
setenv SCRAM_ARCH slc5_amd64_gcc462
cmsrel CMSSW_5_2_9
cd CMSSW_5_2_9/src
cmsenv
addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
CMSSW_4_4_X (2011 PbPb running)
CMSSW_4_4_7
5E33 HLT menu (V2) and HIon menu as deployed online in 2011
setenv SCRAM_ARCH slc5_amd64_gcc434 #44X only available with AMD64
cmsrel CMSSW_4_4_7
cd CMSSW_4_4_7/src
cmsenv
git cms-addpkg FastSimulation/Configuration
git cms-addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
CMSSW_4_2_X (2011 pp running)
CMSSW_4_2_10
5E33 HLT menu (V2) and HIon menus
setenv SCRAM_ARCH slc5_amd64_gcc434 #42X only available with AMD64
cmsrel CMSSW_4_2_10
cd CMSSW_4_2_10/src
cmsenv
git cms-addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
CMSSW_4_2_X_ONLINE_2011-11-02-1600
Online development series: 5E33 HLT menu (V2) and HIon menu
cmsrel CMSSW_4_2_X_ONLINE_2011-11-02-1600
cd CMSSW_4_2_X_ONLINE_2011-11-02-1600/src
cmsenv
cvs co -r V12-03-35 HLTrigger/Configuration
checkdeps -a
scram build
cd HLTrigger/Configuration/test
CMSSW_4_2_9_HLT3_hltpatch3
5E33 HLT menu
Note: the CMSSW_4_2_9_HLT3_hltpatch3 recipe posted here is frozen. For the most recent developments, use the above integration build.
setenv SCRAM_ARCH slc5_amd64_gcc434 # 42X only available with AMD64
cmsrel CMSSW_4_2_9_HLT3_hltpatch3
cd CMSSW_4_2_9_HLT3_hltpatch3/src
cmsenv
addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
CMSSW_4_2_9_HLT3
3E33 HLT menu
Note: the CMSSW_4_2_9_HLT3 recipe posted here is frozen. For the most recent developments, use the above integration build.
setenv SCRAM_ARCH slc5_amd64_gcc434 # 42X only available with AMD64
cmsrel CMSSW_4_2_9_HLT3
cd CMSSW_4_2_9_HLT3/src
cmsenv
addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
CMSSW_4_2_8_patch7
5E32 HLT menu (default 42X HLT menu)
Note: the CMSSW_4_2_8_patch7 recipe posted here is frozen. For the most recent developments, use the above integration build.
setenv SCRAM_ARCH slc5_amd64_gcc434 # 42X only available with AMD64
cmsrel CMSSW_4_2_8_patch7
cd CMSSW_4_2_8_patch7/src
cmsenv
addpkg HLTrigger/Configuration
scram build
cd HLTrigger/Configuration/test
Responsible:
MartinGrunewald