--
DoujaDarej - 2020-08-31
Introduction
This page documents the different tools and tricks I run into, so that they stay well organised and help solve the next problem in the least possible time. It can range from setup code and config files to terminal tricks.
Data Formats:
RECO :
The RECO format is common to data and MC. It has the biggest collections (for example, for tracking, the digis and the hits). You can run the reconstruction step on a sample that is already reconstructed; just make sure you change the name of the process once your config file has been created with the cmsDriver command.
In cmsDriver, reconstruction is usually run in step3.
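Since the input sample was already produced by a process called RECO, the new configuration must use a different process name, otherwise CMSSW refuses to reprocess the file. In a cmsDriver-generated config this is the first line; the name 'reRECO' below is only a conventional choice, not imposed:

```
import FWCore.ParameterSet.Config as cms

# Use a new process name so it does not clash with the 'RECO' process
# already stored in the input file; 'reRECO' is just a common convention.
process = cms.Process('reRECO')
```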
To have a RECO output, one needs to specify in the configuration file :
process.RECOSIMoutput = cms.OutputModule("PoolOutputModule",
    dataset = cms.untracked.PSet(
        dataTier = cms.untracked.string(''),
        filterName = cms.untracked.string('')
    ),
    fileName = cms.untracked.string('reReco_trackingOnly.root'),
    outputCommands = process.RECOSIMEventContent.outputCommands,
    splitLevel = cms.untracked.int32(0)
)
and add the corresponding EndPath definition:
process.RECOSIMoutput_step = cms.EndPath(process.RECOSIMoutput)
then append process.RECOSIMoutput_step to the Schedule definition, after the endjob step.
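The end of the cmsDriver-generated config then typically looks like the sketch below; the step names other than RECOSIMoutput_step (raw2digi_step, reconstruction_step, endjob_step) are placeholders that depend on which steps cmsDriver was asked for:

```
# Sketch only: the steps before the output step depend on the workflow
# generated by cmsDriver; the output step comes last.
process.schedule = cms.Schedule(process.raw2digi_step,
                                process.reconstruction_step,
                                process.endjob_step,
                                process.RECOSIMoutput_step)
```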
AOD :
AOD contains a bit less information than RECO, but can be compressed, and unnecessary collections can be removed with keep and drop statements, for example by adding:
process.AODSIMoutput = cms.OutputModule("PoolOutputModule",
    compressionAlgorithm = cms.untracked.string('LZMA'),
    compressionLevel = cms.untracked.int32(4),
    dataset = cms.untracked.PSet(
        filterName = cms.untracked.string(''),
        dataTier = cms.untracked.string('AODSIM')
    ),
    eventAutoFlushCompressedSize = cms.untracked.int32(15728640),
    fileName = cms.untracked.string('reReco_AODSIM.root'),
    outputCommands = process.AODSIMEventContent.outputCommands
)
and add the EndPath and Schedule entries in the same way as for RECO.
MINIAOD :
Heavily skimmed compared to AOD; it can be useless when the full tracks and vertices are needed. MINIAOD production can be done using:
process.MINIAODSIMoutput = cms.OutputModule("PoolOutputModule",
    compressionAlgorithm = cms.untracked.string('LZMA'),
    compressionLevel = cms.untracked.int32(4),
    dataset = cms.untracked.PSet(
        filterName = cms.untracked.string(''),
        dataTier = cms.untracked.string('MINIAODSIM')
    ),
    eventAutoFlushCompressedSize = cms.untracked.int32(15728640),
    fileName = cms.untracked.string('reReco_MINIAODSIM.root'),
    outputCommands = process.MINIAODSIMEventContent.outputCommands
)
and add the EndPath and Schedule entries in the same way as for RECO.
Using keep and drop :
If we want to keep only the V0 vertices and the generalTracks tracks coming from a reRECO process (re-reconstruction of data in the RECO format), we can for example add, in the AOD case:
process.AODSIMoutput = cms.OutputModule("PoolOutputModule",
    compressionAlgorithm = cms.untracked.string('LZMA'),
    compressionLevel = cms.untracked.int32(4),
    dataset = cms.untracked.PSet(
        filterName = cms.untracked.string(''),
        dataTier = cms.untracked.string('AODSIM')
    ),
    eventAutoFlushCompressedSize = cms.untracked.int32(15728640),
    fileName = cms.untracked.string('reReco_AODSIM_skimmed.root'),
    outputCommands = cms.untracked.vstring(
        'drop *',
        'keep *VertexCompositeCandidates*_*_*_*reRECO*',
        'keep *recoTracks*_*generalTracks*_*_*reRECO*',
    )
)
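The keep/drop statements are processed in order and the last statement that matches a branch wins, which is why the list starts with 'drop *'. The sketch below mimics that rule in plain Python (the branch names are made up for illustration; the real matching is done inside the PoolOutputModule):

```python
from fnmatch import fnmatchcase

def is_kept(branch, commands):
    """Return True if `branch` survives the keep/drop list.
    Statements are applied in order; the last matching one wins."""
    kept = False
    for cmd in commands:
        action, pattern = cmd.split(None, 1)
        if fnmatchcase(branch, pattern):
            kept = (action == 'keep')
    return kept

commands = [
    'drop *',
    'keep *VertexCompositeCandidates*_*_*_*reRECO*',
    'keep *recoTracks*_*generalTracks*_*_*reRECO*',
]

# Illustrative branch names (class_module_instance_process)
print(is_kept('recoTracks_generalTracks__reRECO', commands))   # survives
print(is_kept('recoMuons_muons__RECO', commands))              # dropped
```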
Config files for reconstruction of different samples :
Data, highMET
Full examples of the configs used to re-reconstruct HighMET data can be found in:
/opt/sbg/cms/ui3_data1/ddarej/reReco_METData_SST/CMSSW_9_4_5/src/reco_AODsim
Madgraph and Madanalysis
Madgraph in the context of RPV-susy
Instructions :
In a new directory:
git clone https://github.com/enibigir/flyingtop
source setup.sh
python generate.py --process name_of_process.txt -o output_file
The different processes here are :
- process_fullchain.txt : Used to generate the whole specific channel. Mainly used for kinematic studies, in particular when we are interested in the pt, eta and phi of the different particles, or to determine the beta and gamma of the long-lived particle (the neutralino). Will probably be used to generate events later. It is the one that takes the most time.
import model rpvmssm-lpp312 --modelname
generate p p > sl2- sl2+ / h01 h02 a0, (sl2- > mu- n2, n2 > t d s), (sl2+ > mu+ n2, n2 > t d s) @1
add process p p > sl2- sl2+ / h01 h02 a0, (sl2- > mu- n2, n2 > t~ d~ s~), (sl2+ > mu+ n2, n2 > t d s) @2
add process p p > sl2- sl2+ / h01 h02 a0, (sl2- > mu- n2, n2 > t d s), (sl2+ > mu+ n2, n2 > t~ d~ s~) @3
add process p p > sl2- sl2+ / h01 h02 a0, (sl2- > mu- n2, n2 > t~ d~ s~), (sl2+ > mu+ n2, n2 > t~ d~ s~) @4
- process_prod.txt : Used to generate the smuon production events. Useful to determine the smuon production cross section, a constraint that needs to be taken into account when defining the benchmarks.
import model rpvmssm-lpp312 --modelname
generate p p > sl2- sl2+ / h01 h02 a0
- process_prodneutralino.txt : Used to generate direct neutralino production (without going through smuon production).
import model rpvmssm-lpp312 --modelname
generate p p > n2 n2
- process_decay.txt : Used to generate the decay of the neutralino. This makes it possible to determine the lifetime of the neutralino.
import model rpvmssm-lpp312 --modelname
generate n2 > t d s @1
add process n2 > t~ d~ s~ @2
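The decay width Gamma that MadGraph computes (in GeV) translates into the neutralino lifetime through tau = hbar / Gamma, or equivalently into a proper decay length c*tau = hbar*c / Gamma. A quick conversion helper (not part of the flyingtop scripts, just a sketch of the formula):

```python
# c*tau = hbar*c / Gamma, with hbar*c expressed in GeV*m
HBARC_GEV_M = 1.973269804e-16  # hbar*c in GeV*m

def ctau_m(width_gev):
    """Proper decay length in metres for a decay width in GeV."""
    return HBARC_GEV_M / width_gev

# A width of 1e-16 GeV corresponds to c*tau of about 2 m
print(ctau_m(1e-16))
```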
Change the parameters in cards/param_card.dat.
We set all the masses to very high values (1e+12 for all the susy particle masses), except for the particles involved in the process. In the case of RPV-susy, since the full chain process is p p > sl2- sl2+ / h01 h02 a0, (sl2- > mu- n2, n2 > t d s), (sl2+ > mu+ n2, n2 > t d s), keep the sl2 (smuon), n2 (neutralino) and su3 (intermediate stop) at low masses. Setting all the other masses to high values prevents MG5 from taking other processes into account in the cross-section calculation.
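As an illustration of this decoupling step, the sketch below (not the actual flyingtop tooling) pushes every entry of the param_card MASS block to 1e+12 except a set of PDG ids to keep light; the ids in the example are the standard MSSM codes (1000023 for the second neutralino, 1000013 for the left smuon), which may differ in this custom model:

```python
import re

HEAVY = "1.000000e+12"

def decouple_masses(card_text, keep_ids):
    """Set every mass in the MASS block to HEAVY except keep_ids."""
    out = []
    in_mass = False
    for line in card_text.splitlines():
        stripped = line.strip().lower()
        if stripped.startswith('block'):
            in_mass = stripped.startswith('block mass')
        m = re.match(r'^(\s*)(\d+)(\s+)(\S+)(.*)$', line)
        if in_mass and m and int(m.group(2)) not in keep_ids:
            line = f"{m.group(1)}{m.group(2)}{m.group(3)}{HEAVY}{m.group(5)}"
        out.append(line)
    return '\n'.join(out)

# Toy param_card fragment; real cards have many more entries
card = """Block mass
   1000022 5.0e+01  # n1
   1000023 5.0e+01  # n2
   1000013 2.0e+02  # sl2
Block decay
"""
print(decouple_masses(card, keep_ids={1000023, 1000013}))
```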
python generate.py --events generate_events.txt -d output_file
If we want to scan over one mass (scan.txt) or over two masses (scan2D.txt), use :
python generate.py --events generate_events.txt -d output_file -s scan.txt