Use Onia2MuMu package for Polarization Studies

Introduction

This TWiki aims to give an introduction and tutorial on how to use the modified version of the official Onia2MuMu package for J/psi polarization studies. In principle one has to deal with the CMSSW module HeavyFlavorAnalysis/Onia2MuMu. It consists of 2 major components:
  1. Onia2MuMuPAT (files Onia2MuMuPAT.h, Onia2MuMuPAT.cc , onia2MuMuPAT_cfi.py ). More help can be found at Onia2MuMuPAT TWiki.
  2. JPsiAnalyzerPAT (files JPsiAnalyzerPAT.cc, jpsianalyzerpat_cfg.py). The version of the analyzer used for the polarization studies is a modified version of the official one (files JPsiAnalyzerPAT.cc, jpsianalyzerpat_cfg_DATA.py).

For the polarization analysis we need to apply Steps 3 - 4 to several datasets (PAT tuples), all registered in the DBS instance cms_dbs_ph_analysis_02:

  1. real data
    • 3.1 pb-1: /MuOnia/zgecse-Run2010A-PromptReco-v4-Onia2MuMu-v5-v3-af687901d6eb0367c9b0c5fdf9b70ada/USER
  2. prompt J/psi MC
  3. non-prompt J/psi MC:
    • B0: /B0ToJPsiMuMu_2MuPEtaFilter_7TeV-pythia6-evtgen/fat-Summer10-START36_V9_S09-v1-Onia2MuMu-v5-bef246ddbd6b4e7664a45838bf80a640/USER
    • B+: /BpToJPsiMuMu_2MuPEtaFilter_7TeV-pythia6-evtgen/fat-Summer10-START36_V9_S09-v1-Onia2MuMu-v5-bef246ddbd6b4e7664a45838bf80a640/USER
    • Bs: /BsToJPsiMuMu_2MuPEtaFilter_7TeV-pythia6-evtgen/fat-Spring10-START3X_V26_S09-v1-Onia2MuMu-v3-fd52975fb049c2c1daf70ca039062259/USER
    • LambdaB: /LambdaBToJPsiMuMu_2MuPEtaFilter_7TeV-pythia6-evtgen/fat-Spring10-START3X_V26_S09-v1-Onia2MuMu-v3-fd52975fb049c2c1daf70ca039062259/USER

Step 1: Install the CMSSW framework:

To work with the CMSSW module HeavyFlavorAnalysis/Onia2MuMu one has to install a CMSSW release and the Onia2MuMu package itself. For the currently used version please check which tags are used. This TWiki uses the working tag as of 28 Sep 2010, V00-11-00 (for CMSSW_3_8_1; it also requires checking out the tag V01-10-00 of MuonAnalysis/MuonAssociators to do the association to L1 objects):

cmsrel CMSSW_3_8_1
cd CMSSW_3_8_1/src
cmsenv
addpkg HeavyFlavorAnalysis/Onia2MuMu V00-11-00
addpkg MuonAnalysis/MuonAssociators V01-10-00
scramv1 b -j4

(NOTE: The package may fail to compile, especially when using 3_8_1. In that case you have to adjust some include paths to pick up the right headers.)

There are 5 working tags for the Onia2MuMu package. Currently we are using tag v5:

  • Onia2MuMu-v5 :
   CMSSW_3_8_1
   HeavyFlavorAnalysis/Onia2MuMu V00-11-00
   MuonAnalysis/MuonAssociators V01-10-00

Step 2: Install the modified version of the JPsiAnalyzerPAT

The official version of the JPsiAnalyzerPAT has to be replaced with the modified version; then you will need to re-compile.
cd HeavyFlavorAnalysis/Onia2MuMu/src
cvs co -p -r1.7 UserCode/FloTei/Onia2MuMu/src/JPsiAnalyzerPAT.cc > JPsiAnalyzerPAT.cc
scramv1 b -j4
The JPsiAnalyzerPAT will produce two output files when using jpsianalyzerpat_cfg_DATA.py.
dataSetName = cms.string("RooDataSet_pol_Run2010A-PromptReco-v4_data.root"),
dataSetNameReco=cms.string("RooDataSet_Run2010A-PromptReco-v4_dataR.root"),
   
The difference between the two is that the data file contains the full information for every event, even if no RECO/PAT dimuon was found. Its purpose is to keep the MC information of all generated events, including those that were not reconstructed, which is needed for the acceptance calculation. This is of course not relevant when real data are processed, so the corresponding ROOT file can then be discarded. To flag events without a RECO/PAT dimuon, check whether JpsiMass < 0. The ROOT file with dataR in its name contains only the events and variables that are filled when a RECO/PAT dimuon was found; dataR is thus a subsample of data.
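A minimal PyROOT sketch for inspecting the data file, assuming (as suggested by the treeMerger.py snippet in Step 3) that the output file contains a TTree named data with a branch JpsiMass; adapt the file, tree and branch names to your actual output:

  import ROOT

  # open one of the "data" output files of JPsiAnalyzerPAT (name taken from dataSetName above)
  f = ROOT.TFile.Open("RooDataSet_pol_Run2010A-PromptReco-v4_data.root")
  tree = f.Get("data")                           # assumed tree name, cf. treeMerger.py in Step 3

  n_all = tree.GetEntries()
  n_gen_only = tree.GetEntries("JpsiMass < 0")   # events without a RECO/PAT dimuon

  print("all events:            %d" % n_all)
  print("no RECO/PAT dimuon:    %d" % n_gen_only)
  print("with RECO/PAT dimuon:  %d" % (n_all - n_gen_only))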

Step 3: Start a CRAB job and merge the output TTrees:

To run the JPsiAnalyzerPAT over real data you can use jpsianalyzerpat_cfg_DATA.py and the CRAB cfg.
Please note that for running over a MC sample you will have to adapt the jpsianalyzerpat_cfg_DATA.py as follows:
isMC = cms.untracked.bool(True),
isPromptMC = cms.untracked.bool(False), #or True, if sample contains prompt J/psis
  

  1. Check out and adapt the corresponding CRAB cfg:
         cd /path/to/CMSSW_3_8_1/src/HeavyFlavorAnalysis/Onia2MuMu/test/
         cvs co -p UserCode/FloTei/Onia2MuMu/test/crab_Onia2MuMu-v5_Run2010A-PromptReco-v4_v3_RooDataSet.cfg > crab_Onia2MuMu-v5_Run2010A-PromptReco-v4_v3_RooDataSet.cfg
         
[CMSSW]
lumi_mask = #give path to the certified JSON file
dbs_url = #list DBS instance
datasetpath = #list the PAT tuple DataSetPath (as given in DBS)
pset = /path/to/jpsianalyzerpat_cfg_DATA.py
total_number_of_lumis = #total number of processed lumi segments ( -1 = process all lumis ) 
lumis_per_job = #split in different jobs, each job to XXX (eg. 500) lumis 
output_file = #give the name of the root output files, separated by a comma if more than one
[USER]
return_data = 0 #or 1 (0 == NO, 1 == YES) 
copy_data =  0 #or 1 (0 == NO, 1 == YES) 
#storage_element = choose your T2 storage element, if needed
publish_data= 0 #or 1, publish in DBS (0 == NO, 1 == YES) 
   
Note that for the 3.1 pb-1 sample one can use the following PAT tuple and certified JSON file:
lumi_mask = /path/to/CMSSW_3_8_1/src/HeavyFlavorAnalysis/Onia2MuMu/certification/7TeV/Collisions10/StreamExpress/Cert_132440-144114_7TeV_StreamExpress_Collisions10_JSON_BPAG.txt
datasetpath = /MuOnia/zgecse-Run2010A-PromptReco-v4-Onia2MuMu-v5-v3-af687901d6eb0367c9b0c5fdf9b70ada/USER
   
This JSON file is of course only valid for the PAT tuple used in this example; it can be checked out as follows:
cd /path/to/CMSSW_3_8_1/src/
cvs co HeavyFlavorAnalysis/Onia2MuMu/certification/7TeV/Collisions10/StreamExpress/Cert_132440-144114_7TeV_StreamExpress_Collisions10_JSON_BPAG.txt
   

  2. Source the CRAB environment:
         source /afs/cern.ch/cms/LCG/LCG-2/UI/cms_ui_env.csh
         source /afs/cern.ch/cms/ccs/wm/scripts/Crab/crab.csh
         
  3. Run CRAB and retrieve the output files:
         crab -cfg crab_Onia2MuMu-v5_Run2010A-PromptReco-v4_v3_RooDataSet.cfg -create 
         crab -c YOUR_CRAB_DIR -submit 
         
    and check the status of the jobs
         crab -c YOUR_CRAB_DIR -status
         crab -getoutput  -c YOUR_CRAB_DIR 
         
    After all your CRAB jobs are finished successfully please retrieve the output JSON file, which is then used (Step 4) to calculate the Luminosity processed by your CRAB jobs:
    crab -c YOUR_CRAB_DIR -report
         
  4. Check out the python script to merge the output of the CRAB jobs into ONE file
         cd /path/to/CMSSW_3_8_1/src/HeavyFlavorAnalysis/Onia2MuMu/test/
         cvs co -p UserCode/FloTei/macros/treeMerger.py > treeMerger.py
         
  5. Adapt treeMerger.py as follows:
pathOnFs = "your/path/to/crab/output/files/"
#list all the names of your rootfiles.root
files = [
    "craboutput_1.root",
    "craboutput_2.root",
    ... ]
outfile = "give_a_name_to_the_output.root"
## choose the name of the TTree in the input files
dsName = "data"
#dsName = "recoData"
## choose the name of the TChain, must be the same as the input TTree name
chain = R.TChain("data","data")
#chain = R.TChain("recoData","recoData")
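For reference, a minimal sketch of what the merging boils down to; the file names below are placeholders and the tree name data is taken from the dsName choice above, so adapt both to your case (the actual treeMerger.py may differ in detail):

  import ROOT as R

  pathOnFs = "your/path/to/crab/output/files/"            # CRAB output directory (placeholder)
  files    = ["craboutput_1.root", "craboutput_2.root"]   # placeholder file names
  outfile  = "merged_data.root"                           # placeholder output name

  # chain all per-job trees together and write them out as ONE tree
  chain = R.TChain("data", "data")                        # same name as the input TTree
  for fname in files:
      chain.Add(pathOnFs + fname)
  chain.Merge(outfile)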
   

Step 4: Calculate the Lumi being processed and register TTree at the CMS QTF workspace:

All the relevant information concerning your produced TTree should be registered at the CMS QTF workspace under the TTree section.

a) Install lumiCalc.py:

To calculate the processed Lumi you will have to deal with lumiCalc.py.
  scramv1 p CMSSW CMSSW_3_7_0
  cd CMSSW_3_7_0/src
  cvs co -r lumi2010-Sep21b RecoLuminosity/LumiDB
  cd RecoLuminosity/LumiDB
  scramv1 b
  cmsenv

b) Calculate the integrated Luminosity

Get the output JSON file (as mentioned in Step 3) of the jobs which succeeded (it is the input of lumiCalc.py) and the number of recorded events.
 crab -c YOUR_CRAB_DIR -report
This will produce another JSON file based on the information of the processed CRAB jobs; normally it can be found under YOUR_CRAB_DIR/res/lumiSummary.json. To get an overview of the delivered, recorded and total luminosity you have to run lumiCalc.py on this output JSON.
 cd /path/to/CMSSW_3_7_0/src/RecoLuminosity/LumiDB/scripts 
 lumiCalc.py -c frontier://LumiProd/CMS_LUMI_PROD -i YOUR_CRAB_DIR/res/lumiSummary.json overview
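lumiCalc.py is the authoritative tool for the luminosity numbers. As a quick cross-check, the number of runs and lumi sections in lumiSummary.json can also be counted directly; a minimal sketch, assuming the usual CMS JSON format (run number mapped to a list of [first, last] lumi-section ranges):

  import json

  # path to the report produced by "crab -report" (adapt YOUR_CRAB_DIR)
  with open("YOUR_CRAB_DIR/res/lumiSummary.json") as jf:
      lumis = json.load(jf)

  n_ls = sum(last - first + 1
             for ranges in lumis.values()
             for first, last in ranges)
  print("runs: %d, lumi sections: %d" % (len(lumis), n_ls))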

Please note that if you want to make your ROOT files available to the group, ALSO upload the obtained lumiSummary.json file to the CMS QTF Workspace - Section TTrees.

Produce PAT tuples with the Onia2MuMu package

In case you have to create PAT tuples for further processing, two files become important:
  1. onia2MuMuPATSummer10_cfg.py
  2. onia2MuMuPAT_cff.py

onia2MuMuPATSummer10_cfg.py is the config file to run with cmsRun. It imports onia2MuMuPAT_cff.py, which loads the module Onia2MuMuPAT.cc.

a) Adapt configuration files

Normally you can use onia2MuMuPATSummer10_cfg.py out of the box, but you might want to change the parameters GlobalTag, MC, HLT and Filter:
onia2MuMuPAT(process, GlobalTag="START38_V8::All", MC=True, HLT="REDIGI36X", Filter=False)
   

In the standard procedure the generator information is dropped from the final PAT tuples. As we have to generate acceptance maps we do need this information and should therefore keep it: add the generator information

'keep recoGenParticles_genParticles__*' 
to the PAT configuration file as follows
# output
process.out = cms.OutputModule("PoolOutputModule",
  ..... 
  'keep l1extraL1MuonParticles_l1extraParticles_*_*',    # L1 info (cheap)
  'keep recoGenParticles_genParticles__*',               ## GEN information    
  ),
   

b) Start CRAB job

Setup your CRAB area as shown in Step 3. You can use the onia2MuMuPATData.crab, which comes with the package. Set your preferred parameters, such as the correct pset ( onia2MuMuPATSummer10_cfg.py ) and so on.
   vi onia2MuMuPATData.crab 
   crab -create -cfg onia2MuMuPATData.crab
   crab -submit -c onia2MuMuPATData
   crab -status -c onia2MuMuPATData
   crab -getoutput -c onia2MuMuPATData
   crab -status -c onia2MuMuPATData
   crab -publish -c onia2MuMuPATData

  • make sure your publication name contains Onia2MuMu-vX, where X corresponds to the Tag list above
  • do not create too many jobs as this results in many small output files; ideally the output files should be ~1GB
  • do NOT publish until running is complete (i.e. you get 100% running efficiency)
  • remember to check the status after getoutput to check that all exit codes are 0



HLT Studies of new "Quarkonia" Trigger paths

Introduction

This TWiki page presents our studies on HLT Quarkonia Trigger paths. The MC samples are generated under the following conditions:

  • CMSSW 3_4_1,
  • global tag : MC_3XY_V16,
  • 1E31 trigger menu

(Filter at generator level: ONE muon must have a total momentum p > 2.5 GeV/c.) Additional information can be found under: /afs/cern.ch/user/h/hwoehri/public/forFlorian

Pythia manual: http://home.thep.lu.se/~torbjorn/Pythia.html

Step 1: Generation - Simulation - RECO:

CVS check out relevant code:

cvs co -r CMSSW_3_4_1 HLTrigger/Configuration/python/HLTrigger_EventContent_cff.py
cvs co -r CMSSW_3_4_1 HLTrigger/Configuration/python/HLT_1E31_cff.py
cvs co -r CMSSW_3_4_1 HLTrigger/Configuration/python/HLT_8E29_cff.py
cvs co Configuration/GenProduction/python/PythiaUESettings_cfi.py
cvs co Configuration/GenProduction/python/PYTHIA6_JPsiWithFSR_7TeV_cff.py
cvs co Configuration/GenProduction/python/HERWIGPP_custom.py
scramv1 b

a) Build PYTHIA config files for MC generation - test it

cmsDriver.py Configuration/GenProduction/python/PYTHIA6_JPsiWithFSR_7TeV_cff.py -s GEN:ProductionFilterSequence --conditions FrontierConditions_GlobalTag,MC_3XY_V16::All --eventcontent FEVT -n 10000 --no_exec --customise=Configuration/GenProduction/HERWIGPP_custom.py

which gives the following output file PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_MC.py

edit the script PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_MC.py :

  • remove the mumugenfilter
  • adjust the parameters for the mugenfilter

cmsRun PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_MC.py

  • get the filterEfficiency: divide the number of entries (1916) in the output ROOT file by the number of generated events (10,000); a small sketch follows after this list.
       root PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN.root
       Events->GetEntries()
       

  • get the "crossSection" by converting the PYTHIA cross-section ("sigma") from [mb] --> [pb] and apply the BR for J/psi --> mu+mu-

b) Build PYTHIA config files for GEN-SIM-RAW-RECO - Submit CRAB job

cmsDriver.py Configuration/GenProduction/python/PYTHIA6_JPsiWithFSR_7TeV_cff.py -s GEN:ProductionFilterSequence,SIM,DIGI,L1,DIGI2RAW,RAW2DIGI,L1Reco,RECO --conditions FrontierConditions_GlobalTag,MC_3XY_V16::All --datatier GEN-SIM-RAW-RECO --eventcontent FEVTSIM -n 1000 --no_exec --customise=Configuration/GenProduction/HERWIGPP_custom.py

which gives the following output file PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_SIM_DIGI_L1_DIGI2RAW_RAW2DIGI_L1Reco_RECO_MC.py

  • update in this script the values stored in "filterEfficiency" and "crossSection"
  • remove the mumugenfilter
  • adjust the parameters for the mugenfilter

you can use the CRAB file for the generation: crab.cfg
Note that if we want to have 1 million events AFTER the filter, the CRAB job should generate 1 million / (filterEfficiency) ~ 5 million events!

upon successful generation you might want to publish your data set and acknowledge it on: https://twiki.cern.ch/twiki/bin/viewauth/CMS/Onia2MuMuSamples

Step 2: Run HLT - Append Onia2MuMuPAT producer

a) Create HLT configuration script

cmsDriver.py PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31 -s HLT -n -1 --conditions MC_3XY_V16::All --no_exec --python_filename PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31_HLT.py --filein=PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_SIM_DIGI_L1_DIGI2RAW_RAW2DIGI_L1Reco_RECO.root

--> check if you need to add a datatier (or event content) to store GEN, RECO and HLT...

b) Insert new HL Trigger paths

you will need to update the previously generated script by inserting 4 more triggers:

  • HLT_Mu3_8E29 --> copy the implementation from HLTrigger/Configuration/python/HLT_8E29_cff.py
  • HLT_Mu3 (similar to the HLT_Mu5 in the 1E31 menu)
  • HLT_Onia_8E29 --> see MuonTrack_quarkoniaHLTfilter_hlt_local.py
  • HLT_Onia --> use the HLT_Mu3 as you will have implemented for the 1E31 menu

after implementation it should look similar to PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31_HLT.py

... don't forget to compile:

scramv1 b

Note:
MuonTrack_quarkoniaHLTfilter_hlt_local.py is written for a CMSSW version before 3_4_1. You should pay particular attention to the following parameters:

process.SOMETHING = cms.EDFilter( "HLTLevel1GTSeed",
   ...
    L1UseL1TriggerObjectMaps = cms.bool( True ),
    L1NrBxInEvent = cms.int32( 5 ),
   ...
and
  process.hltOniaCkfTrackCandidates = cms.EDProducer( "CkfTrackCandidateMaker",
  ...
  maxNSeeds = cms.uint32( 100000 )
  ...

If they are missing, insert them!

c) Setup Onia2MuMuPAT Producer

Rerun the HLT with the script created above on one individual file.
cmsRun PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31_HLT.py

Check that everything works by processing the obtained ROOT file with a modified Onia2MuMuPAT producer:

cvs co HeavyFlavorAnalysis/Onia2MuMu

modify HeavyFlavorAnalysis/Onia2MuMu/python/onia2MuMuPAT_cff.py to check all the 4 new triggers (--> do we need to increase the size of the trigger array in the source code?)

scramv1 b

run the Onia2MuMuPAT with HeavyFlavorAnalysis/Onia2MuMu/test/onia2MuMuPAT_cfg.py:

cmsRun HeavyFlavorAnalysis/Onia2MuMu/test/onia2MuMuPAT_cfg.py

if the full chain works, process the full data set with CRAB by creating the following script:

cmsDriver.py PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31 -s HLT -n -1 --conditions MC_3XY_V16::All --no_exec --python_filename PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31_HLT.py --filein=PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_SIM_DIGI_L1_DIGI2RAW_RAW2DIGI_L1Reco_RECO.root --customise=HeavyFlavorAnalysis/Onia2MuMu/onia2MuMuPAT_cff.py

you can find another crab job in /afs/cern.ch/user/h/hwoehri/public/forFlorian

The following files have been used to produce the PAT tuple: .

Step 3: Setup JPsiAnalyzerPAT

jpsianalyzerpat_cfg.py is used to run the JPsiAnalyzerPAT analyzer, which by default produces 2 ROOT files.

a) Adapt JPsiAnalyzerPAT.cc

The 4 new trigger paths now appear in the PAT tuple and should be read with the JPsiAnalyzerPAT. For this some changes have to be made.
cmsRun jpsianalyzerpat_cfg.py

Step 4: Approve Trigger Path at TSG

In order to get the trigger approved, we need to study two issues on the signal side. Ideally, we should give answers to these two questions by "pretending" to be in the 8E29 and also in the 1E31 menu, but the 1E31 menu is clearly the "target menu". For this reason we implement the HLT_Mu3 and our trigger in both menu versions (remember that the differences are the L1 trigger primitives). In the 1E31 menu we don't have the HLT_Mu3, but the HLT_Mu5 prescaled by a factor of 20...

a) Efficiency of the Signal

The "efficiency" with which we will collect our signal, i.e. what is the fraction of all J/psi's that we will trigger on with this trigger? --> this should be done differentially in pT(J/psi) and rapidity, y(J/psi) (NOT pseudo-rapidity). We should also provide a 2D efficiency histogram: pT(J/psi) vs eta(J/psi). (When you do that you should be careful with the error propagation: the event passing the trigger (--> nominator) also enters the denominator... You can use the TGraphAsymmErrors::BayesDivide() method to get the error propagation correctly done).

When we first did this study, we determined this trigger efficiency w.r.t. RECONSTRUCTED J/psi's, and not GENERATED J/psi's (so as not to mix the effect of the limited acceptance into the trigger efficiency). We will probably need 3 sets of efficiency histograms, one for each of the possible J/psi categories, depending on whether the muons are reconstructed as global or tracker muons (--> J/psi categories: glb-glb, glb-trk, trk-trk). The Onia2MuMu(PAT) already gives us exclusive samples of global and tracker muons, so there shouldn't be any complication there...

aa) Plot basic Histograms

For the glb-glb, glb-trk and trk-trk muon combinations you will have to book 2 sets of histograms:

  • fill for all RECO J/psi's
  • fill with RECO J/psi's which have a match to the trigger objects.

Mind: do a matching to the generated muons as well (comparing RECO and GEN) to be sure that we are studying J/psi muons and not background muons...
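One common way to do this RECO-GEN matching (not prescribed by the package; the helper below, its accessors and its cut value are only an example) is a simple Delta R match between the reconstructed and the generated muon:

  import math

  def delta_r(eta1, phi1, eta2, phi2):
      """dR = sqrt(dEta^2 + dPhi^2), with the phi difference wrapped into [0, pi]"""
      d_eta = eta1 - eta2
      d_phi = math.acos(math.cos(phi1 - phi2))
      return math.hypot(d_eta, d_phi)

  def is_jpsi_muon(reco_mu, gen_mu, dr_max=0.03):
      """example helper: accept the RECO muon if it lies within dr_max of the GEN muon"""
      return delta_r(reco_mu.eta(), reco_mu.phi(), gen_mu.eta(), gen_mu.phi()) < dr_max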

We should have:

  • a 1D histo versus pT
  • a 1D versus rapidity
  • a 2D: pT vs rap

So, in total you will have the following histograms:

const Int_t kNbTrig = 7; //HLT_Mu3_8E29, HLT_Mu3, HLT_Onia_8E29, HLT_Onia, HLT_Mu5, HLT_DoubleMu0, HLT_DoubleMu3
const Int_t kNbCat = 3; //glb-glb, glb-trk, trk-trk
const Int_t kNbSet = 2; //RECO, RECO+HLT
TH1F *hPt[kNbTrig][kNbCat][kNbSet];
TH1F *hRap[kNbTrig][kNbCat][kNbSet];
TH2F *hPt_Rap[kNbTrig][kNbCat][kNbSet];

Note: use unequal bins for the pT dimension, with finer segmentation at low pT.
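A minimal PyROOT sketch of such a variable-width pT binning; the bin edges below are only an example, not a prescription:

  import ROOT
  from array import array

  # example pT bin edges in GeV/c: finer at low pT, coarser at high pT (adapt to the analysis)
  pt_bins = array('d', [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 10.0, 15.0, 30.0])

  hPt_example = ROOT.TH1F("hPt_example", ";p_{T}(J/#psi) [GeV/c];entries",
                          len(pt_bins) - 1, pt_bins)
  hPt_example.Sumw2()   # keep per-bin errors for the efficiency division later on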

These histograms should be saved in the file "Histos.root" for further processing.

ab) Plot the Signal Efficiency

In a second root-based macro you then loop over these sets of histograms and build the efficiency curves simply as:
TGraphAsymmErrors *gEffPt = new TGraphAsymmErrors();
gEffPt->BayesDivide(hPt[trig][cat][1], hPt[trig][cat][0]);
This method does not exist for 2D histograms. Also, the visualisation of error bars in 2D histograms is difficult, so for simplicity one can simply get the 2D efficiency histogram by dividing one histogram by the other.
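A minimal sketch of this 2D division; the histogram names are hypothetical (use whatever names you chose when writing Histos.root), and the "B" option for binomial errors is an optional extra, not part of the prescription above:

  import ROOT

  f = ROOT.TFile.Open("Histos.root")
  h_num = f.Get("hPt_Rap_HLT_Onia_glbglb_recoHLT")   # hypothetical name: RECO+HLT
  h_den = f.Get("hPt_Rap_HLT_Onia_glbglb_reco")      # hypothetical name: all RECO

  hEff2D = h_num.Clone("hEff2D")
  hEff2D.Divide(h_num, h_den, 1.0, 1.0, "B")         # "B" = binomial errors for efficiency-like ratios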

We should then compare the efficiency curves of the individual trigger paths. Ideally we should do this extracting from the 2D histogram various slices in rapidity: we expect, namely, to see an increased efficiency of our trigger w.r.t. DoubleMu0 / 3 at mid-rapidity. On the other hand, the overall efficiency should not be too different w.r.t. HLT_Mu3. (I realise that if we want to do slices in rapidity, we will need to follow the above prescription in parenthesis, otherwise we will not have the errors correctly propagated).

ac) Trigger Fraction

Last but not least, we also want to know, integrated over pT and rapidity, which fraction of RECONSTRUCTED J/psi's is also triggered by the individual paths. Do we keep 80% or 60% with our trigger? (So, we just need to divide the integrals of the two sets of histograms.)
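A short sketch of this integrated fraction, using the same kind of hypothetical histogram names as above but for the 1D pT histograms:

  import ROOT

  f = ROOT.TFile.Open("Histos.root")
  h_reco_hlt = f.Get("hPt_HLT_Onia_glbglb_recoHLT")  # hypothetical name: RECO J/psi's that also fired the trigger
  h_reco     = f.Get("hPt_HLT_Onia_glbglb_reco")     # hypothetical name: all RECO J/psi's

  fraction = h_reco_hlt.Integral() / h_reco.Integral()
  print("fraction of RECO J/psi's kept by this trigger: %.1f%%" % (100.0 * fraction))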

b) Overlap of other Triggers

What is the overlap of triggering on our signal with OTHER quarkonium or single-muon triggers? This is why we also want to have the above triggers in our menu. We should see which fraction of reconstructed J/psi's (in the 3 categories) is also triggered by HLT_DoubleMu0, HLT_DoubleMu3 and HLT_Mu5, and which fraction of reconstructable events we lose w.r.t. HLT_Mu3.

Book a second set of 2D histograms: TH2F *hOverlap_Pt_Rap[kNbTrig][kNbCat]; Every time our trigger has fired, fill the 2D histogram of our trigger with the HLT J/psi's pT and rapidity. Then check whether the other triggers have fired as well; for those that did, fill the corresponding histograms with the same HLT J/psi pT and rapidity cell. Save these histograms again.

In an offline macro you then simply divide these histograms by the one of our trigger, which gives, for every cell in rapidity-pT, the fraction of overlap. To get an integrated value, simply divide the total entries of the respective histograms.
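A minimal sketch of this overlap calculation, again with hypothetical histogram names for one trigger/category combination:

  import ROOT

  f = ROOT.TFile.Open("Histos.root")
  h_our   = f.Get("hOverlap_Pt_Rap_HLT_Onia_glbglb")       # hypothetical name: our trigger
  h_other = f.Get("hOverlap_Pt_Rap_HLT_DoubleMu0_glbglb")  # hypothetical name: one of the other triggers

  # per-cell overlap fraction in (pT, rapidity)
  hOverlapFrac = h_other.Clone("hOverlapFrac")
  hOverlapFrac.Divide(h_our)

  # integrated overlap fraction
  print("integrated overlap: %.2f" % (h_other.GetEntries() / h_our.GetEntries()))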

Re - RECO J/Psi sample:

check out CMSSW_3_5_2 and then check out the following packages:
cvs co -r V00-03-04 RecoMuon/GlobalTrackingTools
cvs co -r V01-02-05 RecoMuon/L3MuonProducer
cvs co -r V01-00-09 RecoMuon/GlobalTrackFinder
CMSSW_3_5_4 and later releases require checking out
cvs co -r V01-04-00 MuonAnalysis/MuonAssociators  
to do the association to L1 objects.

Sample location

1) JPsi generation @ 7 TeV:
  • 1 muon must have a total momentum p > 2.5 GeV/c
  • CMSSW_3_4_1, MC_3XY_V16, 1M events
  • DBS location: /PYTHIA6_JPsiWithFSR_7TeV_test/fat-PYTHIA6_JPsiWithFSR_7TeV_test-4bb32f31359334a178699cf3ff9e123e/USER DBS instance: cms_dbs_ph_analysis_02
    filterEfficiency = 0.19, sigma*BR = 1218 nb
  • Onia2MuMuPAT tuples: to be created AFTER rerunning the HLT with the new "Onia HLT paths"

Collision10 - OniaSkim

For data (run > 134987) use the prompt skim:

/MinimumBias/Commissioning10-CS_Onia-v9/USER

  • x = 9 from run 133532 (CMSSW_3_5_7, details in this HN)
  • x = 9 from run 134987 (CMSSW_3_5_8patch3, details in this HN)

| Reprocessing Date | Release | Global Tag | Type | Run Range | JSON | Validation | Request/Ann Info |
| April 20th (data) | 3_5_7 | GR_R_35X_V7A::All | prod | 132440-133532 | JSONApr20 | here | here |
| May 5th (data) | 3_6_0_patch2 | GR_R_36X_V7A::All | pre-prod | n/a | | here | here |
| May 6th (data) | 3_5_8patch3 | GR_R_35X_V8B::All | prod | 132440-134987 | JSONMay6th, cfg | -- | here |
| May 8th (data) | 3_5_8patch3 | GR_R_35X_V8B::All | prod | 2009 collisions | -- | -- | here |
| May 27th (data) | 361patch3 | GR_R_36X_V11A::All | pre-prod | n/a | -- | here | here |
| May XX (data) | 361patch3 | GR_R_36X_V11A::All | prod | -- | -- | -- | here |
| May XX (data) | 370 | GR_R_37X_V5A::All | pre-prod | n/a | -- | -- | here |

-- FlorianTeischinger - Jan 2008

Topic attachments
| Attachment | Size | Date |
| Barrel_wheels.png | 41.3 K | 2008-08-18 |
| DTTF_hwstatus_panel.png | 87.5 K | 2008-11-14 |
| DTTF_shifterpanel_outputTr.png | 13.5 K | 2008-11-14 |
| DTTF_shifterpanel_status.png | 78.0 K | 2008-11-14 |
| Dttf_BARRELstatus_panel.jpg | 85.1 K | 2008-12-10 |
| Dttf_HW_Status_PanelNEW.jpg | 178.5 K | 2008-12-10 |
| MuonTrack_quarkoniaHLTfilter_hlt_local.py.txt | 10.8 K | 2010-02-02 |
| PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_MC.py.txt | 9.8 K | 2010-02-02 |
| PYTHIA6_JPsiWithFSR_7TeV_cff_py_GEN_SIM_DIGI_L1_DIGI2RAW_RAW2DIGI_L1Reco_RECO_MC.py.txt | 10.9 K | 2010-02-02 |
| PYTHIA6_JPsiWithFSR_MC_3XY_V16_1E31_HLT.py.txt | 26.2 K | 2010-02-04 |
| arrow_button.png | 1.0 K | 2008-08-14 |
| crab.cfg | 0.9 K | 2010-02-02 |
| datastream.gif | 11.0 K | 2008-08-17 |
| datastream2.gif | 17.3 K | 2008-08-17 |
| datastream3.gif | 16.9 K | 2008-08-17 |
| fl2.png | 120.7 K | 2008-08-14 |
| flashlist.png | 97.7 K | 2008-08-14 |
| jpsianalyzerpat_cfg.py.txt | 1.7 K | 2010-02-04 |
| onia2MuMuPAT_cff.py.txt | 7.3 K | 2010-02-04 |
| onia2MuMuPAT_cfg.py.txt | 3.1 K | 2010-02-04 |
| panel.png | 36.5 K | 2008-08-14 |
| popup.png | 104.5 K | 2008-08-14 |
| popup2.png | 91.0 K | 2008-08-14 |