2012 photon energy scale and resolution with Z→μμγ events

Step 1 : Framework installation

Set up the correct environment

      source ${VO_CMS_SW_DIR}/cmsset_default.sh
      source /afs/cern.ch/cms/LCG/LCG-2/UI/cms_ui_env.sh
      source /afs/cern.ch/cms/ccs/wm/scripts/Crab/crab.sh
      export SCRAM_ARCH="slc5_amd64_gcc462" 

  • 2012 :
      cd /.../CMSSW_5_3_X/src 

  • 2011 :
      cd /.../CMSSW_4_2_X/src

To install a CMSSW version use :

      scram project CMSSW CMSSW_X_X_X 

  • If you are not on lxplus, use these lines to check out code from CVS :
      export CVSROOT=:gserver:cmscvs.cern.ch:/local/reps/CMSSW
      kinit -5 login@CERN.CH

Step 2 : IpnTreeProducer and Regression installation

  • 2012 22Jan rereco, photon regression V3 + V4 + V5 (Not stable for now !!!):
      cd /.../CMSSW_5_3_X/src
      cvs co -r regressionMay18b RecoEgamma/EgammaTools
      cvs co -r V05-08-20 RecoEcal/EgammaCoreTools
      checkdeps -a
      git clone -b hggpaperV5 https://github.com/bendavid/GBRLikelihood.git HiggsAnalysis/GBRLikelihood
      git clone -b hggpaperV5 https://github.com/bendavid/GBRLikelihoodEGTools.git HiggsAnalysis/GBRLikelihoodEGTools
      git clone -b RECO_5_3_3_v4 https://github.com/IPNL-CMS/IpnTreeProducer.git
      (If you are at ccage, additionally run git checkout -b RECO_5_3_3_v4)
      scram build 
      cd /.../CMSSW_5_3_X/src/IpnTreeProducer/src/

  • 2012, photon regression V3 :
      cd /.../CMSSW_5_3_X/src
      cvs co -r regressionMay18b RecoEgamma/EgammaTools
      cvs co -r V05-08-20 RecoEcal/EgammaCoreTools
      checkdeps -a
      git clone -b RECO_5_3_3_v4 https://github.com/IPNL-CMS/IpnTreeProducer.git
      (If you are at ccage, additionally run git checkout -b RECO_5_3_3_v4)
      scram build 
      cd /.../CMSSW_5_3_X/src/IpnTreeProducer/src/

  • 2011, photon regression V2 :
      cd /.../CMSSW_4_2_X/src
      cvs co -r regression_Dec3d RecoEgamma/EgammaTools
      cvs co -r V00-02-04 CondFormats/EgammaObjects
      cvs co -r regression42x_Dec3 CondFormats/DataRecord
      addpkg RecoEcal/EgammaCoreTools
      cvs update -r V05-08-02  RecoEcal/EgammaCoreTools/src/EcalClusterTools.cc
      checkdeps -a
      git clone -b RECO_4_2_8_v4 https://github.com/IPNL-CMS/IpnTreeProducer.git
      (If you are at ccage, additionally run git checkout -b RECO_4_2_8_v4)
      scram build 
      cd /.../CMSSW_4_2_X/src/IpnTreeProducer/src/

Step 3: Creation of toto-uples :

Preliminary steps

      cd /.../CMSSW_X_X_X/src/IpnTreeProducer/test/
      git clone git@github.com:lsgandurra/IpnTreeProducer_scripts.git
      cd IpnTreeProducer_scripts

Modify the default code

You have to modify the different codes in IpnTreeProducer_scripts :

  • crab_*201*_*.cfg
    • lumi_mask, pset, user_remote_dir, ui_working_dir
  • the corresponding toto_*.py file
    • Set the correct Global Tag :
      • process.GlobalTag.globaltag = cms.string('HERE')

To find the correct Global Tag, you can use this command :

      dbsql "find dataset.tag where dataset like NAME_OF_YOUR_DATASET"
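The Global Tag edit in the pset can also be done from the command line; a minimal sketch of the substitution (the file name toto_example.py and the tag placeholder below are illustrative, not the real config or tag):

```shell
# Sketch: substitute the global tag into a pset file (file name and tag are placeholders)
cfg=toto_example.py
echo "process.GlobalTag.globaltag = cms.string('HERE')" > "$cfg"
sed -i "s/cms.string('HERE')/cms.string('YOUR_GLOBAL_TAG::All')/" "$cfg"
grep globaltag "$cfg"
```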

Running and monitoring of grid jobs

      crab -create -cfg crab_X_.cfg
      crab -c "your directory" -submit
      crab -c "your directory" -getoutput
      crab -c "your directory" -report

Check that everything is ok and create the listFiles

      source listGoodFiles.sh "your directory"
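listGoodFiles.sh is not reproduced here, but the idea is to keep only the usable ROOT outputs of the grid jobs; a minimal sketch of that logic (the directory layout and file names below are assumptions, not the actual script):

```shell
# Sketch: collect non-empty .root outputs from a results directory into listFiles.txt
dir=res_example                      # stand-in for "your directory"/res
mkdir -p "$dir"
: > "$dir/toto_1.root"               # empty file, simulating a failed job
echo dummy > "$dir/toto_2.root"      # non-empty file, simulating a good job
: > listFiles.txt
for f in "$dir"/*.root; do
    [ -s "$f" ] && echo "$f" >> listFiles.txt    # keep only non-empty files
done
cat listFiles.txt
```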

Numbers after each cut

  • 2012 :

      source getPathNumbers_2012.sh

  • 2011 :

      source getPathNumbers_2011.sh

Step 4: Creation of miniTrees :

Download the code

      cd /.../CMSSW_X_X_X/src/

  • 2012 Selection :

      git clone -b 2012_official_v1 git@github.com:lsgandurra/ZmmgStudies.git

  • 2011 Selection :

      git clone -b 2011_official_v1 git@github.com:lsgandurra/ZmmgStudies.git


      cd /.../CMSSW_X_X_X/src/ZmmgStudies/Selection
      ln -s /.../CMSSW_X_X_X/src/IpnTreeProducer/interface/ .
      ln -s /.../CMSSW_X_X_X/src/IpnTreeProducer/src/libToto.so .
      make clean && make

What you MUST check before running the code

You must verify :

  • in Selection_miniTree.h that the pileup weights are ok.
  • that you have chosen the correct set of SC corrections.
  • that the Rochester muon corrections are up to date (see Rochester muon momentum corrections).
  • the lumi numbers in Selection_miniTree.C (search integratedLuminosity).
  • the cross sections for the different processes in Selection_miniTree.C (search XSection*).
  • that the initial numbers of MC events correspond to your dataset in Selection_miniTree.C (search InitialNumber*); they correspond to the number of events before running the toto-uples.
  • the scale factors for the HLT and the tight muID in Selection_miniTree.h (search weight_hlt_scaleFactors and weight_tightMuId_scaleFactors).
  • the rescaling of r9 in Selection_miniTree.h (search doR9Rescaling).

Running the code

First, you need to copy the listFiles* you created with listGoodFiles.sh into /.../CMSSW_X_X_X/src/ZmmgStudies/Selection. Then, you have to modify lanceurJobs.sh :

  • Put the samples corresponding to your listFiles*.
  • Choose the correct pileup_set, lumi_set, isZgamma (0 = data, 1 = MC FSR, 2 = MC non-FSR, 3 = WJets and TTJets) and scCorrection (default = 0, regression SC corrections = MITregression).
  • For 2012, put lowMuMuCut=35 and highMuMuCut=9999. For 2011, put lowMuMuCut=40 and highMuMuCut=80.
  • If you don't want to run systematics, leave itoy=0 and inj_resolution=0.
  • You can modify the name of the output.
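As an illustration, a 2012-data setting of those options could look like the fragment below (the variable names are taken from the text, but the exact layout of lanceurJobs.sh may differ):

```shell
# Illustrative option block for 2012 data (values from the text, layout assumed)
isZgamma=0            # 0 = data, 1 = MC FSR, 2 = MC non-FSR, 3 = WJets and TTJets
scCorrection=0        # 0 = default, MITregression = regression SC corrections
lowMuMuCut=35         # 2012 dimuon mass window (2011: 40 to 80)
highMuMuCut=9999
itoy=0                # no systematics toys
inj_resolution=0      # no injected resolution
echo "$isZgamma $scCorrection $lowMuMuCut $highMuMuCut"
```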

Finally, to run the code :

      source lanceurJobs.sh

Check and merge outputs

When all the jobs are done and if you have no failed jobs (ls *.fail), you can merge the miniTrees :

  • in lanceurJobs.sh :
    • Comment qsub batchJob.sh ....
    • Uncomment hadd miniTree_...

Then, do :

      source lanceurJobs.sh

Finally :

  • in lanceurJobs.sh :
    • Comment hadd miniTree_...
    • Uncomment the 3 lines beginning by mv ...

And then, do :

      source lanceurJobs.sh
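The no-failed-jobs check before the merge can be scripted; a minimal sketch of the guard (the hadd command is commented out and the miniTree names are placeholders):

```shell
# Sketch: refuse to merge while *.fail marker files are present
if ls *.fail >/dev/null 2>&1; then
    status="failed jobs remain: resubmit before merging"
else
    status="no failed jobs: safe to run the hadd step"
    # hadd miniTree_total.root miniTree_part*.root   # placeholder names
fi
echo "$status"
```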

Generation of the events list (optional) :

You can modify the cuts in EventsList.C. Then, run this code :

      source EventsList.sh miniTree_name

Creation of RECO samples of our selection (optional) (CVS version):

In CMSSW_X_X_X/src/ do :

      cvs co -r CMSSW_X_X_X PhysicsTools/Utilities
      scram build

Copy the outputs (the .txt files) of EventsList.sh into PhysicsTools/Utilities/scripts. Then run edmPickEvents.py as follows (before creating the crab jobs, you can modify pickevents_crab.config) :

      edmPickEvents.py "Dataset_name" events.txt --crab 
      crab -create -cfg pickevents_crab.config
      crab -submit
      crab -getoutput

A complete description of pick events can be found here: How to Pick Events

Step 5 : Energy Scale extraction

Check if everything is ok :

      cd /.../CMSSW_X_X_X/src/ZmmgStudies/EnergyScaleAndResolution/

  • You have first to edit SFits.cpp and put the miniTree*.root you want instead of the default ones.
  • SFits.cpp is the main macro. It loads functions.h (general functions + χ2 + ... ) and fitFunctions.h.
  • Compilation :

      source /afs/cern.ch/sw/lcg/external/gcc/4.3.2/x86_64-slc5/setup.sh
      source /afs/cern.ch/sw/lcg/app/releases/ROOT/5.34.04/x86_64-slc5-gcc43-opt/root/bin/thisroot.sh
      make clean && make

  • Edit launcherAllBatch.sh and choose what you want to run :
    • dataType : MC, data.
    • fitVariable : mmg_s, mmg_s_true, Mmumugamma, mmg_s_surface ...
    • fitPercentage (-> different fit ranges) : default 60 to 100.
    • injectedResolution : should correspond to the one you applied in the miniTrees (default is 0).
    • You should change the name of the output directory (Results_v6_RecoEnergy for now).
    • Choose the cut variable : default is Photon_Et.
    • Choose the number of bins (in cut variable) : default is LimitesAllPtOneBin.txt

  • Edit batchJob.sh :
    • Change the HOMEDIR directory and the set-up environment file (default 537_RECO_5_3_3_v4.sh).
    • Under the line COPY EXECUTABLE TO WORKER, copy what you need (.exe, *.txt, *.sh, miniTree.root).
    • You can remove or add fit functions (default : voigtian, cruijff).
    • Change the different paths under the line GET BACK OUTPUT FILES TO SPS AND REMOVE THEM FROM DISTANT DIR.

  • Then you can launch jobs on batch :

      source launcherAllBatch.sh

Verify that all the jobs are done and make the plots :

  • To check that all the jobs ran correctly, you can use :

      ./CheckAllFits.exe directory_name dataType fitVariable cutVariable binFileName lowFitRange highFitRange injectedResolution

    • If you have some failed jobs, this macro will generate a file : jobsToResubmit*.sh. You must then do :

      source jobsToResubmit*.sh

  • Do the plots and select the best fit ranges (+ do the fit ranges systematics) :
    • Edit launcherCombinedPlotter.sh and choose the good directory, dataType, fitVariable and fitFunction.
    • Then do :

      source launcherCombinedPlotter.sh

    • You will then have, in your_directory/InjectedResolution_*Percent/dataType, two new directories :
      • Selected_Fits, where you will find a Summary*.txt file with all the useful information and the chosen fits for all the categories.
      • CombinedGraphs, containing some plots of p-values, fit-ranges...

Step 6 : Data / MC comparisons

      cd /.../CMSSW_X_X_X/src/ZmmgStudies/EnergyScaleAndResolution/

  • Edit Data_MC_Var_Comparison.cpp. Modify the cuts if you want and put your miniTrees (just the names, not the complete paths).
  • Then open launcherData_MC_Comparison.sh, choose the directoryName, and run :

      source launcherData_MC_Comparison.sh

  • When all the jobs are over, you will have a new directory called directoryName with all the plots inside.

Step 7 : Muon and fit function systematics :

Muon systematics

      cd /.../CMSSW_X_X_X/src/ZmmgStudies/Selection

  • Edit lanceurJobs_muSys.sh : put the correct samples and change the different options if you like (the options should be similar to the ones you have in lanceurJobs.sh). Then uncomment the line qsub... and comment the rest. You can change the miniTree name if you like.

  • Then launch the jobs on batch :

      source lanceurJobs_muSys.sh

  • If all the jobs are ok (no *.fail files in your directory), you can merge the miniTrees (comment the line with qsub... and uncomment the line with hadd..., then source lanceurJobs_muSys.sh).

  • You now have 100 new miniTrees per dataset. In each one, the muon momentum is smeared by a random factor drawn from a Gaussian of width DeltaK.
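The smearing amounts to scaling each muon momentum by a Gaussian factor; a rough, self-contained illustration of the formula (the DeltaK value and seed below are placeholders, not the values used in the Selection code):

```shell
# Illustration: smear a momentum p by a multiplicative Gaussian factor of width deltaK
awk 'BEGIN {
    srand(42)                       # placeholder seed
    p = 45.0; deltaK = 0.001        # placeholder values
    u1 = 1 - rand(); u2 = rand()    # uniforms in (0,1]
    g = sqrt(-2 * log(u1)) * cos(2 * 3.14159265 * u2)   # Box-Muller standard normal
    printf "%.4f\n", p * (1 + deltaK * g)
}'
```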

      cd /.../CMSSW_X_X_X/src/ZmmgStudies/EnergyScaleAndResolution/

  • Open launcherAllBatch_muSys.sh and change the directory name and other options if you like.
  • Edit batchJob_muSys.sh and put the correct paths, HOMEDIR and environment file. Then, under COPY EXECUTABLE TO WORKER, copy the files you need (.exe, .txt, .sh and miniTrees*.root).
  • Edit SFits.cpp and put the correct miniTrees after if(muSys > 0 && extraScale == "0"). Don't forget to run make !
  • Modify SelectedFits_MC.txt and SelectedFits_data.txt with the correct numbers stored in your_directory/InjectedResolution_*Percent/dataType/Selected_Fits/
  • Then you can launch the jobs on batch :

      source launcherAllBatch_muSys.sh

  • Once all the jobs are ok, open and modify launcherMuonsSystematics.sh and finally do :

      source launcherMuonsSystematics.sh

  • The results are stored in directory_MuSys/dataType/MuonSystematics/.

Systematics coming from the choice of the fit function (works with Voigtian as truth and Cruijff as test model for now) :

      cd /.../CMSSW_X_X_X/src/ZmmgStudies/EnergyScaleAndResolution/

  • Edit batchJob_fitFunctionSystematics.sh and change the different paths, the HOMEDIR and 537_RECO_5_3_3_v4.sh.
  • Then open launcherFitFunctionSystematics.sh and put the correct directory name and the variables you want to use, then run :

      source launcherFitFunctionSystematics.sh

  • If everything went well, you should now have a new directory : your_directory/InjectedResolution_*Percent/dataType/Selected_Fits/FitFunctionSystematics/. Inside, you will find some plots and a file called Summary_fitFunctionSystematics.txt with all the systematics for the different categories.
Topic revision: r6 - 2013-11-13 - LouisSgandurra