Spike Killer Study How To

Parts of How To

  • Before starting and helpful tips
  • TPG Analysis with varying pedestals
  • L1EG Analysis with chosen pedestal and varying laser conditions
  • L1EG Analysis with varying pedestals and the same laser conditions

Before starting and helpful tips

  • Make sure you have a lxplus account
  • Make sure you have a grid certificate (to submit batch jobs)
    • voms-proxy-init --voms cms activates it
  • Make sure you have enough space
    • Need an eos directory
      • eos/cms/store/caf/user is the directory the ecaltrg scripts use
      • eos/user works, just requires more changes to the code
    • Good idea to use your work directory instead of your home directory
  • To scp from the ecaltrg account you will need the password.
    • The package from GitHub is likely not up to date, so it is recommended to scp any script you need to use from the ecaltrg account.
  • When looking for the latest files, ls -alptrh is useful.
  • To look for words in a file (most likely to make sure you replaced all of the eos directories properly), grep -R "eos" is useful.
  • Make sure to recompile whenever you change a .cc file (other files are fine).
  • These scripts can take a while to run. Launch them on a separate screen. Quick tutorial:
    • Type 'screen' to start a screen session.
    • Launch your command, then type Ctrl+a d to detach the screen.
    • Type 'screen -r' to reattach the screen; you must be on the same lxplus machine (e.g. lxplus044) that the screen was detached from.
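As a quick, safe demonstration of the grep and ls tips above, the following can be run anywhere; every path and file name is made up for the demo:

```shell
# Scratch demo of the tips above; the script name and eos path are made up.
demo=$(mktemp -d)
cd "$demo"
# A script that still points at the shared eos area:
echo 'outdir=/eos/cms/store/caf/user/ecaltrg/TPG' > runTPG.sh
# Find every leftover eos path you still need to replace:
grep -R "eos" .
# List files sorted by modification time, newest last:
ls -alptrh
```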

TPG Analysis with varying pedestals

First you need a CMSSW release. It is recommended to set it up with a name that is descriptive of the project, in this case it is named the same as the package in the ecaltrg account. Go to the directory you will be using and do:

scram p -n TPLasVal_901 CMSSW CMSSW_9_0_1 
cd TPLasVal_901/src
git cms-init

Now add the required packages and compile:

git cms-addpkg CondTools/Ecal
git cms-addpkg CalibCalorimetry/EcalTPGTools
git clone https://gitlab.cern.ch/ECALPFG/EcalTPGAnalysis.git
scram b -j 8

Remember that the git packages are likely not the most up to date, so scp-ing any script you need to work with from the ecaltrg account is recommended.


The end goal of this section is to create a parameter .txt file. For this you first need to get the newest version of the files. In the case of the hourly transparencies, make sure you get all of the runs you want to use. For example, I compared runs 297293 and 297488 and so did:

cd TPLasVal_901/src/CalibCalorimetry/EcalTPGTools/test/
scp ecaltrg@lxplus.cern.ch:/afs/cern.ch/work/e/ecaltrg/TPLasVal_901/src/CalibCalorimetry/EcalTPGTools/test/produceTPGParameters_beamv6_transparency_spikekill_2017_script_298481.py . 
mkdir Transparency_2017
cd Transparency_2017
scp ecaltrg@lxplus.cern.ch:/afs/cern.ch/work/e/ecaltrg/TPLasVal_901/src/usercode/DBDump/utils/hourly_297293 . 
scp ecaltrg@lxplus.cern.ch:/afs/cern.ch/work/e/ecaltrg/TPLasVal_901/src/usercode/DBDump/utils/hourly_297488 .
cd .. 

The next step is to modify the .py file. Rename it to something appropriate, such as produceTPGParameters_beamv6_transparency_spikekill_2017_script_297488_Pedes_$pedval.py, where $pedval is the pedestal number (I used 296917 and Default). You will likely need to make two .py files, since it is a comparison, but the same procedure works for both. Open the renamed file for editing. Required changes are:

  • Change the outfile name (it is the .txt file name) to match the .py file name.
  • Change transparency_corrections = cms.string('../../../usercode/DBDump/utils/hourly_298481') so it reads your hourly transparency (for example transparency_corrections = cms.string('Transparency_2017/hourly_298560'))
  • For the default pedestal, comment out the line: connect = cms.string('sqlite_file:////afs/cern.ch/work/e/ecaltrg/public/ECAL/PEDES/Pedes_298481.db').
  • For different pedestal values, change it to point to the desired .db file. It is recommended to check that the file exists (e.g. you probably have to change PEDES to PEDES_2017).

After the changes are saved, run the .py file with cmsRun. Make sure you create all desired .txt files. Next up is to make the database (.db) files.
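For example, the naming convention above can be scripted as follows. The run and pedestal values are hypothetical, and the cmsRun call is guarded because cmsRun only exists inside a CMSSW environment (after cmsenv):

```shell
# Hypothetical values following the renaming convention above; substitute your own.
runnb=297488
pedval=296917   # or "Default"
cfg=produceTPGParameters_beamv6_transparency_spikekill_2017_script_${runnb}_Pedes_${pedval}.py
echo "Will run: ${cfg}"
# cmsRun exists only inside a CMSSW environment, so the call is guarded:
if command -v cmsRun >/dev/null 2>&1; then
  cmsRun "${cfg}"   # writes the parameter .txt file named in its 'outfile' setting
fi
```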


If you are still in the CalibCalorimetry directory,

cd ../../../CondTools/Ecal/test/

The file we want to run to produce the .db files is sqlite_prompt.sh. So we need to scp that file and every file it uses from the ecaltrg account to make sure we have the most up-to-date files. A summary of the files used and the changes to be made:

  • sqlite_prompt.sh
    • Alter the rm-mv portion so it moves the .db files to a location of your choice.
  • EcalTPGLinPed_cfg.py
    • Add to the Ecal/python directory
  • Ecal_Laser_weekly_Linearization_cfg.py
    • Add to the Ecal/python directory
  • Ecal_Laser_weekly_cfg.py
    • Add to the Ecal/python directory
  • EcalLaser_weekly_Handler.h
    • Add to the Ecal/src directory
  • EcalLaser_weekly_Handler.cc
    • Add to the Ecal/src directory
    • Change so it uses your hourly transparency
  • EcalLaser_weekly_Linearization.cc
    • Add to the Ecal/src directory
    • Change so it uses your .txt file (from the previous section)
  • EcalTPGLinPed.cc
    • Add to the Ecal/src directory
    • Change so it uses your .txt file (from the previous section)
  • RunStartTime_$runnb
    • Add to the Ecal/python directory
    • Make sure to add the one(s) that matches the run(s) you are using.

After all necessary changes are made and everything is recompiled (since some .cc files were changed), run it on all runs you want to analyze. For example, for run 297488 do:

./sqlite_prompt.sh 297488

Make sure the .db files end up in the location you chose. If you plan to compare different pedestals while holding everything else the same, place them in properly named directories or rename the .db files appropriately.
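One way to keep the pedestal variants apart is one directory per pedestal tag. The sketch below simulates this in a scratch area; the .db file name follows the EcalTPG_${runnb}_moved_to_1.db pattern used later in runL1NtuplebatchLC_pedes.sh, and the directory names are just a suggestion:

```shell
# Simulated in a scratch area; in practice the .db files come out of sqlite_prompt.sh.
cd "$(mktemp -d)"
touch EcalTPG_297488_moved_to_1.db      # placeholder for the real sqlite output
mkdir -p Pedes_296917                   # one directory per pedestal tag
mv EcalTPG_297488_moved_to_1.db Pedes_296917/
ls Pedes_296917/
```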


Go to the /EcalTPGAnalysis/Scripts/TriggerAnalysis/ directory. The file we want to run is tpganalysis_prompt.sh, so we need to scp that file and every file it uses from the ecaltrg account to make sure we have the most up-to-date files. Most of the changes consist of changing the eos directories to your own eos directories if you are using them. A summary of the files used and the changes to be made:

  • tpganalysis_prompt.sh
    • Change eos directory if needed
    • Change the queue from 'cmscaf1nd' to '8nh' or similar if needed
    • Change the mail to go to your email
  • runTPGbatchLC_prompt.sh
    • Change eos directory if needed
    • Change the queue from 'cmscaf1nd' to '8nh' or similar if needed
    • Change the sqlite file to your .db file
    • Might need to add 'mkdir -p /eos/user/c/cschiber/TPG/${runnb}' (with your user name and initial) if that eos directory is causing problems.
    • Delete /afs/cern.ch/project/eos/installation/0.3.84-aquamarine/bin/eos.select from in front of the mkdir
    • Make sure to copy your grid proxy (for example x509up_u44852) from where you have it
  • merge_and_makePlots_prompt.sh
    • Change eos directory if needed
  • zombie.sh
    • No changes needed
  • x509up_u44852
    • Need one (in the format of x509up_uXXXXX)
    • Use voms-proxy-init --voms cms to create one as /tmp/x509up_uXXXXX, then copy it to someplace in your directory
  • ecalTPGAnalysisTEMPLATE_weekly.py
    • Place in configuration directory
  • batch_template_weekly.sh
    • Place in configuration directory
    • Change eos directory if needed
  • mergeTPGAnalysis.sh
    • Change output_dir to your eos directory
    • If you are using your eos directory, also change output_root_dir into output_root_dir=${output_dir}
    • Also change xrdcp to cp (make sure to get all three) if needed (note: xrdcp is meant to be used with eos directories, so only change it if you are using a directory where it does not work)
  • makeTrigPrimPlots.sh
    • Change eos directory if needed
    • Change eos cp to cp if needed
  • validationplots_prompt.sh
    • Place in EcalTPGAnalysis/TPGPlotting/plots
    • Change to your www directory if needed (you can get a www directory in the lxplus eos/user area, which places the plots online)
  • comparePlots_prompt.C
    • Place in EcalTPGAnalysis/TPGPlotting/plots
    • Change all www directories to yours
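The grid-proxy item in the list above can be sketched as follows. The destination path is a placeholder, and the voms-proxy-init call is guarded since it only exists on grid-enabled machines such as lxplus:

```shell
# Create/refresh the proxy (guarded: voms-proxy-init exists only on grid-enabled hosts).
if command -v voms-proxy-init >/dev/null 2>&1; then
  voms-proxy-init --voms cms
fi
# The proxy file is named after your numeric user id (x509up_uXXXXX):
proxy=/tmp/x509up_u$(id -u)
echo "Proxy file: ${proxy}"
# Copy it somewhere in your own area (destination path is a placeholder):
# cp "${proxy}" /afs/cern.ch/work/<initial>/<user>/
```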

To run, do:

./tpganalysis_prompt.sh 297293 297488 274240 19

The first two parameters are the two runs you are comparing, the third is the reference run, and the fourth is the week number. This command should make the ntuples (check whatever eos directory they are supposed to be in). To produce the plots, go to TPGPlotting/plots and run (using the same input parameters):

./validationplots_prompt.sh 297293 297488 274240 19

If the plots show up in the directory but not online, make sure you are sharing the www folder on eosbox. It is recommended to first try to recreate some existing plots to check that everything is running properly.

L1EG Analysis with chosen pedestal and varying laser conditions

Once again we need a CMSSW release. In the desired directory do:

scram p -n L1EGLasVal_901 CMSSW CMSSW_9_0_1 
cd L1EGLasVal_901/src
git cms-init

For this one I just scp'd everything I needed from the ecaltrg directory of the same name. Do not forget to compile. The changes are all similar to what was required for the TPG analysis. A summary of the changes:

  • l1eganalysis_prompt.sh
    • If needed change the queue cmscaf1nd to 8nh
    • If needed change the eos directory to your eos directory
    • Change email to your email
  • zombie.sh
    • Remove root://eoscms/ if you changed the eos directory to a local one
  • runL1NtuplebatchLC_prompt.sh
    • Change l1egdir to where your L1EGLasVal_901 is
    • Change the queue to 8nh if needed
    • Change the outdir to your eos dir if needed
    • Change the sqlite file to your .db file (the same .db file made in the TPG analysis section)
    • Change the filename (in the sed portion) if needed
  • batch_TEMPLATE.sh
    • In configuration directory
    • Change eos directory if needed
  • l1Ntuple_RAW2DIGI_TEMPLATE.py
    • In configuration directory
  • merge_prompt.sh
    • Change eos directory if needed
  • mergeL1Ntuples_prompt.sh
    • Comment out the eos in eos select and remove all the eos in eos cp
    • If needed change eos directory and remove the root://eoscms// if it is a local directory
    • Change xrdcp to cp if needed
  • plotRate_prompt.sh
    • Change all outdirs to yours, for both eos and www
  • Plot_RateDat_prompt.C
    • Change all www and eos directories to yours

To run, use the same format of command as the TPG analysis, with the year as an additional final argument:

./l1eganalysis_prompt.sh 297293 297488 283171 19 2017

It does not need any additional scripts to create the plots.

L1EG Analysis with varying pedestals and the same laser conditions

This analysis is just a minor modification of the regular L1EG analysis. The largest modification is using two different pedestal values, which requires two different .db files. I solved this by placing the files in different directories without bothering to change the names of the .db files. The modified files all have _prompt replaced by _pedes. A summary of additional changes:

  • l1eganalysis_pedes.sh
    • Modify so the input is of the format ./l1eganalysis_pedes.sh
    • Make sure to make all the necessary changes to discern between pedestal values instead of laser correction values (anything that was runnb_lc1 or runnb_lc2 becomes runnb_lc; you need to add pedval1 or pedval2 to names or inputs)
  • zombie_pedes.sh
    • Change the file names to match the ones made in l1eganalysis_pedes.sh
  • runL1NtuplebatchLC_pedes.sh
    • Add input variable to determine pedestal value used
    • Change directory names to include pedestal value (including the runNb of the sed section)
    • Make sure the sqlite file uses the right .db file (for example, use /TPLasVal_901/src/CondTools/Ecal/test/Pedes_${pedval}/EcalTPG_${runnb_lc_iov}_moved_to_1.db if the .db file is placed in said directory)
  • batch_TEMPLATE_pedes.sh
    • In configuration directory
    • No changes are actually needed as long as the runNb from runL1NtuplebatchLC_pedes includes the pedestal value
  • l1Ntuple_RAW2DIGI_TEMPLATE.py
    • In configuration directory
    • No changes required
  • merge_pedes.sh
    • Change directory name to include pedestal value
    • mergeL1Ntuples_pedes.sh will be changed to include the pedestal value as input, so include it (for example add -p ${ped_val})
  • mergeL1Ntuples_pedes.sh
    • Need to modify to include the pedestal value as a variable and include it in the run_id
  • plotRate_pedes.sh
    • Change so it only has one runnb_lc and two pedvals (changes lots of directory names, and the input to Plot_RateDat_pedes.C)
  • Plot_RateDat_pedes.C
    • Change the main function to void Plot_RateDat_pedes(int runnb, int runnb_Ref, int ped_Old, int ped_New, int Week, int Year), and modify it so the variables New and Old all refer to the new and old pedestals.
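Since each modified script is just the _prompt version with _prompt replaced by _pedes in its name, the copies can be made in one loop. This is a sketch run in a scratch area with stand-in file names; in practice, run the loop in the directory holding the _prompt scripts:

```shell
# Demo in a scratch area with stand-in files; run the loop where the _prompt scripts live.
cd "$(mktemp -d)"
touch l1eganalysis_prompt.sh merge_prompt.sh    # stand-ins for the real scripts
for f in *_prompt.sh; do
  # POSIX-safe rename of the suffix _prompt -> _pedes in the copy's name:
  cp "$f" "$(echo "$f" | sed 's/_prompt/_pedes/')"
done
ls *_pedes.sh
```

The copies then still need the per-file edits listed above; the loop only takes care of the renaming convention.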

-- CatherineCenedraSchiber - 2017-08-10

Topic revision: r14 - 2017-11-27 - CatherineCenedraSchiber