Setting up the test

Download the test jobs:

 cd src/
 mkdir Tests
 cd Tests/
 cvs co -d TestJobs UserCode/leo/Utilities/PerfToolkit/TestJobs/
 cd TestJobs/
 scram b



Check that the following files are available in your test directory:

Then, create a CRAB cfg like:

jobtype = cmssw
scheduler = glite
events_per_job = 50000
output_file = cmssw_net.log, cmssw_vmstat.log, cmssw.xml, cmssw.stdout
return_data = 1
ui_working_dir = Site.T2_CH_CSCS-Cfg.JPE-Dataset.RelValProdTTbarJobRobotMC_3XY_V24_JobRobotv1-EventsJo\
additional_input_files =
copy_data = 0
publish_data_name = name_you_prefer
rb = CERN
se_black_list = T0,T1
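With the cfg in place, the jobs are created and submitted with the standard crab command-line cycle. The sketch below shows that cycle in its usual CRAB (v2) form; it is guarded so it degrades gracefully on a machine where crab is not on the PATH:

```shell
#!/bin/sh
# Hedged sketch of the usual CRAB (v2) submission cycle for the cfg above.
# Guarded: on a node without crab it only prints a notice and exits cleanly.
if command -v crab >/dev/null 2>&1; then
    crab -create -cfg crab.cfg     # create the jobs from the cfg
    crab -submit                   # submit them to the scheduler (glite)
    crab -status                   # poll until the jobs are Done
    crab -getoutput                # retrieve the output_file list into the UI dir
else
    echo "crab not found: run this from a CRAB-enabled environment"
fi
```

The retrieved logs end up under the ui_working_dir named in the cfg, ready for the analysis steps below.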

If you want to script the configuration, use the crab.template file together with the generation script, passing the site, software release, events per job, and dataset as arguments, e.g.: T3_CH_PSI CMSSW_3_7_0_pre4_Brian2nd 50000 /RelValProdTTbar/JobRobot-MC_3XY_V24_JobRobot-v1/GEN-SIM-DIGI-RECO
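The template mechanism can be sketched as follows. Note that this is an illustrative assumption, not the actual script from CVS: the placeholder names (@EVENTS@, @DATASET@) and the datasetpath field are invented here for the example, and the real crab.template has more fields.

```shell
#!/bin/sh
# Hypothetical sketch of the template substitution; @EVENTS@/@DATASET@ are
# assumed placeholder names, not necessarily those used by the CVS script.

# a minimal crab.template with the fields that change per test
cat > crab.template <<'EOF'
jobtype = cmssw
scheduler = glite
events_per_job = @EVENTS@
datasetpath = @DATASET@
se_black_list = T0,T1
EOF

# example arguments, matching the invocation shown above
EVENTS=50000
DATASET=/RelValProdTTbar/JobRobot-MC_3XY_V24_JobRobot-v1/GEN-SIM-DIGI-RECO

# substitute the placeholders to produce the per-test crab.cfg
sed -e "s|@EVENTS@|${EVENTS}|" -e "s|@DATASET@|${DATASET}|" \
    crab.template > crab.cfg
grep "events_per_job" crab.cfg
```

The | delimiter in sed avoids clashing with the slashes in the dataset path.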

TODO: update the script on CVS


If you want to test single jobs (e.g. from a dedicated WN), then CRAB is not an option; use the script instead, e.g.:

./ JPE

Please check the EVENTS and SW variables:

$ cat



DIR=Site.T3_CH_PSI-Cfg.${CFG}-Dataset.RelValProdTTbarJobRobotMC_3XY_V24_JobRobotv1-EventsJob.${EVENTS}-Sw.${SW}-Date.`date +%Y%m%d%H%M`-Label.SingleJob
mkdir $DIR

#eval `scram ru -sh`

vmstat -nd 10 &> ${DIR}/${LOG}_vmstat_1.log  &
./ ${DIR}/${LOG}_net_1.log &
sleep 60
( /usr/bin/time cmsRun -j ${DIR}/${LOG}_1.xml ${CFG}.py ) &> ${DIR}/${LOG}_1.stdout
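The fragment above relies on CFG, EVENTS, SW and LOG being set earlier in the script, which was lost in the page capture. The following is a hedged reconstruction of that variable block, with values taken from the examples elsewhere on this page rather than from the CVS script:

```shell
#!/bin/sh
# Hedged reconstruction of the variable header the fragment above relies on;
# values are taken from this page's examples, not from the actual CVS script.
CFG=JPE                      # config label, as in the "JPE" invocation above
EVENTS=50000                 # events per job -- check before running
SW=CMSSW_3_7_0               # software release -- check before running
LOG=cmssw                    # prefix for the vmstat/net/xml/stdout logs

# the results directory name encodes all test parameters plus a timestamp
DIR=Site.T3_CH_PSI-Cfg.${CFG}-Dataset.RelValProdTTbarJobRobotMC_3XY_V24_JobRobotv1-EventsJob.${EVENTS}-Sw.${SW}-Date.`date +%Y%m%d%H%M`-Label.SingleJob
mkdir -p $DIR
echo $DIR
```

The directory name is what the analysis scripts below parse to recover the site, cfg, events and software release, so keep the Key.Value-Key.Value pattern intact.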

Analyzing the results

First of all, download the proper scripts:

cvs co -d PerfToolKit UserCode/leo/Utilities/PerfToolkit/
cvs co -d PerfToolKit/plugins UserCode/leo/Utilities/PerfToolkit/plugins


The first step is to create the ROOT files containing the needed information:

$ python PerfToolKit/ --type=CMSSWCRAB Site.T2_CH_CSCS-Cfg.JPE-Dataset.RelValProdTTbarJobRobotMC_3XY_V24_JobRobotv1-EventsJob.50000-Sw.CMSSW_3_7_0-Date.201006061948

--type identifies the workflow you've used. It can take the values:

  • CRAB: a CMSSW job sent through CRAB
  • CMSSW: a CMSSW job executed stand alone
  • CMSSWCRAB: a stand-alone CMSSW job executed (through a script) with CRAB

Then, we need to create the tables and the graphs. The plotting script takes the following arguments:

  • The list of ROOT files (space-separated) containing the information; * and ? wildcards are supported
  • --save-png: Saves created histos in PNG format
  • --save-root: Saves created histos in a ROOT file. If enabled, these histos will not be drawn on screen
  • --no-auto-bin: Disables automatic histo binning
  • --binwidth-time=BINWIDTHTIME: Bin width of time histos, in seconds
  • --no-plots: Does not draw plots, only outputs the summary tables
  • --label=LABEL: Label to be used in naming plots, etc.
  • --mode=MODE: Preconfigured modes for analysis: SiteMon, SiteMonExt, SiteCfrExt, Default (the default). This drives which quantities are examined and the output style

For example, to perform a Site monitoring during time:

$ python PerfToolKit/ --mode=SiteMonExt *CSCS*.root

The behaviour of the different "modes" can be configured in the setCPTMode(mode) function defined in the script. Warning: some histograms may not be plotted when the contained values are too small (e.g. user time ~10 secs); you can try setting a more fine-grained bin width, e.g. --binwidth-time=5

-- LeonardoSala - 17-May-2010

Topic revision: r4 - 2010-06-21 - unknown