Datasets on hltperf-action-x5650

| Description | Path | Stats |
| NB data from the farm nodes (BWdiv stream) | /localdisk/bw_division/run164440/* | ~5M events, ~1200 files, 16 GB total |
| pNe data from Patrick Robbe | /localdisk/hlt1/data2015_pNe/* | 90760 events |
| pp data from Mika, BW division stream + 0x00A3 (Physics1200) filtered | /localdisk/bw_division/run164440_L0Filtered_0x00A3_Mika/* | 30243 events, 6 files, 1.6 GB total |
| pp data from Mika, BW division stream + 0x00A2 (Physics1600) filtered | /localdisk/bw_division/run164440_L0Filtered_0x00A2_Mika/* | 21324 events, 6 files, 1.1 GB total |
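Before running over one of these samples, it can be worth a quick sanity check directly on the node (standard shell commands; the path shown is the BWdiv entry from the table above):
ls /localdisk/bw_division/run164440/ | wc -l   # number of input files
du -sh /localdisk/bw_division/run164440/       # total size on disk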
In some cases, the script and log file used to produce a dataset are kept in the same directory as the data.

In the instructions below, the input data is specified (by a wildcard path) in scripts/HltRateTests/runHlt_parallel_on_test_node.py
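A quick way to confirm which dataset the launcher currently points at (this assumes the wildcard path appears literally in the script):
grep -n localdisk scripts/HltRateTests/runHlt_parallel_on_test_node.py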

Step 1: MooreOnline setup on plus

You can also use lb-dev if you prefer. Adjust the package list as needed.
SetupMooreOnline v24r1 --build-env                                                                                    
getpack -p anonymous -s Hlt/Hlt2CommissioningScripts head                                                             
getpack -p anonymous -s Hlt/HltSettings head                                                                          
getpack -p anonymous -s Hlt/Hlt2Lines head                                                                            
getpack -p anonymous -s Hlt/Hlt1Lines head                                                                            
cmt br make -j 8                                                                                                      
cd Hlt/Hlt2CommissioningScripts                                                                                       
SetupMooreOnline v24r1                                                                                                
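After the build, the checked-out packages live under $User_release_area/MooreOnline_v24r1 (the path referenced in Step 2); a quick check that everything is in place:
ls $User_release_area/MooreOnline_v24r1/Hlt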

Step 2: run on hltperf-action-x5650

Suggestion: you may want to do this step within screen, since with 10k events per Moore process it takes about 30 minutes (a minimal screen recipe is sketched after the environment setup below).
cd to $User_release_area/MooreOnline_v24r1/Hlt/Hlt2CommissioningScripts. To get the correct environment:
source /cvmfs/lhcb.cern.ch/group_login.sh
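If you are not familiar with screen, a minimal session for this purpose (the session name "hltrates" is arbitrary):
screen -S hltrates   # start a named session and run the commands below inside it
# detach with Ctrl-a d; reattach later with:
screen -r hltrates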
You should make sure that scripts/HltRateTests/RunMooreFromSettings.py has the correct ThresholdSettings that you want to use. The following script will launch one Moore process per input file (defined by a wildcard in the script). For each input file, it will auto-generate:
  • An output directory (output<job-number>).
  • A .py options file pointing to the specific data file.
python scripts/HltRateTests/runHlt_parallel_on_test_node.py
These will run in the background. You can follow the progress of a given job with, e.g.,
cat output1/stdout
Each output directory should contain "stdout", "stderr", "hists.root", and "tuples.root".
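To check how many jobs have finished cleanly, you can count the standard Gaudi finalization message in the logs (a sketch; it assumes the usual "Application Manager Finalized successfully" line is printed on success):
grep -l "Application Manager Finalized successfully" output*/stdout | wc -l   # jobs finished
ls -d output*/ | wc -l                                                        # jobs launched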

Step 3: prepare the output

The scripts in the next step expect a list of the "stdout" files in a file called "logs":
ls output*/stdout > logs
and a merged ROOT file called "merged.root":
hadd -f -k merged.root output*/*.root
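A quick check that both inputs for the analysis step are in place (merged.root should open without error messages):
wc -l logs                  # number of collected log files
root -l -b -q merged.root   # open the merged file in batch mode and exit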

Step 4: analysis

mkdir webdir
./scripts/HltRateTests/MakeHTML_FromLogs.py logs webdir
./scripts/HltRateTests/StudyHLT1Tuple.py
./scripts/HltRateTests/StudyHLT2Tuple.py
./scripts/HltRateTests/MakeHTML_Hlt1Rates.py webdir/
./scripts/HltRateTests/MakeHTML_Hlt2Rates.py webdir/
./scripts/HltRateTests/MakeHTML_RatesByStream.py  webdir/
./scripts/HltRateTests/MakeHTML_RateMatrix.py  webdir/
Then just copy the webdir directory somewhere web-accessible.
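For example, to publish it to a personal web area (the destination path below is purely illustrative; substitute your own web space):
rsync -a webdir/ /afs/cern.ch/user/u/username/www/hltrates/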

-- MikaVesterinen - 2015-10-01
