Printers@CERN: click "Display and install printers"; nearby printers are displayed automatically. Click on the printer you want and follow the instructions.


grid-proxy-init -debug -verify

to verify that your certificate is valid

voms-proxy-init -voms cms

to check that you are a member of the CMS organization

Then, to set up CRAB, run: source ~/.bashrc

then export X509_USER_PROXY=`voms-proxy-info -path`

Then run cmsenv to be able to run combine.

then create the cfg via

python ~/CMSSW_7_1_5/src/HiggsAnalysis/CombinedLimit/test/ tH_total_BR24.txt 0.5 5.0 -n 20 -T 500 -r --lsf -o TotalBR24

Then open myOutPutName.cfg, set queue=2nw, and set the number of jobs and the number of events both equal to 10.
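For reference, the relevant lines of the cfg might look like the sketch below. The section and parameter names follow the usual CRAB 2 crab.cfg layout, but they are written from memory here, so double-check them against the file the script actually generated:

```ini
[CRAB]
jobtype = cmssw
scheduler = lsf

[LSF]
queue = 2nw

[CMSSW]
number_of_jobs = 10
total_number_of_events = 10
```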

Using CRAB 2

To create and submit the jobs:

crab -cfg **.cfg -create -submit

then to check the status

crab -status -c

You can also check the status via the dashboard link returned by the previous command, or with the bjobs command (faster). If the job crashed, or you simply no longer want it, then

crab -kill all -c

To retrieve the output:

crab -getoutput -c

The log files are in directory_name/res/CMSSW_XXX.stdout, where XXX is the job number.

Using LSF (Load Sharing Facility)

Always run a job locally first, to make sure it works and to estimate the time needed.

bsub submits a job, i.e. a shell script that says what needs to be done.
bqueues gives data about the usage of the available queues; the 2nw queue is for jobs running up to 48 hours.
bjobs followed by the job number checks the status of a job.
bpeek followed by the job number gets the printed output of a running job.

Example usage: bsub -q 2nw <script.sh>. LSF creates a local directory; the <LSF_jobnumber/STDOUT> file gives the printed output.
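As a concrete illustration of the "run locally first" advice, here is a generic sketch. The job script body is a placeholder; a real script would set up the environment (cmsenv) and run combine:

```shell
# Create a toy job script; replace the echoes with the real workload.
cat > myjob.sh <<'EOF'
#!/bin/bash
echo "job started"
# ... cmsenv, combine, etc. would go here ...
echo "job finished"
EOF
chmod +x myjob.sh

# Step 1: always run it locally to check that it works and to time it.
./myjob.sh

# Step 2: only then submit it to the 2nw queue (on lxplus, where LSF is available):
# bsub -q 2nw < myjob.sh
```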

Running limits with combine

See the combine webpage for detailed explanations and naming conventions.

combine -M Asymptotic -S 0 (no systematics)

combine -M Asymptotic -S 1 (with systematics)

combine -M HybridNew Datacard.txt --expectedFromGrid=0.5 --saveToys --fullBToys --saveHybridResult -T 2000 --frequentist

combine -M HybridNew Datacard.txt --expectedFromGrid=0.5 --saveToys --fullBToys --saveHybridResult -T 1000 -i 10 -s13 --freq --fork 8

(options cookbook: -S (0,1) turns systematics on/off; --expectedFromGrid=X.X sets the median or the +-1/2 sigma bands; -T sets the number of toys; --fork sets the number of CPUs; -s sets the random seed; -i sets the number of iterations; --saveHybridResult and --saveToys are needed to compute the +-1/2 sigma bands)

To add separate datacards (i.e. to perform combinations of separate channels):

combineCards.py ch1=channel1.txt ch2=channel2.txt > combinedCard.txt

If the job takes too long, you can create the cfg via

python ~/CMSSW715NEW/src/HiggsAnalysis/CombinedLimit/test/ Datacard.txt 1.0 12.0 -n 20 -T 500 -r --lsf -o myOutPutName

and then run combine remotely, retrieve the output, and merge the several output ROOT files into a single .root with hadd (e.g. hadd merged.root output_*.root).

for i in 0.025 0.16 0.5 0.84 0.975; do echo '== Running with r = '$i '=='; echo ' ' ; combine <my_datacard.txt> -M HybridNew --freq --grid=<output_file.root> --expectedFromGrid $i; echo ' '; echo ' ' ; done | grep '\(Running with r\|95%\)'

To check the post-fit nuisance parameters, run

combine -M MaxLikelihoodFit -t -1 fullCombinedCard.txt --saveNLL --saveNormalizations --saveWithUncertainties

where the option -t -1 says: don't look at the data yet. You will get a file called mlfit.root, on which you can run the following script:

python ~/CMSSW715NEW/src/HiggsAnalysis/CombinedLimit/test/ mlfit.root

and then finally check the variations on the yields

python ~/CMSSW715NEW/src/HiggsAnalysis/CombinedLimit/test/ mlfit.root

Finally, running either -M Asymptotic without the -t -1 option, or -M HybridNew without the --expectedFromGrid option, will give you the observed result, while adding the --significance option will compute the significance.

Working on lxplus remotely

learn how to use screen, or use

nohup comando > logfile &
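For example (a generic sketch; substitute your real command for the placeholder):

```shell
# Run a command immune to hangups; all its output goes to the log file.
nohup bash -c 'echo step 1; sleep 1; echo all done' > logfile 2>&1 &

# You can now log out; the job keeps running. Later, inspect the log:
wait $!          # (only for this demo: wait for the background job to finish)
cat logfile
```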

-- FabrizioMargaroli - 2014-11-27

Topic revision: r19 - 2015-06-18 - FabrizioMargaroli