Grid stuff

Instructions when obtaining a GRID certificate

If your grid certificate is called cert.p12, create a directory .globus in your $HOME and perform the following operations:

Conversion to .pem:

openssl pkcs12 -in cert.p12 -clcerts -nokeys -out usercert.pem
openssl pkcs12 -in cert.p12 -nocerts -out userkey.pem

You will be asked for your password three times: Enter Import Password, Enter PEM pass phrase, Verifying PEM pass phrase.

Change the permissions:

ls -l ~/.globus
chmod 0400 ~/.globus/userkey.pem
chmod 0444 ~/.globus/usercert.pem

Verify the content:

openssl x509 -text -noout -in ~/.globus/usercert.pem
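The conversion and permission steps above can also be scripted. A minimal sketch (stdlib only; the paths are illustrative) that builds the two openssl command lines separately, so they can be inspected or dry-run without a real cert.p12:

```python
import os
import subprocess

def pem_conversion_commands(p12_path, globus_dir):
    """Build the two openssl commands that split a .p12 grid certificate
    into usercert.pem (certificate only) and userkey.pem (key only)."""
    return [
        # certificate only, no private key
        ["openssl", "pkcs12", "-in", p12_path, "-clcerts", "-nokeys",
         "-out", os.path.join(globus_dir, "usercert.pem")],
        # private key only, no certificates
        ["openssl", "pkcs12", "-in", p12_path, "-nocerts",
         "-out", os.path.join(globus_dir, "userkey.pem")],
    ]

def convert_p12(p12_path, globus_dir):
    """Run the conversion and set the permissions used above.
    Each openssl call prompts for the import password / PEM pass phrase."""
    for cmd in pem_conversion_commands(p12_path, globus_dir):
        subprocess.run(cmd, check=True)
    os.chmod(os.path.join(globus_dir, "userkey.pem"), 0o400)
    os.chmod(os.path.join(globus_dir, "usercert.pem"), 0o444)
```

Splitting command construction from execution keeps the openssl invocations testable without a certificate file.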

Instructions when renewing a GRID certificate

Save your grid certificate from the browser (for example GRID_Certificate.p12) in the ~/.globus directory.

If you use Firefox to renew your certificate, save it as follows:


Firefox -> Preferences -> Advanced -> View Certificates -> Backup. You will be asked for a password to save the certificate; I chose the same password that I use for the grid certificate.

Once you have copied your certificate to your .globus directory on lxplus, first remove the previous keys and certificates from this directory, and then split it into private and public keys and give the correct access rights to your keys:


openssl pkcs12 -in GRID_Certificate.p12 -clcerts -nokeys -out usercert.pem
openssl pkcs12 -in GRID_Certificate.p12 -nocerts -out userkey.pem

chmod 400 ~/.globus/userkey.pem
chmod 600 ~/.globus/usercert.pem
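A quick way to verify the access rights afterwards is a small stdlib-only Python check (a sketch; note that this wiki uses 0400/0444 in the first section and 400/600 here, so the expected modes are passed in):

```python
import os
import stat

def check_globus_permissions(globus_dir, expected=None):
    """Check that the grid key/cert files carry the expected permission bits.

    Returns a dict mapping filename -> (actual_mode, ok).
    `expected` maps filenames to the octal modes set with chmod above.
    """
    if expected is None:
        expected = {"userkey.pem": 0o400, "usercert.pem": 0o600}
    results = {}
    for name, want in expected.items():
        path = os.path.join(globus_dir, name)
        mode = stat.S_IMODE(os.stat(path).st_mode)  # permission bits only
        results[name] = (mode, mode == want)
    return results
```

For example, `check_globus_permissions(os.path.expanduser("~/.globus"))` flags any file whose mode differs from what grid tools require.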

HBSM stuff

Papers and notes

General (HSG2) Higgs stuff

HSG2 High mass general

Note

Signal shapes

Workspaces

RooStat

Issues with NWA validity

Misc links, talks etc

HSG2 High mass stat tools

NLL scan

CL limits

  • Use /afs/cern.ch/work/m/mdanohof/HSG2/CL/H4lHighMassLimits
  • After compiling (using the same ROOT as for the signal shapes etc), run it using run_test.sh (change the path to the ws or copy it locally).
  • The outputs will be in the "root-files" directory (you need to create this directory by hand before running), as a ROOT file with a histogram containing the expected and observed limits and the +/- 1 and 2 sigma bands.
  • Look at drawLimitPlot.py for an example of how to make the limit plot from these outputs.

Pulls and rankings of systematics

  • root -b -q macros/runPulls.C+\(\"500NWA_Asimov.root\",\"mu_ggF\",\"combined\",\"ModelConfig\",\"combData\",\"/afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood/Workspaces/Workspaces2l2v2q/WithAsimov/\",NULL,0.005,0,\"DEBUG\"\)

HSG2 High mass combination

4l workspaces

  • Add the Asimov dataset to the single workspace: /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood AddAsimov.sh DefaultAsimov.py
    • Output: one workspace in Workspaces/Workspaces4l/WithAsimov
    • Set mu_ggF and mu_VBFVH equal to the same value

llvv workspaces

  • Add the Asimov dataset to both the ggF and VBF workspaces: /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood AddAsimov.sh DefaultAsimov.py
    • Output: two workspaces in Workspaces/Workspaces2l2v/WithAsimov
  • Merge ggF and VBF workspaces into one: /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/CombinationTool Merge2l2v.sh DefaultCombine.py (DefaultCombine should be changed)
    • Output: one workspace, continue with this one
  • In the final workspaces, ggF and VBF are merged into one workspace

llqq+vvqq workspaces

  • First combine ggF and VBF here: /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood/Workspaces/Workspaces2l2q
    • 1) add the Asimov dataset to both the ggF and VBF workspaces, using /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood/AddAsimov.sh with DefaultAsimov2l2q.py
    • 2) merge the ggF and VBF workspaces with /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/CombinationTool/Merge2l2q.py

Workspaces

4l: /afs/cern.ch/atlas/groups/HSG2/H4l_2013/Autumn/Workspaces/HighMass/v8/MCFM_withH125/prunedSyst_fixList

2l2q+2l2v: /afs/cern.ch/user/l/lezhang/public/HighMassHZZ/WSforCombination_20141127v1_mergedggfvbf

Branching ratios

  • Br(H->ZZ) (at mH = 125.5 GeV): 2.76e-2
  • Br(Z->ll) = 0.03363 per charged-lepton flavour (l = e, mu)
  • Br(Z->vv) = 0.067 per neutrino flavour (v = ve, vmu, vtau)
  • Br(Z->uu) = Br(Z->cc) = 0.116
  • Br(Z->dd) = Br(Z->ss) = Br(Z->bb) = 0.156
  • H->ZZ->4l: Br(H->ZZ) * [Br(ZZ->2e2mu) + Br(ZZ->2mu2e) + Br(ZZ->4e) + Br(ZZ->4mu)] = 2.76e-2 * [ 0.03363 *0.03363 + 0.03363*0.03363 + 0.03363*0.03363 + 0.03363*0.03363] = 1.25e-4
  • H->ZZ->2l2v: Br(H->ZZ) * [Br(ZZ->2l2v) + Br(ZZ->2v2l)] = 2.76e-2 * [2 * 6 * 0.03363 * 0.067] = 7.46e-4 (here Br(ZZ->2l2v) covers the 6 flavour combinations 2e2ve + 2e2vmu + 2e2vtau + 2mu2ve + 2mu2vmu + 2mu2vtau; the factor 2 adds the opposite ordering, Br(ZZ->2v2l))
  • H->ZZ->2l2v + H->ZZ->4l: Br(H->ZZ) * [Br(ZZ->4l) + Br(ZZ->2l2v) + Br(ZZ->2v2l)] = Br(H->ZZ) * [2e2mu + 2mu2e + 4e + 4mu + 2 * (2e2ve + 2e2vmu + 2e2vtau + 2mu2ve + 2mu2vmu + 2mu2vtau)] = 8.7e-4
  • H->ZZ->2l2q: Br(H->ZZ) * [Br(ZZ->2l2q) + Br(ZZ->2q2l) ] = Br(H->ZZ) * [2*(2e2u+2e2d+2e2s+2e2b+2e2c)+2*(2mu2u+2mu2d+2mu2s+2mu2b+2mu2c)] = Br(H->ZZ) * [9.82e-2] = 2.7e-3
  • H->ZZ->2v2q: Br(H->ZZ) * [Br(ZZ->vvqq)+Br(ZZ->qqvv)] = Br(H->ZZ) * [ 2*(2ve2u+2ve2d+2ve2s+2ve2b+2ve2c)+2*( 2vmu2u+2vmu2d+2vmu2s+2vmu2b+2vmu2c)+2*(2vtau2u+2vtau2d+2vtau2s+2vtau2b+2vtau2c)] = Br(H->ZZ) * [0.281] = 7.76e-3
  • H->ZZ->2l2q+H->ZZ->2v2q: Br(H->ZZ) * [ Br(ZZ->2l2q) + Br(ZZ->2v2q) ] = Br(H->ZZ) * [9.82e-2+ 0.281] = Br(H->ZZ) * [0.379] = 1.04e-2

Workspace structure

  • The likelihood in each workspace is saved in simPdf. For example, do combined->pdf("simPdf")->Print(): RooSimultaneous::simPdf[ indexCat=channelCat VBFCat_2012=model_VBFCat_2012 VHCat_2012=model_VHCat_2012 ggF_2e2muCat_2012=model_ggF_2e2muCat_2012 ggF_2mu2eCat_2012=model_ggF_2mu2eCat_2012 ggF_4eCat_2012=model_ggF_4eCat_2012 ggF_4muCat_2012=model_ggF_4muCat_2012 ] (the interleaved RooStarMorphPdf::getCache INFO message is harmless log output). This workspace contains 6 likelihoods (named model_*), one for each of the 6 categories: VBF, VH, ggF 2e2mu, ggF 2mu2e, ggF 4e, ggF 4mu.
  • Look into each likelihood, for example do combined->pdf("model_ggF_4muCat_2012")->Print(): RooProdPdf::model_ggF_4muCat_2012[ alpha_ATLAS_LUMI_2012Constraint * alpha_ATLAS_JES_2012_Statistical2Constraint * alpha_ATLAS_JES_2012_Statistical3Constraint * alpha_ATLAS_JES_2012_Modelling1Constraint * alpha_ATLAS_JES_2012_Modelling2Constraint * alpha_ATLAS_JES_2012_Detector1Constraint * alpha_ATLAS_JES_2012_Eta_StatMethodConstraint * alpha_ATLAS_JES_Eta_ModellingConstraint * alpha_ATLAS_JES_NPVConstraint * alpha_ATLAS_JES_MuConstraint * alpha_ATLAS_JES_FlavComp_llll_BGConstraint * alpha_ATLAS_JES_FlavRespConstraint * alpha_ATLAS_JES_2012_PilePtConstraint * alpha_ATLAS_JES_2012_PileRho_llll_BGConstraint * alpha_ATLAS_UEConstraint * alpha_ATLAS_EM_mRes_MAT_CALOConstraint * alpha_ATLAS_EM_mRes_MAT_CRYOConstraint * alpha_ATLAS_EM_mRes_MAT_GAPConstraint * alpha_ATLAS_ggHZZllll_Acc_pdfConstraint * alpha_ATLAS_ggHZZllll_Acc_QCDscaleConstraint * alpha_ATLAS_ggHZZllll_Acc_showerConstraint * alpha_ATLAS_VBFHZZllll_Acc_pdfConstraint * alpha_ATLAS_VBFHZZllll_Acc_QCDscaleConstraint * alpha_ATLAS_VBFHZZllll_Acc_showerConstraint * alpha_ATLAS_MU_2012_TRIGConstraint * alpha_ATLAS_MU_EFFConstraint * alpha_ATLAS_MU_MS_RES_IDConstraint * alpha_ATLAS_MU_MS_RES_MSConstraint * alpha_QCDscale_VVConstraint * alpha_pdf_qqConstraint * alpha_QCDscale_ggVVConstraint * alpha_pdf_ggConstraint * alpha_ATLAS_JES_2012_Statistical1Constraint * alpha_ATLAS_JES_Eta_StatMethodConstraint * alpha_ATLAS_norm_SF_H4l_Zbb_llmumu_2012Constraint * modelunc_ATLAS_H_ggF_4muCat_2012 ]. All the terms called alpha_* are the NP constraint terms (multiplied together). The modelunc_* term is the actual likelihood, eq. (1) in the 4l high-mass paper. So this pdf is the full likelihood: eq. (1) times the NP constraints.
  • Each NP consists of: combined->pdf("alpha_ATLAS_LUMI_2012Constraint")->Print(): RooGaussian::alpha_ATLAS_LUMI_2012Constraint[ x=alpha_ATLAS_LUMI_2012 mean=nom_alpha_ATLAS_LUMI_2012 sigma=sigma ] = 1. A Gaussian constraint (alpha_*), a mean value (nom_*) and a width (sigma, which is always 1).
  • modelunc_* consists of the same terms as eq. (1). combined->pdf("modelunc_ATLAS_H_ggF_4muCat_2012")->Print(): RooAddPdf::modelunc_ATLAS_H_ggF_4muCat_2012[ nTotATLAS_Signal_gg_H_ggF_4muCat_2012 * ATLAS_Signal_gg_H_ggF_4muCat_2012_m4l_shapeSys + nTotATLAS_Signal_VBF_H_ggF_4muCat_2012 * ATLAS_Signal_VBF_H_ggF_4muCat_2012_m4l_shapeSys + nTotATLAS_Bkg_qqZZ_ggF_4muCat_2012 * ATLAS_Bkg_qqZZ_ggF_4muCat_2012_m4l_withSys + nTotATLAS_Bkg_ggZZ_ggF_4muCat_2012 * ATLAS_Bkg_ggZZ_ggF_4muCat_2012_m4l_withSys + nTotATLAS_Bkg_reducible_llmumu_ggF_4muCat_2012 * ATLAS_Bkg_reducible_llmumu_m4l_nominal ]. That is, normalisation terms (nTotATLAS*) and pdfs (ATLAS_Signal/Bkg*).
  • The normalisation terms are in turn built from: combined->obj("nTotATLAS_Signal_gg_H_ggF_4muCat_2012")->Print(): RooProduct::nTotATLAS_Signal_gg_H_ggF_4muCat_2012[ nATLAS_Signal_gg_H_ggF_4muCat_2012 * mu * mu_ggF * fiv_ATLAS_Signal_gg_H_ggF_4muCat_2012 ]. Here fiv_* is the combined effect of the NPs. That combination is built with a FlexibleInterpVar: combined->obj("fiv_ATLAS_Signal_gg_H_ggF_4muCat_2012")->Print(): RooStats::HistFactory::FlexibleInterpVar::fiv_ATLAS_Signal_gg_H_ggF_4muCat_2012[ paramList=(alpha_ATLAS_JES_2012_Detector1, alpha_ATLAS_JES_2012_Eta_StatMethod, alpha_ATLAS_JES_2012_Modelling1, alpha_ATLAS_JES_2012_Modelling2, alpha_ATLAS_JES_2012_PilePt, alpha_ATLAS_JES_2012_PileRho_llll_BG, alpha_ATLAS_JES_2012_Statistical2, alpha_ATLAS_JES_2012_Statistical3, alpha_ATLAS_JES_Eta_Modelling, alpha_ATLAS_JES_FlavComp_llll_BG, alpha_ATLAS_JES_FlavResp, alpha_ATLAS_JES_Mu, alpha_ATLAS_JES_NPV, alpha_ATLAS_LUMI_2012, alpha_ATLAS_MU_2012_TRIG, alpha_ATLAS_MU_EFF, alpha_ATLAS_MU_MS_RES_ID, alpha_ATLAS_MU_MS_RES_MS, alpha_ATLAS_ggHZZllll_Acc_QCDscale, alpha_ATLAS_ggHZZllll_Acc_pdf, alpha_ATLAS_ggHZZllll_Acc_shower) ].

HSG7 combination tool

HSG2 MC requests

| Generator | Process | mH | Events | DS ID | Date of request | JIRA | Comments |
| Sherpa | ggllvv SBI | | 100k | 206301 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllvv B | | 100k | 206302 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllvv S | | 100k | 206302 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllvv S | 440 | 100k | 206304 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllll SBI | | 100k | 206305 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllll B | | 100k | 206306 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllll S | | 100k | 206307 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Sherpa | ggllll S | 440 | 100k | 206308 | 15/10/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-686 | width |
| Powheg | llvv | 200 | 20M | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 220 | 5M | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 240 | 1M | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 260 | 500k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 280 | 200k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 300 | 200k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 320 | 100k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 340 | 100k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 360 | 100k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |
| Powheg | llvv | 380 | 100k | | 3/12/2014 | https://its.cern.ch/jira/browse/ATLMCPROD-888 | |

Old tickets

Compositeness

My talks

Notes uploading to arXiv

leer.eps --> leer.pdf, make sure all figures are eps-converted-to.pdf
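Before uploading, it is easy to miss a figure that was never converted. A small stdlib-only helper (hypothetical; the naming convention `<name>-eps-converted-to.pdf` is the one epstopdf produces) that lists .eps figures without a matching converted PDF:

```python
import os

def unconverted_eps(fig_dir):
    """Return the .eps files in fig_dir that have no matching
    <name>-eps-converted-to.pdf alongside them."""
    missing = []
    for fname in sorted(os.listdir(fig_dir)):
        base, ext = os.path.splitext(fname)
        if ext.lower() != ".eps":
            continue
        pdf = base + "-eps-converted-to.pdf"
        if not os.path.exists(os.path.join(fig_dir, pdf)):
            missing.append(fname)
    return missing
```

Run it on the figures directory; an empty list means every .eps has its converted PDF.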

Notes likelihood scan

Add Asimov data to a workspace:

import ROOT
rf = ROOT.TFile("workspace.root")
ws = rf.Get("combined")
mc = ws.obj("ModelConfig")
asimov = ROOT.RooStats.AsymptoticCalculator.GenerateAsimovData(mc.GetPdf(), mc.GetObservables())
asimov.SetName("asimov_4l")
getattr(ws, "import")(asimov)  # "import" is a Python keyword, hence getattr
ws.writeToFile("workspace_withAsimov.root")

BatchProfileLikelihood.py -i /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/CombinationTool/workspaces/work_v1_800_ggF_NWA.root -j 1 -c 0 --wsName=w_800 --dataFile=/afs/cern.ch/work/m/mdanohof/HSG2/HighMass/CombinationTool/workspaces/work_v1_800_ggF_NWA.root --dataWorkspace=w_800 --dataName=Data --overwritePOI=mu=0.8 --overwriteBins=mu=2 --overwriteRange=mu=[0.8:1.2] | tee /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood/Scans/test.txt

/afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood/BatchProfileLikelihood.py -i /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/CombinationTool/workspace_withAsimov_4l.root -j 1 -c 0 --dataName=asimov_4l --overwritePOI=mu_ggF=0.8 --overwriteBins=mu_ggF=3,mH=800 --overwriteRange=mu_ggF=[0.8:1.2] | tee /afs/cern.ch/work/m/mdanohof/HSG2/HighMass/Limits/HZZllll/BatchProfileLikelihood/Scans/test

Misc hardware

Misc repos etc.

export SVNGRP=svn+ssh://svn.cern.ch/reps/atlasgroups
export SVNOFF=svn+ssh://svn.cern.ch/reps/atlasoff
export SVNPHYS=svn+ssh://svn.cern.ch/reps/atlasphys
export SVNINST=svn+ssh://svn.cern.ch/reps/atlasinst

https://svnweb.cern.ch/cern/wsvn/atlas-mdanohof/mdanohof

https://atlas-svnadmin.cern.ch/

ATLAS note preamble: https://twiki.cern.ch/twiki/bin/view/AtlasProtected/PubComTemplates#ATLAS_LaTeX_package_for_all_ATLA

export SVNUSR=svn+ssh://svn.cern.ch/reps/atlas-oabouzei

Topic revision: r74 - 2017-05-15 - MassimoDellaPietra
 