-- HelsensClement - 14-Sep-2011

Histogrammer

Introduction

The Histogrammer is a package originally designed for top analyses, but it can be extended to other analyses. Its purpose is to produce histograms of relevant quantities for a given set of event selections and systematic variations. The complete list of histograms is:

Full list of Histogrammer electron histograms:

  • TH1F* m_hpresel_nb_electron;
  • TH1F* m_hpresel_electron_eta;
  • TH1F* m_hpresel_electron_phi;
  • TH1F* m_hpresel_electron_pT;
  • TH1F* m_hpresel_electron_isMediumEM;
  • TH1F* m_hpresel_electron_isTightEM;
  • TH1F* m_hpresel_electron_ptcone20;
  • TH1F* m_hpresel_electron_ptcone30;
  • TH1F* m_hpresel_electron_etcone20;
  • TH1F* m_hpresel_electron_etcone30;
  • TH1F* m_hpresel_electron2_eta;
  • TH1F* m_hpresel_electron2_phi;
  • TH1F* m_hpresel_electron2_pT;
  • TH1F* m_hpresel_electron2_isMediumEM;
  • TH1F* m_hpresel_electron2_isTightEM;
  • TH1F* m_hpresel_electron2_ptcone20;
  • TH1F* m_hpresel_electron2_ptcone30;
  • TH1F* m_hpresel_electron2_etcone20;
  • TH1F* m_hpresel_electron2_etcone30;

Full list of Histogrammer muon histograms:

  • TH1F* m_hpresel_nb_muon;
  • TH1F* m_hpresel_muon_eta;
  • TH1F* m_hpresel_muon_phi;
  • TH1F* m_hpresel_muon_pT;
  • TH1F* m_hpresel_muon_isTight;
  • TH1F* m_hpresel_muon_isMedium;
  • TH1F* m_hpresel_muon_ptcone20;
  • TH1F* m_hpresel_muon_ptcone30;
  • TH1F* m_hpresel_muon_etcone20;
  • TH1F* m_hpresel_muon_etcone30;
  • TH1F* m_hpresel_muon_z0_vx;
  • TH1F* m_hpresel_muon_d0_vx;
  • TH1F* m_hpresel_muon2_eta;
  • TH1F* m_hpresel_muon2_phi;
  • TH1F* m_hpresel_muon2_pT;
  • TH1F* m_hpresel_muon2_isTight;
  • TH1F* m_hpresel_muon2_isMedium;
  • TH1F* m_hpresel_muon2_ptcone20;
  • TH1F* m_hpresel_muon2_ptcone30;
  • TH1F* m_hpresel_muon2_etcone20;
  • TH1F* m_hpresel_muon2_etcone30;
  • TH1F* m_hpresel_muon2_z0_vx;
  • TH1F* m_hpresel_muon2_d0_vx;

Full list of Histogrammer jets histograms:

  • TH1F* m_hpresel_jets_in_event;
  • TH1F* m_hpresel_jets25GeV_in_event;
  • TH1F* m_hpresel_jets30GeV_in_event;
  • TH1F* m_hpresel_jet_eta_1;
  • TH1F* m_hpresel_jet_eta_2;
  • TH1F* m_hpresel_jet_eta_3;
  • TH1F* m_hpresel_jet_eta_4;
  • TH1F* m_hpresel_jet_eta_5;
  • TH1F* m_hpresel_jet_eta_6;
  • TH1F* m_hpresel_jet_eta_7;
  • TH1F* m_hpresel_jet_eta_8;
  • TH1F* m_hpresel_jet_eta_9;
  • TH1F* m_hpresel_jet_phi_1;
  • TH1F* m_hpresel_jet_phi_2;
  • TH1F* m_hpresel_jet_phi_3;
  • TH1F* m_hpresel_jet_phi_4;
  • TH1F* m_hpresel_jet_phi_5;
  • TH1F* m_hpresel_jet_phi_6;
  • TH1F* m_hpresel_jet_phi_7;
  • TH1F* m_hpresel_jet_phi_8;
  • TH1F* m_hpresel_jet_phi_9;
  • TH1F* m_hpresel_jet_pT_1;
  • TH1F* m_hpresel_jet_pT_2;
  • TH1F* m_hpresel_jet_pT_3;
  • TH1F* m_hpresel_jet_pT_4;
  • TH1F* m_hpresel_jet_pT_5;
  • TH1F* m_hpresel_jet_pT_6;
  • TH1F* m_hpresel_jet_pT_7;
  • TH1F* m_hpresel_jet_pT_8;
  • TH1F* m_hpresel_jet_pT_9;
  • TH1F* m_hpresel_jet_mass_1;
  • TH1F* m_hpresel_jet_mass_2;
  • TH1F* m_hpresel_jet_mass_3;
  • TH1F* m_hpresel_jet_mass_4;
  • TH1F* m_hpresel_jet_mass_5;
  • TH1F* m_hpresel_jet_mass_6;
  • TH1F* m_hpresel_jet_mass_7;
  • TH1F* m_hpresel_jet_mass_8;
  • TH1F* m_hpresel_jet_mass_9;
  • TH1F* m_hpresel_jet_MinDRJetJet;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_SV1_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_SV1_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_SV1_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_SV1_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_SV1_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_SV1_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_TrackCounting2D_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_TrackCounting2D_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_TrackCounting2D_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_TrackCounting2D_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_TrackCounting2D_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_TrackCounting2D_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_JetProb_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_JetProb_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_JetProb_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_JetProb_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_JetProb_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_JetProb_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP1D_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP1D_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP1D_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP1D_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP1D_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP1D_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP2D_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP2D_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP2D_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP2D_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP2D_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP2D_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_IP3D_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV0_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV0_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV0_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV0_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV0_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV0_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV1_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV1_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV1_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV1_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV1_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV1_6;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV2_1;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV2_2;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV2_3;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV2_4;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV2_5;
  • TH1F* m_hpresel_jet_flavorTagWeight_SV2_6;
  • TH1F* m_hpresel_jet_n90constituents_1;
  • TH1F* m_hpresel_jet_n90constituents_2;
  • TH1F* m_hpresel_jet_n90constituents_3;
  • TH1F* m_hpresel_jet_n90constituents_4;
  • TH1F* m_hpresel_jet_n90constituents_5;
  • TH1F* m_hpresel_jet_n90constituents_6;
  • TH1F* m_hpresel_jet_emf_1;
  • TH1F* m_hpresel_jet_emf_2;
  • TH1F* m_hpresel_jet_emf_3;
  • TH1F* m_hpresel_jet_emf_4;
  • TH1F* m_hpresel_jet_emf_5;
  • TH1F* m_hpresel_jet_emf_6;
  • TH1F* m_hpresel_jet_IsBB;
  • TH1F* m_hpresel_jet_IsCC;
  • TH1F* m_hpresel_jet_IsC;
  • TH1F* m_hpresel_jet_IsLight;

  • TH1F* m_hDPhiJet1_presel_missingET_missET;
  • TH1F* m_hDPhiJet2_presel_missingET_missET;
  • TH1F* m_hDPhiJet3_presel_missingET_missET;
  • TH1F* m_hDPhiJet4_presel_missingET_missET;
  • TH1F* m_hDPhiJet5_presel_missingET_missET;
  • TH1F* m_hDPhiJet6_presel_missingET_missET;
  • TH1F* m_hDPhiJet7_presel_missingET_missET;
  • TH1F* m_hDPhiJet8_presel_missingET_missET;
  • TH1F* m_hDPhiJet9_presel_missingET_missET;
  • TH1F* m_hDPhiJet1_topo_missingET_missET;
  • TH1F* m_hDPhiJet2_topo_missingET_missET;
  • TH1F* m_hDPhiJet3_topo_missingET_missET;
  • TH1F* m_hDPhiJet4_topo_missingET_missET;
  • TH1F* m_hDPhiJet5_topo_missingET_missET;
  • TH1F* m_hDPhiJet6_topo_missingET_missET;
  • TH1F* m_hDPhiJet7_topo_missingET_missET;
  • TH1F* m_hDPhiJet8_topo_missingET_missET;
  • TH1F* m_hDPhiJet9_topo_missingET_missET;

Full list of Histogrammer topological histograms:

  • TH1F* m_hpresel_missingET_missET;
  • TH1F* m_hpresel_missingET_missET_x;
  • TH1F* m_hpresel_missingET_missET_y;
  • TH1F* m_hpresel_missingET_missET_phi;

  • TH1F* m_hpresel_missingETspecialElectron_missET;
  • TH1F* m_hpresel_missingETspecialMuon_missET;
  • TH1F* m_hpresel_missingETspecialElectronJet_missET;
  • TH1F* m_hpresel_missingETspecialMuonJet_missET;

  • TH1F* m_htopo_missingET_missET;
  • TH1F* m_htopo_missingET_missET_x;
  • TH1F* m_htopo_missingET_missET_y;
  • TH1F* m_htopo_missingET_missET_phi;

  • TH1F* m_hfinal_missingET_missET;
  • TH1F* m_hfinal_missingET_missET_x;
  • TH1F* m_hfinal_missingET_missET_y;
  • TH1F* m_hfinal_missingET_missET_phi;

  • TH1F* m_hDPhiElectron_presel_missingET_missET;
  • TH1F* m_hDPhiMuon_presel_missingET_missET;
  • TH1F* m_hDPhiElectron_topo_missingET_missET;
  • TH1F* m_hDPhiMuon_topo_missingET_missET;
  • TH1F* m_hDPhiElectron2_presel_missingET_missET;
  • TH1F* m_hDPhiMuon2_presel_missingET_missET;
  • TH1F* m_hDPhiElectron2_topo_missingET_missET;
  • TH1F* m_hDPhiMuon2_topo_missingET_missET;

  • TH1F* m_hhad_W_bestPDGWMass_Mass;
  • TH1F* m_hhad_W_highestDaughtersPt_Mass;
  • TH1F* m_hhad_W_highestWpT_Mass;
  • TH1F* m_hhad_W_closestDRJets_Mass;
  • TH1F* m_hhad_W_bestPDGWMasswith3_Mass;
  • TH1F* m_hhad_W_bestPDGWMassselecJets_Mass;
  • TH1F* m_hhad_W_closestDRJetswith3_Mass;
  • TH1F* m_hhad_W_closestDRJetsselecJets_Mass;
  • TH1F* m_hhad_W_highestWpTwith3_Mass;
  • TH1F* m_hhad_W_highestWpTselecJets_Mass;
  • TH1F* m_hhad_W_highesttoppTwith3_nobjet_Mass;
  • TH1F* m_hhad_W_bestPDGWMass_nobjet_Mass;

  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_PtCut30_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_PtCut30_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_PtCut40_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_PtCut40_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_PtCut50_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjet_PtCut50_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_PtCut30_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_PtCut30_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_PtCut40_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_PtCut40_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_PtCut50_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjetLooseTag_PtCut50_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjethighpT_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_bestPDGTopMass_bjethighpT_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_TopMass_highpTbjet_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_TopMass_highpTbjet_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_highpTTopMass_bjet_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_highpTTopMass_bjet_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_TopMass_highpTbjetLooseTag_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_TopMass_highpTbjetLooseTag_StabMass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_highpTTopMass_bjetLooseTag_Mass;
  • TH1F* m_hhad_top_bestPDGWMass_nobjet_highpTTopMass_bjetLooseTag_StabMass;
  • TH1F* m_hhad_top_highesttoppTwith3_MassT;
  • TH1F* m_hhad_top_closestDRJets_Mass;
  • TH1F* m_hhad_top_highesttoppT_Mass;
  • TH1F* m_hhad_top_highesttoppTselecJets_Mass;
  • TH1F* m_hhad_top_highesttoppTwith3_Mass;
  • TH1F* m_hhad_top_highesttoppTwith3_bjet_Mass;
  • TH1F* m_hhad_top_highesttoppTwith3_Mass_4LeadingJets;
  • TH1F* m_hhad_top_highesttoppTwith3_Mass_JetPtCut_20;
  • TH1F* m_hhad_top_highesttoppTwith3_Mass_JetPtCut_25;
  • TH1F* m_hhad_top_highesttoppTwith3_Mass_JetPtCut_30;
  • TH1F* m_hhad_top_highesttoppTwith3_Mass_JetPtCut_40;
  • TH1F* m_hhad_top_highesttoppTselecJets_StabMass;
  • TH1F* m_hhad_top_highesttoppTwith3_StabMass;
  • TH1F* m_hhad_top_highesttoppTselecJets_pTStabMass;
  • TH1F* m_hhad_top_highesttoppTwith3_pTStabMass;
  • TH1F* m_hhad_top_highesttoppTselecJets_pT3StabMass;
  • TH1F* m_hhad_top_highesttoppTwith3_pT3StabMass;
  • TH1F* m_hhad_top_highesttoppTwith3_bjet_StabMass;
  • TH1F* m_hhad_top_highesttoppTwith3_bjetHybrid_StabMass;
  • TH1F* m_hhad_top_closestDRJets_pT;
  • TH1F* m_hhad_top_highesttoppT_pT;
  • TH1F* m_hhad_top_highesttoppTselecJets_pT;
  • TH1F* m_hhad_top_highesttoppTwith3_pT;
  • TH1F* m_hhad_top_closestDRJets_eta;
  • TH1F* m_hhad_top_highesttoppT_eta;
  • TH1F* m_hhad_top_highesttoppTselecJets_eta;
  • TH1F* m_hhad_top_highesttoppTwith3_eta;

  • TH1F* m_hlep_top_highesttoppTwith3_Mass;
  • TH1F* m_hlep_top_highesttoppTwith3_MassT;
  • TH1F* m_hlep_top_highesttoppTmuonwith3_pT;
  • TH1F* m_hlep_top_highesttoppTmuonwith3_Mass;
  • TH1F* m_hlep_top_highesttoppTmuonwith3_MassT;
  • TH1F* m_hlep_top_highesttoppTselecJets_pT;
  • TH1F* m_hlep_top_highesttoppTselecJets_Mass;
  • TH1F* m_hlep_top_highesttoppTselecJets_MassT;
  • TH1F* m_hlep_top_highesttoppTmuonselecJets_pT;
  • TH1F* m_hlep_top_highesttoppTmuonselecJets_Mass;
  • TH1F* m_hlep_top_highesttoppTmuonselecJets_MassT;

  • TH1F* m_hpresel_electron_minDeltaRJets;
  • TH1F* m_hpresel_muon_minDeltaRJets;
  • TH1F* m_hpresel_electron2_minDeltaRJets;
  • TH1F* m_hpresel_muon2_minDeltaRJets;

  • TH1F* m_helectron_Z_Mass;
  • TH1F* m_helectron_Z_MassT;
  • TH1F* m_helectron_Z_pT;
  • TH1F* m_helectron_Z_eta;
  • TH1F* m_helectron_Z_phi;
  • TH1F* m_helectron_Z_E;
  • TH1F* m_helectron_Wlep_MassT;

  • TH1F* m_hmuon_Z_Mass;
  • TH1F* m_hmuon_Z_MassT;
  • TH1F* m_hmuon_Z_pT;
  • TH1F* m_hmuon_Z_eta;
  • TH1F* m_hmuon_Z_phi;
  • TH1F* m_hmuon_Z_E;
  • TH1F* m_hmuon_Wlep_MassT;

  • TH1F* m_hEvtVar_HT;
  • TH1F* m_hEvtVar_HTAll_ele;
  • TH1F* m_hEvtVar_HTAll_mu;
  • TH1F* m_hEvtVar_HT2_ele;
  • TH1F* m_hEvtVar_HT3_ele;
  • TH1F* m_hEvtVar_HT2_mu;
  • TH1F* m_hEvtVar_HT3_mu;
  • TH1F* m_hEvtVar_HTLep_ele;
  • TH1F* m_hEvtVar_HTLep_mu;
  • TH1F* m_hEvtVar_HTSpecialLep_mu;
  • TH1F* m_hEvtVar_HTSpecialLep_ele;
  • TH1F* m_hEvtVar_HTSpecialLep_mujet;
  • TH1F* m_hEvtVar_HTSpecialLep_elejet;

Full list of Histogrammer vertex histograms:

  • TH1F* m_hPVertex_vertexs_in_event;
  • TH1F* m_hPVertex_vertex_x;
  • TH1F* m_hPVertex_vertex_y;
  • TH1F* m_hPVertex_vertex_z;
  • TH1F* m_hPVertex_vertex_nTracks;
  • TH1F* m_hSVertex_vertexs_in_event;
  • TH1F* m_hSVertex_vertex_x;
  • TH1F* m_hSVertex_vertex_y;
  • TH1F* m_hSVertex_vertex_z;
  • TH1F* m_hSVertex_vertex_nTracks;

Full list of Histogrammer maps:

  • TH2F* m_map_DPhiEleMET_HTEle;
  • TH2F* m_map_DPhiMuMET_HTMu;
  • TH2F* m_map_WMTMu_MET;
  • TH2F* m_map_WMTEle_MET;
  • TH2F* m_map_DPhiEleMET_MET;
  • TH2F* m_map_DPhiMuMET_MET;
  • TH2F* m_map_WMTMu_HTMu;
  • TH2F* m_map_WMTEle_HTEle;
  • TH2F* m_map_highesttoppTwith3Mass_HTAllmu;
  • TH2F* m_map_highesttoppTwith3Mass_HTAllele;
  • TH2F* m_map_pt_eta_taggedjet;
  • TH2F* m_map_pt_eta_untaggedjet;

Check-out the Histogrammer

Download the Histogrammer from SVN. Please note that the Histogrammer depends on the tools and xsec package headers, so the best approach is to check out the full set of AnaTools:

  • export SVNROOT=svn+ssh://YOUR_AFS_USER_NAME_HERE@svn.cern.ch/reps/IfaeAnaRepo
  • svn co $SVNROOT/IFAEanalysis/AnaTools

Now check that your ROOT version and gcc are compatible by comparing:

  • echo $ROOTSYS
  • gcc -v

Check-out a tagged version of Histogrammer

Go to your Histogrammer directory and do:
  • export SVNROOT=svn+ssh://YOUR_AFS_USER_NAME_HERE@svn.cern.ch/reps/IfaeAnaRepo
  • svn co $SVNROOT/IFAEanalysis/AnaTools/Histogrammer/tags/Histogrammer-XX-YY-ZZ

Compile the Histogrammer

Now you can go to the Histogrammer directory and compile:

  • Paths and filenames of the input files are still hard-coded. If you run on atlasui the defaults are fine; if you run on a different machine, update the files src/Histogrammer.cxx and src/runHistogrammer.cxx
  • cd AnaTools/Histogrammer/
  • The first time in a new shell: source setup.sh
  • Afterwards (if recompilation is needed): make
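
Putting the steps above together, a first build could look like this (a sketch, assuming the atlasui defaults):

cd AnaTools/Histogrammer/
source setup.sh     # first time in a new shell
make                # afterwards, whenever recompilation is needed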

Run the Histogrammer

If the compilation went fine, you are ready to launch the Histogrammer.

  • Copy/edit the files in the directory share/FilesList/, which contain the lists of data samples to use, and in share/JobOptions, which contain the cuts/JobOptions to process.
  • IMPORTANT: the parsing of an option fails when it ends with a space character!
  • You will need to add the lib directory to LD_LIBRARY_PATH. For bash/zsh: export LD_LIBRARY_PATH=_PATH_TO_HISTOGRAMMER_HERE_/lib:$LD_LIBRARY_PATH (normally done in setup.sh)
  • To run on data (Standard JobOptions):
    • Muon ./bin/main share/FilesList/FL_DATA_MUON.txt share/JobOptions/JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData.txt
    • Electron ./bin/main share/FilesList/FL_DATA_ELE.txt share/JobOptions/JO_ELE_Data.txt share/Trigger/RunTriggerList_ElectronData.txt
  • To run on MonteCarlo (Standard JobOptions):
    • Muon ./bin/main share/FilesList/FL_SM_MC_MUON.txt share/JobOptions/JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonMC.txt
    • Electron ./bin/main share/FilesList/FL_SM_MC_ELE.txt share/JobOptions/JO_ELE_MC.txt share/Trigger/RunTriggerList_ElectronMC.txt

The output of the Histogrammer is one ROOT file per input dataset, containing the histograms for the cuts defined in the JobOptions file.
The Histogrammer runs on PROOF by default. If you want to run locally (in which case only 10 000 events are processed), set the flag int runLocal = 1; in src/runHistogrammer.cxx.
IMPORTANT: my advice is to leave the default files (share/FilesList and share/JobOptions) as they are in SVN: create your own subdirectory in share/, e.g. share/MyStuff, and copy/edit the files there, as in the example below.
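
For example, a private copy of the muon data configuration could be set up like this (a sketch; the file names are the defaults listed further down this page):

mkdir -p share/MyStuff
cp share/FilesList/FL_DATA_MUON.txt share/JobOptions/JO_MUON_Data.txt share/MyStuff/
./bin/main share/MyStuff/FL_DATA_MUON.txt share/MyStuff/JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData.txt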

Histogrammer default trigger list

The default trigger lists used by the Histogrammer are in the subdirectory share/Trigger. They must not be changed without extreme caution. The files are:

  • RunTriggerList_ElectronData.txt has to be used to run on data for the Electron channel
  • RunTriggerList_MuonData.txt has to be used to run on data for the Muon channel
  • RunTriggerList_ElectronMC.txt has to be used to run on MC for the Electron channel
  • RunTriggerList_MuonMC.txt has to be used to run on MC for the Muon channel

Histogrammer default files list

The default file lists used by the Histogrammer are in the subdirectory share/FilesList:

  • FL_DATA_ELE.txt has to be used to run on data for the Electron channel
  • FL_DATA_MUON.txt has to be used to run on data for the Muon channel
  • FL_SM_MC_ELE.txt contains the full MonteCarlo dataset list for the Electron channel
  • FL_SM_MC_MUON.txt contains the full MonteCarlo dataset list for the Muon channel

The file names contain _MUON or _ELE to distinguish the various output datasets. The names have to be used exactly as given in these files for the data processing to work.

Full list of MonteCarlo datasets:
  • ttbar5200 (MC@NLO, Herwig/Jimmy)
  • ttbar5861 (POWHEG, Pythia )
  • ttbar5205 (AcerMC, Pythia)
  • ttbar160GeV (MC@NLO, Herwig): ttbar mass of 160GeV
  • ttbar165GeV (MC@NLO, Herwig): ttbar mass of 165GeV
  • ttbar167GeV (MC@NLO, Herwig): ttbar mass of 167.5GeV
  • ttbar170GeV (MC@NLO, Herwig): ttbar mass of 170GeV
  • ttbar175GeV (MC@NLO, Herwig): ttbar mass of 175GeV
  • ttbar177.5GeV (MC@NLO, Herwig): ttbar mass of 177.5GeV
  • ttbar180GeV (MC@NLO, Herwig): ttbar mass of 180GeV
  • ttbar190GeV (MC@NLO, Herwig): ttbar mass of 190GeV
  • singleTop (MC@NLO, Herwig)
  • Zjets (Alpgen,Herwig/Jimmy)
  • ZjetsSherpa (Sherpa)
  • Wjetsall (Alpgen,Herwig/Jimmy): Wjets+Wbb with DR cut
  • WjetsSherpa (Sherpa)
  • Wjets_HFOR: Wjets+Wbb without DR cut+Wcc+Wc
  • Dibosons: WW+WZ+ZZ
  • TopMixingele
  • TopMixingQCDele
  • QCDele
  • ttbarISRdown (AcerMC, Pythia)
  • ttbarISRup (AcerMC, Pythia)
  • ttbarFSRdown (AcerMC, Pythia)
  • ttbarFSRup (AcerMC, Pythia)

Full list of Dataele datasets:
  • Dataele_ELE (to run on the full statistics)
  • DataelePer_AtoD_ELE (to run period A to D only)
  • DataelePer_E_ELE (to run period E only)
  • DataelePer_F1_ELE (to run period F1 only)
  • DataelePer_F2_ELE (to run period F2 only)

Full list of Datamu datasets:
  • Datamu_MUON (to run on the full statistics)
  • DatamuPer_AtoD_MUON (to run period A to D only)
  • DatamuPer_E_MUON (to run period E only)
  • DatamuPer_F1_MUON (to run period F1 only)
  • DatamuPer_F2_MUON (to run period F2 only)

Histogrammer default JobOptions

The default JobOptions used by the Histogrammer are in the subdirectory share/JobOptions:

  • JO_ELE_Data.txt Contains standard cuts to run on data for the Electron channel (No scale factors)
  • JO_MUON_Data.txt Contains standard cuts to run on data for the Muon channel (No scale factors)
  • JO_ELE_MC.txt Contains standard cuts to run on MC for the Electron channel (With scale factors)
  • JO_MUON_MC.txt Contains standard cuts to run on MC for the Muon channel (With scale factors)
  • JO_ELE_DataMCComp_Data.txt Contains standard cuts to run Data/MC comparison on Data for the Electron channel (No scale factors)
  • JO_MUON_DataMCComp_Data.txt Contains standard cuts to run Data/MC comparison on Data for the Muon channel (No scale factors)
  • JO_ELE_DataMCComp_MC.txt Contains standard cuts to run Data/MC comparison on MC for the Electron channel (With scale factors)
  • JO_MUON_DataMCComp_MC.txt Contains standard cuts to run Data/MC comparison on MC for the Muon channel (With scale factors)
  • JO_ELE_SystVar.txt Contains standard cuts to run MC full systematics variations for the Electron channel (With scale factors)
  • JO_MUON_SystVar.txt Contains standard cuts to run MC full systematics variations for the Muon channel (With scale factors)

Histogrammer Possibilities

This section summarizes the various, most up-to-date capabilities of the Histogrammer.

Histogrammer Scale Factors/Smearing/Topinputs

IMPORTANT: the Histogrammer JobOptions must have these flags at the beginning; if not, the flags will be set to false (a sketch of such a block is given after this list).
  • Scale factors:
    • SFMuon: true for Monte Carlo, false for data
    • SFElectron: true for Monte Carlo, false for data
  • Lepton smearing:
    • RescaleMuonpt: has to be false for the moment
    • RescaleElectronpt: false, nothing is done for electron smearing
  • Topinputs:
    • TopInputs: true to run using TopInputs leptons (ti); for the jets the default is TopInputs; false to run using preselected leptons (presel)
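
As an illustration, the top of a Monte Carlo JobOptions file could contain a block like the following (a sketch: the flag names are the ones listed above, but the exact "flag value" syntax of the JobOptions parser is an assumption):

SFMuon true
SFElectron true
RescaleMuonpt false
RescaleElectronpt false
TopInputs true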

Histogrammer Systematics

  • Wjets shape systematics, from Anna Henrichs' code

Full list of Wjets shape systematics (the value, here 0, means weight = (value+1)*weight):
  • Systematics Wjets_iqopt2 0
  • Systematics Wjets_iqopt3 0
  • Systematics Wjets_ktfac05 0
  • Systematics Wjets_ktfac20 0
  • Systematics Wjets_perugia_hard 0
  • Systematics Wjets_perugia 0
  • Systematics Wjets_perugia_soft 0
  • Systematics Wjets_ptjmin10 0
  • Systematics Wjets_ptjmin20 0
  • Systematics Wjets_qfac05 0
  • Systematics Wjets_qfac20 0


  • Jet Energy Scale systematic (JES)
Full list of JES systematics:
  • Systematics JES_jetetmiss 1: from the JetEtMiss group (the value, here 1, means weight = value*weight)
  • Systematics JES 1 (the value, here 1, means weight = value)


  • Tagging systematics
Full list of tagging systematics, per-jet weighting method:
  • pessimistic:
    • Systematics Btag 0.15 (the value, here 0.15, means SF = SF + value)
    • Systematics Ctag 0.3
    • Systematics Ctag 1
  • optimistic:
    • Systematics Btag 0.075
    • Systematics Ctag 0.15
    • Systematics Ctag 0.5

Histogrammer JetLepCuts

The JetLepCuts line has to be configured like this (a complete example assembled from the pieces described below is given at the end of this subsection):

JetLepCuts jet 4 in 20 20 20 20 elec 1 ex 20 topcommon etacommon muon 0 ex 20 topcommon etacommon MET 20 MTW_MET 60 -1

The jet part in detail:

  • jet 4 in 20 20 20 20 means 4 jets inclusive with pT>20GeV. For other pT cuts, please order the cuts in decreasing order (40 30 25 20 for example)
  • jet 3 ex 20 20 20 20 means 3 jets exclusive with pT>20GeV.
  • IMPORTANT: If you want 0 jets inclusive, the pT cut still has to be specified: jet 0 in 20

The elec part in detail:

  • elec 1 ex 20 topcommon etacommon means exactly 1 electron with pT>20GeV, quality topcommon and eta range etacommon
  • For 2 electrons of the same quality: elec 2 ex 20 topcommon etacommon 20 topcommon etacommon

All possible electron qualities:

  • loose: isLooseEM+electron_author+egamma_quality
  • loose_reliso: loose + electron_etcone20Isolation
  • medium: isMediumEM+electron_author+egamma_quality
  • ifae_MM_loose; medium_reliso: medium + electron_etcone20Isolation
  • tight: isTightEM+electron_author+egamma_quality
  • tight_reliso tight+electron_etcone20Isolation
  • looseNotTight: medium_reliso + (isTightEM)
  • topcommon: see TopCommonObjects

All possible electron eta:

  • etacommon: 0<|eta(cluster)|<2.47 excluding 1.37<|eta(cluster)|<1.52
  • etacentral: |eta|<1.37

The muon part in detail:

  • muon 1 ex 20 topcommon etacommon means exactly 1 muon with pT>20GeV, quality topcommon and eta range etacommon
  • For 2 muons of the same quality: muon 2 ex 20 topcommon etacommon 20 topcommon etacommon

All possible muon qualities:

  • medium: isMedium
  • medium_dr: medium + muon_minDeltaRJets>0.4
  • tight: isTight
  • tight_dr: tight + muon_minDeltaRJets>0.4
  • tight_ptiso: tight + muon_ptcone30Isolation
  • tight_ptiso_etreliso: tight_ptiso + muon_etcone20Isolation
  • topcommon_MM_loose: tight_dr + isCombinedMuon
  • verylooseNotTight: isTight+isCombinedMuon+!(muon_etcone30Isolation+muon_ptcone30Isolation)
  • looseNotTight: topcommon_MM_loose + !(muon_etcone30Isolation+muon_ptcone30Isolation)
  • topcommon: see TopCommonObjects

All possible muon eta:

  • etacommon: |eta|<2.5
  • etacentral: |eta|<1.5

The TopCommon definition can be found on the TopCommonObjects page.

The MET part in detail:

The MET cut is the first of the two additional cuts:

  • MET 20 means Missing ET>20GeV
  • MET 0 means no cut on MET; it has to be present if no MET cut is required

The additional cut:

In addition to the MET cut, one further cut is possible. If no additional cut is required, a placeholder item is still necessary, for example

HTAll 0
All the possible additional cuts are listed below.
All possible additional cuts:

  • HTAll 100
  • HT2 40
  • HT3 50
  • HT 60
  • HTlep 30
  • HTSpeciallep 60
  • HTSpeciallepjet 50
  • MTW 40
  • METspecialLep 60
  • METspecialLepJet 70
  • Dphi_MET 0.33 20 IMPORTANT takes two arguments
  • MTW_MET 60 -1 IMPORTANT takes two arguments


IMPORTANT: For the MET and the additional cuts, it is possible to invert the cut, for example

MET 20
means MET>20GeV
-MET 20
means MET<20GeV
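
Putting the pieces above together, a complete JetLepCuts line for a 2-electron selection could look like this (a sketch assembled only from the documented fragments; whether every combination is supported is not guaranteed):

JetLepCuts jet 4 in 40 30 25 20 elec 2 ex 20 topcommon etacommon 20 topcommon etacommon muon 0 ex 20 topcommon etacommon MET 20 HTAll 0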

Histogrammer Btag cuts

The b-tagging cuts are organized in the following way:

BtagCuts 1 in 5.72 SV0
means 1 jet inclusive with SV0 tagger weight > 5.72. The list of jets with the pT cut specified in the jet line is used. For example, if you have the jet line:
jet 4 in 25 25 25 25
you will look for b-jets only among jets with pT>25GeV. If you want the untagged case, put this line:
BtagCuts 0 in
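
For instance, pairing the b-tagging requirement with a jet line in the same JobOptions file might look like this (a sketch; that the two lines appear together like this is an assumption):

JetLepCuts jet 4 in 25 25 25 25 elec 1 ex 20 topcommon etacommon muon 0 ex 20 topcommon etacommon MET 20 MTW_MET 60 -1
BtagCuts 1 in 5.72 SV0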

The complete list of Taggers and OP can be found here

Histogrammer Extra features

The following extra features of the Histogrammer are available:

Orthogonal Method (OM)

0 to 1 tag (0to1)

Matrix Method (MM)

See the Matrix Method section below.

Tagging Rate Function (OM_TRF)

Limitator mclimit

Limit-setting tool based on Tom Junk's mclimit code. Things to know: it has to be compiled with a recent version of ROOT (let's say 5.26) and is run like this: ./mclimit JobOption.txt. In the JobOption there are several things that differ from the Fitter, but let's start with what is almost the same (there is no flag for nuisance factors); a minimal sketch of a JobOption is given after the list.

  • PathToHistogramNtuple CH path
  • NominalOn CH samples (all the samples have to be listed here); different for ELE and MUON because of QCD. All the samples should be listed because for the systematics all samples are needed at the same time (for modeling systematics).
    • for ELE: CH ttbar Wjets WjetsSherpa Zjets singleTop DataEle QCDele QCDeleMM dibosons Tprime300 ttbar_POWHEG_PYTHIA ttbar_POWHEG_HERWIG ttbar_AcerMC_ISRup ttbar_AcerMC_ISRdown ttbar_AcerMC_FSRup ttbar_AcerMC_FSRdown ttbar_AcerMC
    • for MUON: NominalOn MUON2ex_0ex ttbar Wjets WjetsSherpa Zjets singleTop DataMu QCDmu QCDmuMM dibosons Tprime300 ttbar_POWHEG_PYTHIA ttbar_POWHEG_HERWIG ttbar_AcerMC_ISRup ttbar_AcerMC_ISRdown ttbar_AcerMC_FSRup ttbar_AcerMC_FSRdown ttbar_AcerMC
  • plotThisVariable CH Variable
  • RebinX CH 2
  • RebinY CH 2
  • nTupleDir CH dir
  • nTupleDirQCD CH dir (for QCD shape)
  • nTupleDirQCDMM CH dir (for QCD normalization)
  • Normalization systematics (Lumi, ttbarXS, WjetsXS, ZjetsXS, DibosonsXS, singleTopXS, QCDeleXS, QCDmuXS)
  • Norma/Shape (JES, modeling (same up/down) Wjets sherpa, ttbar frag, ttbarISR, ttbarFSR, BCTag, HF... )
  • CombineChannels CHtoCombine -> will loop over the channels here, will not perform any channel per channel analysis
  • RunOnData false/true -> to run data
  • UseUnrolling true/false -> to use unrolling on 2D histos
  • DealWithNegBins true/false -> to remove negative bins (will set them to 0 and rescale the histo accordingly)
  • MinosFlag 0/1 -> obsolete, leave it at 0
  • MinosMaxCall 1000 -> obsolete, minos is not called
  • MinuitMaxCall 1000 -> maximum number of Minuit calls
  • MinNbOfBins 10 -> minimum number of bins for the unrolling method
  • DeltaBoverB 0.3 -> for the unrolling scheme
  • LuminosityExp 35 -> luminosity used to compute the expected sensitivity
  • LuminosityData 35 -> luminosity used to renormalize data or data-driven backgrounds (QCD)
  • PoissonFlag 2 -> takes the errors into account in the limit setting
  • NPseudoExperiments 2000 -> number of pseudo-experiments to run
  • MinuitPrintFlag false -> print Minuit info
  • Profiling true/false -> to run the LLR with profiling
  • AnalysisToDo Limit/CL/Limit_B (frequentist limits; frequentist confidence levels; Bayesian limits)
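
To make the options above more concrete, a minimal mclimit JobOption could look like the sketch below (a sketch only: the channel name ELE1_0ex, the sample list, the variable and the ntuple path are illustrative placeholders taken from elsewhere on this page, and the real files contain many more channels and systematics):

PathToHistogramNtuple ELE1_0ex /nfs/at3/projects/FLIPA/output_Histogrammer
NominalOn ELE1_0ex ttbar Wjets WjetsSherpa Zjets singleTop DataEle QCDele QCDeleMM dibosons Tprime300
plotThisVariable ELE1_0ex had_top_SemiLepChi2_Mass
RebinX ELE1_0ex 2
nTupleDir ELE1_0ex ELE1_0ex
RunOnData false
NPseudoExperiments 2000
LuminosityExp 35
LuminosityData 35
AnalysisToDo Limit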

So if you want to create your own JO with 20 channels and 20 systematics, good luck! To avoid this problem there is a small JOMaker.py. The file Channels.py contains all the channels (SEL1/2/3; 0tagin/0tagex/1tagin; 4jetin/3jetex; 3jetin/2jetex; 602020; 602525; 602525MET30) for the normal samples, QCD and QCDMM, with names that you could almost understand (I hope!). This code is not yet optimized. A few things to improve (because you'll have to change them either in the created JO or in JOMaker.py):

  • The created JO name contains the CH and the systematics name. If it is too long, the name becomes JO_withFileNameTooLong.txt
  • At the end of JOMaker.py all the flags are "hard coded", so you have to change them there or improve the code
  • The code is not very flexible, but is very helpful to create a long JO
  • The code needs two sample lists, SampleListELE and SampleListMUON, that shouldn't change (they contain all the samples)
  • Then the SystematicList, a subset of SampleFullList
  • The ChannelList gives the names you want (my advice is to use the same ones that I defined in Channels.py)
  • NomDirList lists the directories in the ntuples for normal samples (for example CH.ELE1_0ex, CH here coming from import Channels as CH)
  • NomDirListQCD lists the directories in the ntuples for QCD shape (for example CH.ELE1QCD_0ex)
  • NomDirListQCDMM lists the directories in the ntuples for QCD yields (for example CH.ELE1QCDMM_0ex)
  • VariableToPlot lists the variables to plot, defined in the code:
    • Chi2Mass="had_top_SemiLepChi2_Mass"
    • Chi2MapELE="map_had_top_SemiLepChi2_Mass_HTAllele"
    • Chi2MapMUON="map_had_top_SemiLepChi2_Mass_HTAllmu"
    • SEL1Mass="hhad_tprime_highesttprimepTwithMergedWhad_Mass"
    • MapELE2="map_highesttprimepTwithWhadMass_HTAllele"
    • MapMUON2="map_highesttprimepTwithWhadMass_HTAllmu"
    • MapELE3="map_highesttoppTwith3Mass_HTAllele"
    • MapMUON3="map_highesttoppTwith3Mass_HTAllmu"
  • AnalysisMode (BB; Light; Democratic) -> defines the analysis mode (reading the file TprimeJO_TEST.txt)

WARNINGS.....

  • RebinX and RebinY are not implemented in JOMaker.py, so you will not have any rebinning (either add it yourself or edit the created JO)
  • Check the various directories you want to run on (for the various analysis modes) for a given production date
  • The default number of pseudo-experiments is set to 100 in JOMaker.py

Plotter

You can find the Plotter as part of the AnaTools package in SVN

To run the plotter: ./Plotter.py
Make sure that PyROOT is installed on your PC or that you have set up your working area correctly. For the moment, on atlasui you need to change the setup script from

source /nfs/pic.es/tier2/scratch/AtlasSoftware/etc/profile.d/grid-env.sh
to
source /nfs/pic.es/tier2/scratch/AtlasSoftware/etc/profile.d/grid-env_gcc4.3.sh 

Job option file: plotter_JO.py
IMPORTANT: the parsing of an option fails when it ends with a space character! In this file you can specify the ListOfFiles with the samples you want to use (note: histograms will be ordered according to their integral) and their path. plotFilesTypes refers to the types of the output plots. To make the web page work, at least .png should be in the list. Specify the lumi you want to normalize to, or whether you want to scaleToData.
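
In practice, before launching the Plotter it can help to check the job option for trailing spaces (the same grep trick described for the Fitter below), for example:

grep " $" plotter_JO.py    # should print nothing: an option ending with a space breaks the parsing
./Plotter.py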

Fitter

IMPORTANT: use ROOT 5.26. For Ubuntu you can find instructions on how to install it here.

You can find the Fitter as part of the AnaTools package in SVN

Compilation:

make clean && make

To run:

./mainFitter <confFile> <debug mode>
The Fitter accepts two arguments, both optional. The first is the name of the config file / job option and the second one switches on/off the debug output.
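
For example, a debug run could look like this (a sketch: the job option name is illustrative and the value expected by the debug switch is an assumption):

grep " $" share/myJobOption.txt      # should print nothing, see the note below on trailing spaces
./mainFitter share/myJobOption.txt 1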

Job specification:
IMPORTANT: the parsing of an option fails when it ends with a space character! For instance with

grep " $" share/myJobOption.txt

Several configuration files can be found in the /share directory. Here you can specify the origin of your histograms (by default Histogrammer output ntuples, kept at /nfs/at3/projects/FLIPA/output_Histogrammer), the selection directory within the file (use "./" for non-Histogrammer output ntuples), a possible rebinning, the fit type, the number of experiments, and also the luminosity you want to scale to. There is also an option (orderByIntg) to order the samples within the stack according to their integral.
In the file fileNameDefs.h one can specify the file names for the different sample types, in case you don't use the Histogrammer output ntuples.

Output: In the /output directory you will get a LaTeX file YOURJO_stats.tex which contains the same table as the standard output. You will also get the values of the fitted parameters in the file YOURJO.root. In the /plots directory you will get a collection of .png plots.

Matrix Method

The prediction of QCD with the Matrix Method is done in the Iterator tool. Refer to my scripts in /nfs/at3users/users/succurro/06QCD_MM/AnaTools/Histogrammer/trunk. N.B. you'll need directories like [trunk]/../output/trunk/MUON, [trunk]/../output/trunk/eff, [trunk]/../output/trunk/eff/untagged and [trunk]/../output/trunk/eff/tagged (see the sketch below).
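From the Histogrammer trunk directory, that layout can be created for instance with:

mkdir -p ../output/trunk/MUON ../output/trunk/eff/untagged ../output/trunk/eff/tagged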
  • First run the Histogrammer like stated in run_eff_muon.sh
[succurro@at307 trunk]$ more run_eff_muon.sh 
bin/main share/FilesList/my_mm/FL_DATA_MUON.txt share/JobOptions/eff/JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData2011.txt
bin/main share/FilesList/my_mm/FL_DATA_MM_MUON.txt share/JobOptions/eff/JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData2011.txt
bin/main share/FilesList/my_mm/FL_MC_MUON_w.txt share/JobOptions/eff/JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonData2011.txt
bin/main share/FilesList/my_mm/FL_MC_MUON_z.txt share/JobOptions/eff/JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonData2011.txt
bin/main share/FilesList/my_mm/FL_MC_MUON_t.txt share/JobOptions/eff/JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonData2011.txt
bin/main share/FilesList/my_mm/FL_MC_MUON_ttbar5200.txt share/JobOptions/eff/JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonData2011.txt
mv *.root ../output/trunk/eff
  • make sure the files are in a directory like [trunk]/../output/trunk/eff
  • go to AnaTools/Iterator
  • do make and run ./runIterate like this:
Syntax: ./runIterate ntuples ( D3PD2011 / D3PD2010 / AOD ) Channel (  muon / ele ) tag ( 0btagin5.85SV0 / 1btagin5.85SV0 ) region ( -MET10 / MET_5_15) 
  • e.g. ./runIterate D3PD2011 muon 0btagin5.85SV0 MET_5_15 will save the MM_effs_muon_fake.root and MM_effs_muon_real.root (obtained in the control region 5GeV<MET<15GeV) in the [trunk]/../output/trunk/eff/untagged dir together with png files

If you now move the root files to the official place where the Histogrammer takes the efficiency files from, i.e. [trunk]/share/EffMM/eff_untagged and [trunk]/share/EffMM/eff_tagged (N.B. the untagged MM_effs_muon_real.root is used also for the tagged case!), you will already be able to run the QCD prediction with the simple eta dependence by adding the following line to the JO:

UseFullQCDDep false

If you want the full parametrization the steps to follow are:

  • run a sample with only eta dependence like in my script [trunk]/eff_eta_only.sh
bin/main share/FilesList/my_mm/FL_DATA_MM_MUON.txt share/JobOptions/datamc_qcd/JO_MUON_Data_MM_etadeponly.txt share/Trigger/RunTriggerList_MuonData2011.txt
mv Datamu_MM_MUON_OutputHisto.root ../output/trunk/MUON/Eta_Datamu_MM_OutputHisto.root 

Then see README files in AnaTools/Macros/EffDependencies to

  • use the macro fitEff.cxx to fit the leading jet p_T and min DR dependence
  • use the macro averageEff.cxx to get average values of dependencies
  • modify the functions calc_MM_weight_tagged and calc_MM_weight_untagged in the [trunk]/src/Variator.cxx

Now you can recompile the Histogrammer and run the Data_MM to get the QCD sample like e.g. in my script [trunk]/run_qcd_datamc_muon.sh

bin/main share/FilesList/my_mm/FL_DATA_MM_MUON.txt share/JobOptions/datamc_qcd/JO_MUON_Data_MM.txt share/Trigger/RunTriggerList_MuonData2011.txt

old MM

For predicting the QCD via the Matrix Method you need the Histogrammer package and the Iterator one. Just check out all of AnaTools.
  • Go to Histogrammer and do source setup.sh
  • Run run_MUON.sh
  • Go to Iterator and do make && ./iterations. You will get the k0 value and the MM_effs_muon_real.root and MM_effs_muon_fake.root files for the Histogrammer input, and the same for MC.
  • Once you have these efficiencies in the Histogrammer (share/EffMM/), you can repeat the procedure for the tagged case by editing the top of the iterations.cxx file.
  • Now you can run doPlots_MUON.sh to make all the kinds of plots you want.
  • By editing the main function in iterations.cxx you can make plots with the iteration procedure already applied for different variables, and also fit them!

  • run_MUON.sh :
  • bin/main share/FilesList/FL_DATA_MUON.txt share/JobOptions/JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData.txt
  • bin/main share/FilesList/FL_DATA_MM_MUON.txt share/JobOptions/JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData.txt
  • bin/main share/FilesList/FL_MC_MUON.txt share/JobOptions/JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonMC.txt
  • doplots_MUONS.sh
  • bin/main share/FilesList/FL_DATA_MUON.txt share/JobOptions/DO_JO_MUON_Data.txt share/Trigger/RunTriggerList_MuonData.txt
  • bin/main share/FilesList/FL_DATA_MM_MUON.txt share/JobOptions/DO_JO_MUON_DataMM.txt share/Trigger/RunTriggerList_MuonData.txt
  • bin/main share/FilesList/FL_MC_MUON.txt share/JobOptions/DO_JO_MUON_MC.txt share/Trigger/RunTriggerList_MuonMC.txt

Slimming of D3PDs (temporary, by Antonella)

Slimming scripts by Clement are available in

  • /nfs/at3users/users/succurro/06QCD_MM/AnaTools/Macros/D3PDBatchSkimming_my

offSlimmer.C and mySlimmer.C

offSlimmer is the official slimming macro; in tauSlimmer I added the branches requested by the Tau people. Copy the one you want to use to Slimmer.C.

InputFullFileList2011Data.txt and InputFullFileList2011MC.txt

These text files contain the full list of single D3PD files we have on Tier2. They should not be modified except by the person in charge of keeping them up to date (Jordi, up to now).

InputDatasetListData2011.py and InputDatasetListMC2011.py

These python files contain the definition of an array of datasets: comment out what you don't want to be processed, uncomment what you want to slim. Since we slim period by period, please be sure you slim all runs of a period! The period definition is contained in the PrepareLists.py file.

PrepareLists.py

Run this script to get several text files containing the lists of files to be slimmed and merged into a single file. Then move the FL_data* files into the folder share/FL_Data2011 and the MC ones into the corresponding folder. Also make sure the pnfs and nfs paths are correct.

FileToSend_Data2011.txt and FileToSend_MC2011.txt

Write in these files the list of FL_blabla.txt lists you want to process

SendJobs.sh

Script to send the jobs. You have to comment/uncomment the following lines at the beginning, depending on whether you want to slim data or MC:
  • #ListDir=$PWD"/share/FL_Data2011/"
  • ListDir=$PWD"/share/FL_MC2011/"
and similarly at the end. Also make sure the pnfs and nfs paths are correct.

launch the jobs

Do source SendJobs.sh to launch the jobs and qstat | grep YOURNAME to monitor them, for example:
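
source SendJobs.sh        # submit the slimming jobs
qstat | grep YOURNAME     # monitor them (replace YOURNAME with your login)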

Grid

GangaTips

The following issues and tips have been sorted out during 2010 Top production:

  • Most sites no longer have slc4 kits. Before sending the job to the Grid, set up cmt with tag=slc5,gcc43,32,...
  • Ganga doesn't digest package structure with_version_directory. Use only package structure without_version_directory
  • AODs in mc09 are bigger than in mc08 (10000 events instead of 250). Some sites, like PIC, cannot process 1 AOD file in the short queue. Use wallclock requirements to force a longer queue.
  • Panda backend is confused by the prepare_old() command. You must use the prepare() command instead.
  • If you use the prepare() command, you will need to compile FLIPA beforehand, because the command runs some local tests before sending the job to the grid.
  • In case you use j.application.prepare_old(athena_compile=True), you should do make clean before sending your job
  • In case you use TopInputs you have to use prepare!!
  • Check that the release you want to use is available on the site, either by looking it up on the panda monitor site or via:
    lcg-infosites --vo atlas <ce> <tag>

Useful commands for ganga

  • jobs: print all jobs you have in ganga at the moment
  • jobs(3): print details of job with ID 3
  • jobs(3).outputdata: print the details of the outputdata of job with ID 3
  • jobs(3).outputdata.datasetname: print one item of the details
  • for j in jobs.select(5,23): first line of a for loop over the jobs (j is a job object, as jobs(3) would be) with ID 5 up to 23. If a job ID between 5 and 23 does not exist, it will be ignored
  • jobs(3).remove(): remove the job and all its output files (job will remain available in DQ2).
  • jobs(3).subjobs.select(status='failed').resubmit(): re-submit subjobs of jobs with ID 3 if their status is failed
  • for j in jobs.select(status='failed'): print j.inputdata.dataset
    prints the dataset of each job that failed

  • help(jobs): a bunch of methods and stuff you can do with jobs

DQ2 howto

DQ2ClientsHowTo

Useful scripts

gangalf
A script to send ganga jobs: gangalfdata.py (gangalfdata.py.txt). Just add at the beginning the file names you want to send, choose whether each is a standard container (with "/" at the end) or not, and select the site/cloud you want to send the jobs to and whether it is data or not. Then enter ganga (source /afs/cern.ch/sw/ganga/install/etc/setup-atlas.sh && ganga) and execute gangalf: execfile('../share/gangalfdata.py')

checkJobs.py
This is a module that basically manages the resubmission of your jobs in case they failed, and keeps track of the number of resubmissions. To use it do the following:

import sys
sys.path.append("/path/to/your/working/area/with/15.6.10.4/IFAEanalysis/FLIPA/share")
import checkJobs as cj
cj.check(jobs)
I'll think about a nice way such that you don't always have to import the path. Easiest solution: add it to your $PYTHONPATH in your .bashrc (see the example below); however, since it will change, it's not the best solution.
The first time you use it you have to execute it as cj.check(jobs,1), which creates a data file. Please specify the path for this file first in checkJobs.py.
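
If you go for the $PYTHONPATH route, the line to add to your .bashrc would look like this (using the placeholder path from the snippet above):

export PYTHONPATH=/path/to/your/working/area/with/15.6.10.4/IFAEanalysis/FLIPA/share:$PYTHONPATH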

getJobsFromDQ2.sh
This is a short batch macro to download the files with dq2 into the directory where you run the script ($ ./getJobsFromDQ2.sh). You have to specify in the file the list of files you want to download.
You can get the file from here: getJobsFromDQ2.sh
Please rename it with your name or something, so that more than one person can use it in the production directory!

merge2nTuples.py
This python script merges all files that are specified in the file list inside the script and moves them to the TARGETDIR, also specified inside. It asks at the beginning if you are happy with the renaming, which basically tries to remove your user name and the time tag at the end of the file, coming from gangalf.
You can get the file from here: merge2nTuples.py
Please rename it with your name or something, so that more than one person can use it in the production directory!
NOTE: You might have to change the STARTCUT variable from mc to mc09 or data, depending on the naming of your files.
In case you sent a job to a cloud, the output might come in different directories. This will be taken into account if you remove the site tag (e.g. _time_20100705_130948.PIC_MCDISK) at the end of the file name and only put the file name once.

 