Four Tops Analysis - Logbook

For references and research about Four Tops, please follow this link.

Disclaimer: this twiki page does not store any official study; it is mostly for my records. For official CMS twiki pages, please visit: https://twiki.cern.ch/twiki/bin/viewauth/CMS/TWikiFourTops. If you find something incomplete or hard to understand, please contact me: alejandro.gomez@cernNOSPAMPLEASE.ch

2012-02-13 - Add more samples (in progress)

| Sample | Events (Ntuple) | Cross Section (pb) in MG5 | MCScaleFactor (xs/nevents) |
| fourtop1100 | 4041 | 0.12830E-03 | 3.1749566938876515e-08 |
| fourtop1000_v11 | 3977 | 0.38029E-03 | 9.562232838823233e-08 |
| fourtop900 | 3949 | 0.11484E-02 | 2.9080779944289691e-07 |
| fourtop700 | 3757 | 0.12131E-01 | 3.2289060420548308e-06 |
| fourtop500_v5 | 3368 | 0.18182 | 5.3984560570071261e-05 |
| fourtop400 | | 0.89550 | |
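  • A minimal sketch (not from the analysis code) of how the MCScaleFactor column above is computed and turned into a per-event weight; the luminosity value is illustrative:
      # numbers copied from the table above
      samples = {
          "fourtop1100":   (4041, 0.12830e-3),   # (events in Ntuple, cross section in pb)
          "fourtop500_v5": (3368, 0.18182),
      }
      lumi = 4700.0  # illustrative integrated luminosity in pb^-1

      for name, (nevents, xs) in samples.items():
          scale = xs / nevents            # MCScaleFactor = xs / nevents
          weight = scale * lumi           # per-event weight when normalizing to 'lumi'
          print("%-15s MCScaleFactor = %.6e, per-event weight = %.4e" % (name, scale, weight))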

2012-01-27 - New TMVA Variables, CutFlow, first statistics with theta

  1. After the meeting, Markus Siedel (DESY) suggested adding some new variables to our MVA analysis and removing some others. We introduced the number of bjets tagged with the combined secondary vertex (CSV) algorithm at different operating points:
    1. Number of bjets with CSV Loose
    2. Number of bjets with CSV Medium
    3. Number of bjets with CSV Tight
    4. New versions of Analyzer.C and StoreTreeVariable.h
  2. After training, these are some interesting plots to define our procedure:
    1. Correlation matrix plots of all the variables for the Gh 1 TeV sample: background and signal.
    2. Input variables 1 and 2
  3. After looking at the previous plots, we decided to use only the following variables for the TMVA training: Ht, Stlep, Stjet, number of jets, and number of bjets with CSVL (a minimal training sketch follows the cutflow table below).
    1. New version of myTMVAClassification.C
  4. Using the new C class, I wrote an improved version of AnalyzerAfterTraining.C. This version runs only the BDT response step.
    1. BDT response for Gh 0.5 TeV
    2. BDT response for Gh 1 TeV
  5. Cutflow for our analysis:
| Sample | Skim | One Iso muon | Loose muon veto | Electron veto | jets $\geq$ 4 | Ht > 300 | btags $\geq$ 1 |
| $t\overline{t}$ | 268.48 $\pm$ 0.36 | 180.62 $\pm$ 0.29 | 174.98 $\pm$ 0.29 | 161.61 $\pm$ 0.28 | 22.0 $\pm$ 0.1 | 20.3 $\pm$ 0.1 | 17.37 $\pm$ 0.09 |
| $W\rightarrow l\nu$ | 47997.94 $\pm$ 14.6 | 24994.34 $\pm$ 10.53 | 24994.28 $\pm$ 10.53 | 24991.13 $\pm$ 10.53 | 9.96 $\pm$ 0.21 | 9.15 $\pm$ 0.2 | 1.56 $\pm$ 0.08 |
| $Z\rightarrow l^{+}l^{-}$ | 7335.0 $\pm$ 2.6 | 2989.24 $\pm$ 1.66 | 1729.78 $\pm$ 1.26 | 1726.95 $\pm$ 1.26 | 0.95 $\pm$ 0.03 | 0.89 $\pm$ 0.03 | 0.15 $\pm$ 0.01 |
| QCD | 5311.16 $\pm$ 14.0 | 204.11 $\pm$ 2.74 | 203.84 $\pm$ 2.74 | 203.13 $\pm$ 2.74 | 0.05 $\pm$ 0.04 | 0.0 $\pm$ 0.0 | 0.0 $\pm$ 0.0 |
| top t-ch | 30.35 $\pm$ 0.06 | 19.56 $\pm$ 0.05 | 19.56 $\pm$ 0.05 | 19.54 $\pm$ 0.05 | 0.16 $\pm$ 0.0 | 0.15 $\pm$ 0.0 | 0.13 $\pm$ 0.0 |
| anti-top t-ch | 17.24 $\pm$ 0.05 | 11.24 $\pm$ 0.04 | 11.23 $\pm$ 0.04 | 11.22 $\pm$ 0.04 | 0.09 $\pm$ 0.0 | 0.08 $\pm$ 0.0 | 0.07 $\pm$ 0.0 |
| top tW-ch | 12.74 $\pm$ 0.04 | 9.04 $\pm$ 0.03 | 8.76 $\pm$ 0.03 | 8.11 $\pm$ 0.03 | 0.42 $\pm$ 0.01 | 0.4 $\pm$ 0.01 | 0.32 $\pm$ 0.01 |
| anti-top tW-ch | 12.79 $\pm$ 0.04 | 9.08 $\pm$ 0.03 | 8.81 $\pm$ 0.03 | 8.15 $\pm$ 0.03 | 0.44 $\pm$ 0.01 | 0.41 $\pm$ 0.01 | 0.33 $\pm$ 0.01 |
| top s-ch | 2.31 $\pm$ 0.02 | 1.47 $\pm$ 0.01 | 1.47 $\pm$ 0.01 | 1.47 $\pm$ 0.01 | 0.03 $\pm$ 0.0 | 0.03 $\pm$ 0.0 | 0.02 $\pm$ 0.0 |
| anti-top s-ch | 1.06 $\pm$ 0.01 | 0.69 $\pm$ 0.01 | 0.69 $\pm$ 0.01 | 0.69 $\pm$ 0.01 | 0.01 $\pm$ 0.0 | 0.01 $\pm$ 0.0 | 0.01 $\pm$ 0.0 |
| WW | 55.91 $\pm$ 0.08 | 36.99 $\pm$ 0.06 | 36.0 $\pm$ 0.06 | 33.47 $\pm$ 0.06 | 0.11 $\pm$ 0.0 | 0.1 $\pm$ 0.0 | 0.02 $\pm$ 0.0 |
| WZ | 17.07 $\pm$ 0.03 | 10.86 $\pm$ 0.02 | 9.36 $\pm$ 0.02 | 9.04 $\pm$ 0.02 | 0.04 $\pm$ 0.0 | 0.03 $\pm$ 0.0 | 0.01 $\pm$ 0.0 |
| Total MC | 61062.05 $\pm$ 26.16 | 28467.24 $\pm$ 17.37 | 27198.77 $\pm$ 17.18 | 27174.53 $\pm$ 16.78 | 34.25 $\pm$ 4.7 | 31.54 $\pm$ 4.51 | 19.98 $\pm$ 4.17 |
| Gh 0.5 TeV | 0.003224 $\pm$ 5.7e-05 | 0.001865 $\pm$ 4.3e-05 | 0.001734 $\pm$ 4.1e-05 | 0.001335 $\pm$ 3.6e-05 | 0.001192 $\pm$ 3.4e-05 | 0.00119 $\pm$ 3.4e-05 | 0.00113 $\pm$ 3.3e-05 |
| Gh 1.0 TeV | 0.000125 $\pm$ 2e-06 | 6.4e-05 $\pm$ 1e-06 | 6.1e-05 $\pm$ 1e-06 | 4.6e-05 $\pm$ 1e-06 | 4.5e-05 $\pm$ 1e-06 | 4.5e-05 $\pm$ 1e-06 | 4.2e-05 $\pm$ 1e-06 |
| Data | 31624219 | 12488737 | 11894824 | 11884126 | 14142 | 13018 | 8141 |
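  • A minimal PyROOT sketch of the BDT training with the five variables kept above, assuming the ROOT 5-era TMVA::Factory interface; the file names, tree name (MVAtree) and BDT options are illustrative, not the exact contents of myTMVAClassification.C:
      import ROOT
      from ROOT import TMVA, TFile, TCut

      out = TFile("TMVA_fourtop.root", "RECREATE")
      factory = TMVA.Factory("TMVAClassification", out,
                             "!V:!Silent:Color:DrawProgressBar:AnalysisType=Classification")

      # the five variables kept after the discussion above
      for var, vtype in [("Ht", 'F'), ("Stlep", 'F'), ("Stjet", 'F'),
                         ("njets", 'I'), ("nbjetsCSVL", 'I')]:
          factory.AddVariable(var, vtype)

      sig = TFile.Open("results_4Top_1000.root")   # hypothetical file/tree names
      bkg = TFile.Open("results_ttbar.root")
      factory.AddSignalTree(sig.Get("MVAtree"), 1.0)
      factory.AddBackgroundTree(bkg.Get("MVAtree"), 1.0)
      factory.SetBackgroundWeightExpression("PUweight")  # pile-up weight stored in the tree

      factory.PrepareTrainingAndTestTree(TCut(""), TCut(""),
          "SplitMode=Random:NormMode=NumEvents:!V")
      factory.BookMethod(TMVA.Types.kBDT, "BDT",
          "!H:!V:NTrees=400:MaxDepth=3:BoostType=AdaBoost:SeparationType=GiniIndex")

      factory.TrainAllMethods()
      factory.TestAllMethods()
      factory.EvaluateAllMethods()
      out.Close()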
  1. Then, to calculate the Bayesian limits for our Gh samples:
    1. Inside /uscms/home/algomez/work/CMSSW_4_2_4/src/
    2. Download and compile theta package
      1. svn co https://ekptrac.physik.uni-karlsruhe.de/public/theta/tags/testing theta
      2. cd theta
      3. make
      4. bin/theta examples/gaussoverflat.cfg
    3. Create a folder called fourtop/ located in /uscms/home/algomez/work/CMSSW_4_2_4/src/theta/utils/examples
    4. Modified the sample file called analysis.py
      • To run this script, one first needs to create a root file with the specific information and histogram names. Fortunately, the cuy script creates this root file.
        1. BDTresponse_cuy.xml - xml file to setup cuy.
        2. Inside /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/, where the corresponding root files should be, run this file as ../../cuy/scripts/cuy.py -b -x BDTresponse_cuy.xml -f "#splitline{CMS Preliminary}{4.7 fb^{-1} at #sqrt{s}=7TeV}" -o plots/ -p "png" -q -O templates_1000.root
    5. Once it runs, the package produces an HTML page with some statistical information (both pages temporarily available):
      1. For Gh 0.5 TeV
      2. For Gh 1.0 TeV

2012-01-20 - First Official Presentation

  1. First Meeting of Four Top Group at CMS. More details here: https://indico.cern.ch/conferenceDisplay.py?confId=172983
    1. My slides are here.

2012-01-19 - Implement training C class in our analyzer

  1. First complete version of the MVA training. Our variables are:
    1. B discriminant of the 1st leading jet
    2. B discriminant of the 2nd leading jet
    3. B discriminant of the 3rd leading jet
    4. B discriminant of the 4th leading jet
    5. Ht
    6. Stlep
    7. Stjet
    8. Number of jets
    9. Inside /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/TMVA/
    10. Run myTMVAClassification.C as: root -l myTMVAClassification.C
    11. After the previous (training) step, the package produces two files for each Gh sample: myTMVAClassification_BDT.class.C and myTMVAClassification_BDT.weights.xml, stored in the weights/ folder.
  2. Once I had the C class file, I had to include it in my analyzer (a Reader-based evaluation sketch is given at the end of this entry).
    1. Inside /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/
    2. New files AnalyzerAfterTraining.C and AnalyzerAfterTraining.h
    3. After running it, I obtained a new optimized variable to discriminate between background MC and signal:
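  • As an alternative to compiling the generated C class into the analyzer, the exported weights XML can be evaluated directly with TMVA::Reader. A minimal PyROOT sketch, with variable names following this entry (they are assumptions, as is the weights path):
      from array import array
      import ROOT
      from ROOT import TMVA

      reader = TMVA.Reader("!Color:!Silent")

      # variables must be registered in the same order as in the training;
      # the Reader always takes float buffers, even for integer counts
      buffers = {}
      for name in ["bdisc1", "bdisc2", "bdisc3", "bdisc4",
                   "Ht", "Stlep", "Stjet", "njets"]:
          buffers[name] = array('f', [0.0])
          reader.AddVariable(name, buffers[name])

      reader.BookMVA("BDT", "weights/myTMVAClassification_BDT.weights.xml")

      # per event: fill the buffers, then evaluate (values here are placeholders)
      for name, value in [("bdisc1", 0.91), ("bdisc2", 0.45), ("bdisc3", 0.30), ("bdisc4", 0.12),
                          ("Ht", 450.0), ("Stlep", 95.0), ("Stjet", 545.0), ("njets", 6)]:
          buffers[name][0] = value
      print("BDT response:", reader.EvaluateMVA("BDT"))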

2012-01-05 - TMVA and fixing the b discriminant.

  1. Looking at the results*.root files, something weird is going on with the b discriminant values. After discussing this with Francisco and taking a closer look at the code, I found I was storing an array of arrays of values instead of a simple array of b discriminants. In addition, in this version PU weights are included in the tree stored for the MVA analysis.
    1. StoreTreeVariable.h
    2. Analyzer.C
  2. After talking with Fan and Nhan about TMVA, it is clearer how the package works and how to run it.
    1. myTMVAClassification.C: this file includes PU weights for the background and runs only on ttbar and the gh = 1 TeV signal.
    2. Inside your working directory (where myTMVAClassification.C is located): root -l myTMVAClassification.C.
    3. Lots of files and plots are temporarily located in /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/TMVA/plots/ and /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/TMVA/weights/

2012-01-03 - BDT and Happy New Year smile

  1. Adding the variables $S_t^{lep} = MET + p^{\mu}_t$ and $S_t^{jet} = MET + p^{\mu}_t + H_t$ in the code.
    1. Stlep
    2. Stjet
  2. Boosted Decision Tree (BDT): to optimize our cuts, we are going to use a BDT with the b discriminants of the first, second, and third leading jets.
    1. First, I need to store the b discriminant of each jet in a tree, and then that tree must be added to the Ntuple. I did this because, in the former Ntuple, the b discriminants are not separated per jet; instead they are stored in a single vector (a tree-filling sketch is given at the end of this entry).
      1. Created a header file, called StoreTreeVariable.h, in /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/interface/, where the tree and its variables are defined.
      2. Modified Analyzer.C to include the header and store the b discriminants.
    2. Once I got the root file with the histograms and the tree with the b discriminants, I modified a script to run the MVA analysis.
      1. First raw test: TMVAClassification.C
      2. IMPORTANT To run TMVA on a cmslpc machine with the former code, one should download the TMVA package from here. It is not strictly necessary to install the package (as some pages on the internet suggest); to run properly, some files such as TMVAGui.C or tmvaglob.C must be in the working directory. One can copy those files from the downloaded package's test/ folder.
        • For a better setup: I created another folder in my working directory, called TMVA, and copied everything from the downloaded package's test/ folder into it. Now it works perfectly.
      3. It seems to run correctly. I was only checking the machinery.
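  • A minimal PyROOT sketch of what StoreTreeVariable.h / Analyzer.C do conceptually: one flat branch per quantity, with the per-jet b discriminants in a fixed-size array instead of a nested vector (all names and values are illustrative):
      from array import array
      import ROOT

      fout = ROOT.TFile("mvaInput.root", "RECREATE")
      tree = ROOT.TTree("MVAtree", "inputs for the TMVA training")

      nmax  = 4
      bdisc = array('f', nmax * [0.0])
      ht    = array('f', [0.0])
      stlep = array('f', [0.0])
      stjet = array('f', [0.0])
      njets = array('i', [0])

      tree.Branch("bdisc", bdisc, "bdisc[4]/F")   # one CSV value per leading jet
      tree.Branch("Ht",    ht,    "Ht/F")
      tree.Branch("Stlep", stlep, "Stlep/F")
      tree.Branch("Stjet", stjet, "Stjet/F")
      tree.Branch("njets", njets, "njets/I")

      # per selected event (placeholder kinematics)
      jet_pt, jet_csv = [120.0, 80.0, 55.0, 41.0], [0.91, 0.45, 0.30, 0.12]
      met, mu_pt = 35.0, 42.0

      for i in range(nmax):
          bdisc[i] = jet_csv[i]
      ht[0]    = sum(jet_pt)
      stlep[0] = met + mu_pt        # S_t^lep = MET + muon pt
      stjet[0] = stlep[0] + ht[0]   # S_t^jet = S_t^lep + Ht
      njets[0] = len(jet_pt)
      tree.Fill()

      fout.Write()
      fout.Close()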

2011-12-21 - B Discriminant

  1. To optimize our cuts, we tried to superimpose the normalized total b discriminant for each sample. We use the CSV discriminant for the first and second leading jets. Once these plots have been taken from the results_*.root files, we normalize them with Scale(1/h.Integral()) and then multiply them to combine them into a total b discriminant plot (see the sketch at the end of this entry). For this, I made a python script.
    1. bdiscriminator python macro. Very raw macro to plot this.
    2. b discriminator for leading jet
    3. b discriminator for 2nd jet
    4. total b discriminator
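  • A minimal PyROOT sketch of the normalize-and-multiply step described above (the file and histogram names are illustrative, not the ones in the results files):
      import ROOT

      f  = ROOT.TFile.Open("results_ttbar.root")          # any results_*.root file
      h1 = f.Get("jet0_csv_ttbar").Clone("h1")             # hypothetical histogram names
      h2 = f.Get("jet1_csv_ttbar").Clone("h2")

      for h in (h1, h2):
          if h.Integral() > 0:
              h.Scale(1.0 / h.Integral())                  # normalize each shape to unit area

      total = h1.Clone("total_bdisc_ttbar")
      total.Multiply(h2)                                   # bin-by-bin product -> total b discriminant

      c = ROOT.TCanvas()
      total.Draw("hist")
      c.SaveAs("total_bdisc_ttbar.png")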

2011-12-19 - First b-tagging

  1. Add Combined Secondary Vertex Medium (CSVM) b-tagging in the code. This version includes:
    1. Primary vertex, Ht and Mt after cuts.
    2. New cuts: jet.pt > 40; jets[0].pt > 100, jets[1].pt > 60, in addition to the previous cuts Njets > 2 and Ht > 300 (a selection sketch is given at the end of this entry).
    3. Plots:
      • Primary vertex
      1. Primary vertex
      2. Primary vertex with logy
      • Ht, MET, Mt
      1. Ht
      2. MET
      3. Mt
      • Leptons
      1. Number of Muons
      2. DeltaR(muon, jet)
      3. Number of loose isolated electrons
      • Jets
      1. Number of jets
      2. Number of b-jets
      3. Number of jets with at least one b-jet
      4. Leading jet eta
      5. Leading jet pt
      6. Second Leading jet eta
      7. Second Leading jet pt
      8. 3rd jet pt
      9. 4th jet pt
      10. 5th jet pt
      11. 6th jet pt
      12. 7th jet pt
      13. DeltaR(jet, jet)
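  • A minimal sketch of the jet selection and CSVM counting described in this entry; the CSVM discriminator threshold is an assumption and should be taken from the official b-tagging recommendations:
      CSVM_WP = 0.679   # assumed CSV medium working-point value

      def select_event(jets):
          """jets: list of (pt, csv) sorted by decreasing pt. Returns (Ht, n_bjets) or None."""
          good = [(pt, csv) for pt, csv in jets if pt > 40.0]
          if len(good) <= 2:                                # previous cut: Njets > 2
              return None
          if good[0][0] <= 100.0 or good[1][0] <= 60.0:     # leading / second leading jet pt
              return None
          ht = sum(pt for pt, _ in good)
          if ht <= 300.0:                                   # previous cut: Ht > 300
              return None
          n_bjets = sum(1 for _, csv in good if csv > CSVM_WP)
          return ht, n_bjets

      print(select_event([(130.0, 0.91), (85.0, 0.45), (60.0, 0.70), (45.0, 0.12)]))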

2011-12-15 - First Cuts

  1. MCScaleFactors for 4tops samples:
    | Sample | Events (Ntuple) | Cross Section (pb) in MG5 | MCScaleFactor (xs/nevents) |
    | fourtop1000_v11 | 3977 | .98699E-05 | 2.481745033945185E-09 |
    | fourtop500_v5 | 3368 | .25442E-03 | 7.5540380047505938E-08 |
  2. It seems I was not filling some plots in the right place; some of the previous plots were filled before some important filters ran. That is why I changed some parts of the code.
    • Preliminary cuts: muon pt > 35, muon_iso < 0.2, jet pt > 35 GeV
    1. Analyzer.C
    2. Primary Vertex
      1. Primary vertices
      2. Primary vertices with Logy
    3. Leptons
      1. Number of muons
      2. Muon Relative Isolation
      3. deltaR(muon, closest jet)
      4. Number of loose isolated electrons
    4. Jets
      1. Number of jets > 0
      2. Number of jets > 2
      3. Leading jet eta
      4. Leading jet pt
      5. Second leading jet eta
      6. Second leading jet pt
      7. Third jet pt
      8. Fourth jet pt
      9. Fifth jet pt
      10. Sixth jet pt
      11. Seventh jet pt
      12. Minimum deltaR(jet,jet)
    5. MET, Ht, Mt
      1. Ht from Ntuple.
      2. MET
      3. W transverse mass
  3. The code above includes the first cut in the analysis. After looking at this plot, we conclude that Ht could be a good variable to take into account (see the sketch at the end of this entry).
    • Ht = \sum jet pt
    • First cut Ht > 300 GeV.
    1. Leptons
      1. Number of muons
      2. deltaR(muon, closest jet)
      3. Number of loose isolated electrons
    2. Jets
      1. Number of jets
      2. Leading jet eta
      3. Leading jet pt
      4. Second leading jet eta
      5. Second leading jet pt
      6. Third jet pt
      7. Fourth jet pt
      8. Fifth jet pt
      9. Sixth jet pt
      10. Seventh jet pt
      11. Minimum deltaR(jet,jet)
    3. MET, Ht, Mt
      1. Ht
      2. MET
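  • A minimal sketch of the Ht definition and the first cut, with thresholds copied from this entry (the example jet list is a placeholder):
      def ht(jet_pts, min_jet_pt=35.0):
          """Ht = scalar sum of jet pt for jets passing the preliminary pt cut."""
          return sum(pt for pt in jet_pts if pt > min_jet_pt)

      jets = [110.0, 95.0, 60.0, 42.0, 30.0]
      print(ht(jets), ht(jets) > 300.0)   # first cut: Ht > 300 GeV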

2011-12-12 - Superimposed Plots

  1. 4tops samples after runAnalysis fixed. The problem was that, since I ran FastSimulation, my NTuple does not record some values needed to calculate the correct PU weights. Adding this condition (if 4top sample, then PUweight = 1), the root file looks fine (a minimal sketch is given at the end of this entry).
  2. Superimposed plots for 4top samples and background.
    1. Complete 4Top_cuy.xml
    2. Number of Muons
    3. Muons:
    4. Number of jets
    5. Jets:
    6. Missing Transverse Energy:
      • MET
      • MET with Ylog
      • MET with Njets>=5
      • MET with Njets>=5 and Ylog
    7. Ht and Mt
      • Ht
      • Ht with Ylog
      • Mt
      • Mt with Ylog
      • Mt with Njets>=5
      • Mt with Njets>=5 and Ylog
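  • A minimal sketch of the FastSim workaround described in item 1 (the function, sample and histogram names are illustrative):
      def pileup_weight(sample, n_vertices, pu_weight_hist):
          """Return 1 for the FastSim 4top samples, which lack the information
          needed to compute the PU weight; otherwise look it up as usual."""
          if sample.startswith("4Top"):
              return 1.0
          return pu_weight_hist.GetBinContent(pu_weight_hist.FindBin(n_vertices))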

2011-12-09 - Plots

  1. In the meantime, I started to create superimposed plots for my background.
    1. In /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/
    2. Create an xml file called 4top_cuy.xml.
    3. Results:
  2. In addition to the previous plots, I added new plots in Analyzer.C:
      • Number of isolated muons with loose isolation (<0.2) and Njets>=5
      • Isolation of leading muon (in pT) and Njets>=5
      • Isolation of second muon and Njets >=5
      • Number of isolated muons with one tight isolation <0.1 and rest with loose iso, and Njets>=5
      • Charge of muons with one tight isolation <0.1 and rest with loose iso, and Njets>=5
      • MET with Njets >=5
      • transverse mass (MT) with Njets>=5

2011-12-08 - Analyzer and Cuy.

  1. A few changes in Analyzer.C and runAnalysis.C, then execute them:
    1. In /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/
    2. Code modified:
    3. root -l loadLibraries.C -b
    4. .x runAnalysis.C("MC",1)
      • Several root files are produced, like results_ttbar.root or results_4Top_1000.root. But looking at results_4Top_1000.root, some plots are empty and it does not look right.
      • Messages during the process:
        algomez@lpcdt077:~/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test$ root -l loadLibraries.C -b
        
         Using style 'Plain'. For approved plots use: gROOT->SetStyle("CMS");
        
        root [0] 
        Processing loadLibraries.C...
        root [1] .x runAnalysis.C("MC",0)
         +++ Starting PROOF-Lite with 2 workers +++
        Opening connections to workers: OK (2 workers)                 
        Setting up worker servers: OK (2 workers)                 
        PROOF set to parallel mode (2 workers)
        13:49:09 31918 Wrk-0.0 | Info in <TProofServLite::Setup>: fWorkDir: /uscmst1b_scratch/lpc1/3DayLifetime/algomez/proof
        (int)0
        13:49:09 31920 Wrk-0.1 | Info in <TProofServLite::Setup>: fWorkDir: /uscmst1b_scratch/lpc1/3DayLifetime/algomez/proof
        (int)0
        (int)0
        (int)0
        13:49:10 31920 Wrk-0.1 | Error in <TFile::TFile>: file /query-result.root does not exist
        13:49:10 31920 Wrk-0.1 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
        13:49:10 31918 Wrk-0.0 | Error in <TFile::TFile>: file /query-result.root does not exist
        13:49:10 31918 Wrk-0.0 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
         
        Info in <TProofLite::SetQueryRunning>: starting query: 1
        Info in <TProofQueryResult::SetRunning>: nwrks: 2
        Info in <TUnixSystem::ACLiC>: creating shared library /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer_C.so
        In file included from /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.C:26,
                         from /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/Analyzer_C_ACLiC_dict.h:34,
                         from /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/Analyzer_C_ACLiC_dict.cxx:16:
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h: In constructor 'Analyzer::Analyzer(TTree*)':
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h:69: warning: 'Analyzer::fFile' will be initialized after
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h:62: warning:   'TFile* Analyzer::fweightfile'
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h:75: warning:   when initialized here
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h:70: warning: 'Analyzer::fProofFile' will be initialized after
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h:63: warning:   'TH1D* Analyzer::f3Dweight'
        /uscms/home/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/./Analyzer.h:75: warning:   when initialized here
        Info in <Analyzer::Begin>: Histogram names will have suffix: 4Top_500
        Info in <Analyzer::Begin>: This sample is treated as MC
        Info in <Analyzer::Begin>: we will use pileup set: WHist
        Info in <Analyzer::Begin>: starting with process option: sample=4Top_500
        Looking up for exact location of files: OK (1 files)                 
        Looking up for exact location of files: OK (1 files)                 
        Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 2
        Validating files: OK (1 files)                 
        Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
        [TProof::Progress] Total 3368 events    |====================| 100.00 % [12382.4 evts/s]
        Output file: results_4Top_500.root | (2 workers still sending)   
        Output file: results_4Top_500.root \ (1 workers still sending)   
        Warning in <TClass::TClass>: no dictionary for class pair<string,TH1*> is available
        Info in <Analyzer::Terminate>: Analyzer done.
        Lite-0: all output objects have been merged                                                         
         
        Info in <TProofLite::SetQueryRunning>: starting query: 2
        Info in <TProofQueryResult::SetRunning>: nwrks: 2
        Info in <ACLiC>: unmodified script has already been compiled and loaded
        Info in <Analyzer::Begin>: Histogram names will have suffix: ttbar
        Info in <Analyzer::Begin>: This sample is treated as MC
        Info in <Analyzer::Begin>: we will use pileup set: WHist
        Info in <Analyzer::Begin>: starting with process option: sample=ttbar
        Looking up for exact location of files: OK (1 files)                 
        Looking up for exact location of files: OK (1 files)                 
        Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 2
        Validating files: OK (1 files)                 
        Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
        Output file: results_ttbar.root... | (2 workers still sending)   .23 % [29189.7 evts/s]
        [TProof::Progress] Total 548611 events  |====================| 100.00 % [34499.5 evts/s]
        Output file: results_ttbar.root... \ (1 workers still sending)   
        Info in <Analyzer::Terminate>: Analyzer done.
        Lite-0: all output objects have been merged 
                 
  2. In parallel, I have tried some tests with cuy (an easy way to superimpose plots).
    1. In /uscms_data/d3/algomez/Analyzer/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/
    2. Modify muon_cuy.xml:
      <cuy>
      
        <validation type="Wjets" file="./results_WJets.root" weight="MCScaleFactors.txt:Wjets">
          <!-- muon -->
          <TH1 name="pt_WJets" source="/muons/muon_pt_WJets"/>
        </validation>
      
        <validation type="Zjets" file="./results_ZJets.root" weight="MCScaleFactors.txt:Zjets">
          <!-- muon -->
          <TH1 name="pt_ZJets" source="/muons/muon_pt_ZJets"/>
        </validation>
      
        <superimpose name="muon_pt" title="muon_pt" YTitle="Events" Fill="true" Weight="true" Lumi="3558.0" Stack="true" SubBanner="#mu+jets N_{jets}#geq0" PlotDiff="false">
          <superimposeItem name="pt_WJets" color="top" legend="Wjets"/>
          <superimposeItem name="pt_ZJets" color="top" legend="Zjets"/>
        </superimpose>
      </cuy>
               
    3. ../../cuy/scripts/cuy.py -b -x muon_cuy.xml -f "#splitline{CMS Preliminary}{2.18 fb^{-1} at #sqrt{s}=7TeV}" -o plots/ -p "png"
    4. Results: Simple muon pt with Wjets and Zjets.
    5. Change some information in mass_cuy.xml, other plots with more samples: MET, MET_fin, MET_2jet.
    • Files in B would be temporarily available.

2011-12-07 - PatTuples, Ntuples for both samples.

  1. As I only had a patTuple for the 1000 GeV GH sample, I created the patTuples for both samples (just to check that I did the right thing and to better understand what I have done).
    1. Erase previous folders and start again from scratch.
    2. In /uscms/home/algomez/work/
    3. Open a new file with info about the installed packages; such a file looks like this (according to V9 of https://twiki.cern.ch/twiki/bin/viewauth/CMS/TopLikeBSMSpring2011):
      cmsrel CMSSW_4_2_4
      cd CMSSW_4_2_4/src
      cmsenv
      addpkg RecoJets/Configuration   V02-04-17
      addpkg RecoVertex/PrimaryVertexProducer                 V01-04-10
      addpkg PhysicsTools/HepMCCandAlgos                      V11-03-17
      addpkg TopQuarkAnalysis/TopPairBSM                      tlbsm_42x_v9_006
      addpkg PhysicsTools/PatAlgos                            V08-06-29-00 
      scram b -j 4
               
    4. cd TopQuarkAnalysis/TopPairBSM/test
    5. cp ttbsm_cfg.py 4tbsm_cfg.py
    6. In 4tbsm_cfg.py change input files:
      if not options.useData :
          inputJetCorrLabel = ('AK5PFchs', ['L1FastJet', 'L2Relative', 'L3Absolute'])
      
          if options.use41x:
              process.source.fileNames = [
              'file:/uscms_data/d3/algomez/files/fourtop/fourtop1000_v11.root'
              ##'file:/uscms_data/d3/algomez/files/fourtop/fourtop500_v5.root'
                  ]
          else :
              process.source.fileNames = [
              'file:/uscms_data/d3/algomez/files/fourtop/fourtop1000_v11.root'
              ##'file:/uscms_data/d3/algomez/files/fourtop/fourtop500_v5.root'
                  ]
      #        process.source.eventsToProcess = cms.untracked.VEventRange('1:9375817')
          
      else :
          if options.use41x :
              inputJetCorrLabel = ('AK5PFchs', ['L1FastJet', 'L2Relative', 'L3Absolute', 'L2L3Residual'])
              process.source.fileNames = [
              'file:/uscms_data/d3/algomez/files/fourtop/fourtop1000_v11.root'
              ##'file:/uscms_data/d3/algomez/files/fourtop/fourtop500_v5.root'
                  ]
          else :
              inputJetCorrLabel = ('AK5PFchs', ['L1FastJet', 'L2Relative', 'L3Absolute', 'L2L3Residual'])
              process.source.fileNames = [
              'file:/uscms_data/d3/algomez/files/fourtop/fourtop1000_v11.root'
              ##'file:/uscms_data/d3/algomez/files/fourtop/fourtop500_v5.root'
                  ]
               
      Here is the complete file.
      • Trying to run as: cmsRun 4tbsm_cfg.py useData=0 &> 4tbsm_cfg.log, but this error appears:
        Begin processing the 1st record. Run 1, Event 1, LumiSection 1 at 07-Dec-2011 10:18:49.163 CST
        07-Dec-2011 10:18:49 CST  Closed file file:/uscms_data/d3/algomez/files/fourtop/fourtop1000_v11.root
        %MSG-s CMSException:  AfterFile 07-Dec-2011 10:18:49 CST PostEndRun
        cms::Exception caught in cmsRun
        ---- ProductNotFound BEGIN
         could not find HcalNoiseSummary.
        cms::Exception going through module HBHENoiseFilter/HBHENoiseFilter run: 1 lumi: 1 event: 1
        If you wish to continue processing events after a ProductNotFound exception,
        add "SkipEvent = cms.untracked.vstring('ProductNotFound')" to the "options" PSet in the configuration.
        ProcessingStopped
        Exception going through path p0
        EventProcessingStopped
        an exception occurred during current event processing
        cms::Exception caught in EventProcessor and rethrown
        ---- ProductNotFound END
        
        
        %MSG
                 
    7. Searching for a solution, I found that this python file creates a new config file, and only after commenting out the following lines:
      #process.HBHENoiseFilter = cms.EDFilter("HBHENoiseFilter",
      #    minRBXHits = cms.int32(999),
      #    minIsolatedNoiseSumE = cms.double(999999.0),
      #    minHighEHitTime = cms.double(-9999.0),
      #    minHPDNoOtherHits = cms.int32(10),
      #    useTS4TS5 = cms.bool(True),
      #    minZeros = cms.int32(10),
      #    minNumIsolatedNoiseChannels = cms.int32(999999),
      #    maxRatio = cms.double(999.0),
      #    maxHighEHitTime = cms.double(9999.0),
      #    maxRBXEMF = cms.double(-999.0),
      #    minHPDHits = cms.int32(17),
      #    minIsolatedNoiseSumEt = cms.double(999999.0),
      #    minRatio = cms.double(-999.0)
      #)
               
      and removing the HBHENoiseFilter process here:
              process.patseq = cms.Sequence(process.scrapingVeto+process.HBHENoiseFilter+process.goodOfflinePrimaryVertices+process.eidCiCSequence+process.primaryVertexFilter+process.patPF2PATSequencePFlow+process.looseLeptonSequence+process.patDefaultSequence+process.goodPatJetsPFlow+process.goodPatJetsCA8PF+process.goodPatJetsCA8PrunedPF+process.goodPatJetsCATopTagPF)

      does the file run correctly (an alternative using the SkipEvent option suggested by the exception message is sketched at the end of this entry).
    8. With this correction, I ran both samples and got 4tbsm_1000_42x_mc.root and 4tbsm_500_42x_mc.root in my area: /uscms_data/d3/algomez/files/fourtop/.
  2. Tuples from PAT:
    1. In /uscms_data/d3/algomez/work/CMSSW_4_2_4/src/Yumiceva/Top7TeV/test
    2. In the TuplesFromPAT.py file, comment out the former data/mc sources and add file:/uscms_data/d3/algomez/files/fourtop/4tbsm_1000_42x_mc.root and file:/uscms_data/d3/algomez/files/fourtop/4tbsm_500_42x_mc.root. Change eventtype to 4top and increase the number of events to 10k.
    3. cmsRun TuplesFromPAT.py useData=0 channel=muon events=10000 >& 4Top_1000.log &
    4. Results: 4top_1000_Tuple-PATskim.root and 4top_500_Tuple-PATskim.root located in /uscms_data/d3/algomez/files/fourtop/
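  • As an alternative to removing HBHENoiseFilter from the sequence, the ProductNotFound message above suggests letting cmsRun skip the offending events. A sketch of that (untested) option for 4tbsm_cfg.py:
      import FWCore.ParameterSet.Config as cms

      process = cms.Process("PAT")   # in 4tbsm_cfg.py the process object already exists

      # skip events that raise ProductNotFound (e.g. the missing HcalNoiseSummary),
      # as suggested by the exception message; extend process.options if it is already defined
      process.options = cms.untracked.PSet(
          SkipEvent = cms.untracked.vstring('ProductNotFound')
      )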

2011-11-22 - Slides and BR Pie Chart

  1. First presentation at LPC!! smile
    1. Search for Heavy Scalar Bosons decaying in Four Tops at LHC
  2. There was something missing in the Four Top BR. This plot shows the BR including taus for the 1 TeV sample.
    1. This pie chart shows the BR for the general cases.
    2. This pie chart shows more specific BRs, where lepton (in some specific places) means electron and/or muon.
    3. A complete BR table is here (a small cross-check script follows the table):
      | Main | Specific | BR (%) |
      | All Jets | All Jets | 20.88 |
      | Leptonic | | 0.04 |
      | | mu | 0.01 |
      | | e | 0.01 |
      | | tau | 0.01 |
      | Lepton + jets | | 40.04 |
      | | mu | 13.35 |
      | | e | 13.35 |
      | | tau | 13.35 |
      | | (only mu/e) | 26.69 |
      | Dilepton + jets | | 28.78 |
      | | mu mu | 3.20 |
      | | e e | 3.20 |
      | | tau tau | 3.20 |
      | | mu e | 6.40 |
      | | mu tau | 6.40 |
      | | e tau | 6.40 |
      | | (only mu and/or e) | 12.79 |
      | Trilepton + Jets | | 9.20 |
      | | mu mu mu | 0.34 |
      | | e e e | 0.34 |
      | | tau tau tau | 0.34 |
      | | mu mu e | 1.02 |
      | | mu mu tau | 1.02 |
      | | mu e e | 1.02 |
      | | e e tau | 1.02 |
      | | mu tau tau | 1.02 |
      | | e tau tau | 1.02 |
      | | mu e tau | 2.04 |
      | | (only mu and/or e) | 2.73 |
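  • A small cross-check of the table, assuming four independent W decays with branching fractions of roughly 10.8% per lepton flavour and 67.6% to hadrons (the values that reproduce the numbers above):
      from itertools import product
      from collections import Counter

      br = {"e": 0.108, "mu": 0.108, "tau": 0.108, "jets": 0.676}

      totals = Counter()
      for decays in product(br, repeat=4):                    # the four W's from the four tops
          p = 1.0
          for d in decays:
              p *= br[d]
          totals[sum(1 for d in decays if d != "jets")] += p

      for nlep in sorted(totals):
          print("%d lepton(s) + jets: %5.2f %%" % (nlep, 100.0 * totals[nlep]))
      # expected: about 20.9, 40.0, 28.8, 9.2 and 1.1 % for 0-4 leptons respectively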

2011-10-25 - Cross Sections and Feynman Diagrams

  1. Important Information (reported by MG5):
    | Sample | Events | Cross Section (pb) |
    | fourtop1000_v11 | 10000 | .98699E-05 |
    | fourtop500_v5 | 10000 | .25442E-03 |
  2. A set of Feynman diagrams for my FourTop and background processes has been made.
    1. Create the diagrams with JaxoDraw on my laptop.
    2. After exporting a tex file for each diagram:
      1. latex file.tex
      2. dvips file.dvi
      3. ps2pdf file.ps
      4. Results: pdf file and tex file

2011-10-20 - Simple First Analyzer

  1. In order to familiarize myself with the Analyzer tool, I ran the Analyzer code given by Francisco for his W' -> tb analysis.
    1. Based on Francisco's Analyzer (V8): (Inside /uscms_data/d3/algomez/work/)
      1. cmsrel CMSSW_4_2_4
      2. cd CMSSW_4_2_4/src
      3. cmsenv
      4. addpkg RecoJets/Configuration V02-04-17
      5. addpkg RecoVertex/PrimaryVertexProducer V01-04-10
      6. addpkg TopQuarkAnalysis/TopPairBSM tlbsm_42x_v8_005
      7. scram b -j 4
    2. Test that everything works:
      1. cd TopQuarkAnalysis/TopPairBSM/test
      2. cmsRun ttbsm_cfg.py useData=1
        • Everything seems fine. Several results_* root files created.
  2. First PATuple for 4top 1k sample:
    1. In /uscms_data/d3/algomez/work/CMSSW_4_2_4/src/TopQuarkAnalysis/TopPairBSM/test/
    2. cp  ttbsm_cfg.py 4tbsm_cfg.py
    3. In 4tbsm_cfg.py: comment inputs and add 'file:/uscms_data/d3/algomez/files/fourtop/fourtop1000_v11.root'.
    4. cmsRun 4tbsm_cfg.py >& 4tbsm_cfg.runlog
    5. As it seems fine, the job was sent to condor.
      • Result: 4tbsm_42x_mc.root file located in /uscms_data/d3/algomez/files/fourtop/
  3. First tuples from PAT:
    1. In /uscms_data/d3/algomez/work/CMSSW_4_2_4/src/Yumiceva/Top7TeV/test
    2. In the TuplesFromPAT.py file, remove the previous data/mc source and add file:/uscms_data/d3/algomez/files/fourtop/4tbsm_42x_mc.root. Change eventtype to 4top.
    3. cmsRun TuplesFromPAT.py useData=0 channel=muon events=10000 >& TuplesFromPAT.log &
    4. Results: 4top-Tuple-PATskim.root located in /uscms_data/d3/algomez/files/fourtop/
  4. First nice histograms:
    1. In /uscms_data/d3/algomez/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test
    2. in runAnalysis.C comment all samples and add:
        if (NoGUI) p->SetBit(TProof::kUsingSessionGui);
        if (sample=="MC"||sample=="4top"||sample=="all")
          {
            TDSet *mc_4top_1000 = new TDSet("top","*","/PATNtupleMaker");
            mc_4top_1000->Add("/uscms/home/algomez/nobackup/files/fourtop/4top-Tuple-PATskim.root");
            mc_4top_1000->Process("Analyzer.C+","sample=4top_1000");
          }
               
    3. root -l loadLibraries.C
    4. .x runAnalysis.C("MC",1)
    5. Results: results_4top_1000.root file located in /uscms_data/d3/algomez/files/fourtop/

2011-10-16 - Writing a Project Approval for my Home University

  1. Following the paperwork, this is my project approval:
    1. Project Approval in Spanish.

2011-10-10 - Writing a Project Description for my Home University

  1. In order to make this research official, I have to do some paperwork for my home university (EPN).
    1. Project Description in Spanish.

2011-10-09 - New MG5 sample for 1TeV gh

  1. Generate a new fourtop sample in MG5 with gh decaying only to ttbar. Sent it to condor for 10k events.
    1. fourtop1000_v11_proc_card_mg5.dat

2011-10-08 - Kinematics plots problem

  1. After a better look at my code, I realized that the former plots of pt, eta and deltaR were wrong. In my previous code, if I asked for 500 events with only one lepton in the final state, it did not analyse those events; instead, it analysed the first 500 events in the lhe file (a sketch of the fix is given at the end of this entry).
  2. New version of my code with this error fixed: fourtopAnalyzer.py
  3. New Plots for gh of 500 GeV:
    1. gh kinematics:
      1. eta of gh
      2. mass of gh
      3. pt of gh
      4. number of gh in each event
    2. top kinematics:
      1. eta of top
      2. mass of top
      3. pt of top
      4. number of top in each event
    3. BR and Acceptance Efficiency:
      1. Branching Ratio
      2. Acceptance Efficiency
    4. After detector cuts:
      1. All jets
        1. deltaR of all jets
        2. pt of all jets
        3. eta of all jets
      2. Lepton+jets:
        1. deltaR for lepton+jets
        2. eta of leptons
        3. pt of leptons
        4. eta of partons
        5. pt of partons
      3. Dilepton+jets:
        1. deltaR for dilepton+jets
        2. eta of leptons
        3. pt of leptons
        4. eta of partons
        5. pt of partons
      4. 3 lepton+jets:
        1. deltaR for 3 lepton+jets
        2. eta of leptons
        3. pt of leptons
        4. eta of partons
        5. pt of partons
      5. 4 lepton+jets:
        1. deltaR for 4 lepton+jets
        2. eta of leptons
        3. pt of leptons
        4. eta of partons
        5. pt of partons
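  • A minimal sketch of the fix described in item 1: stop after the first N events that pass the single-lepton requirement, instead of after the first N events read from the lhe file (count_leptons is a hypothetical helper):
      def first_n_single_lepton(events, count_leptons, n=500):
          """events: any iterable of parsed LHE events. Returns the first n events
          with exactly one charged lepton in the final state."""
          selected = []
          for ev in events:
              if count_leptons(ev) == 1:   # only these events should be counted and analysed
                  selected.append(ev)
                  if len(selected) == n:
                      break
          return selected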

2011-10-06 - New Generation Samples in MG5

  1. Generate a new fourtop sample in MG5 with gh decaying only to ttbar. Sent it to condor for 10k events.
    1. fourtop500_v5_proc_card_mg5.dat
  2. Modified gh id number in lhe file: fourtop500_v5.lhe
    1. fourtop500_v5.runlog with Event Listing
  3. Got correct BR:
    1. fourtop500_v5_BR.png

2011-10-04 - BR for new samples:

  1. After modifying my fourtopAnalyzer, I realized that the BR does not seem to be correct. I made some changes in order to get the right BR, but I only got this plot:
    1. fourtop1000_v10_BR.png Using the same condition as ttbar_v15_BR.png.

2011-10-03 - Verification plots

  1. Modified fourtopAnalyzer.py for Pythia8.
    • Pythia8 Particle Status:
      | code range | explanation |
      | 11 - 19 | beam particles |
      | 21 - 29 | particles of the hardest subprocess |
      | 31 - 39 | particles of subsequent subprocesses in multiple interactions |
      | 41 - 49 | particles produced by initial-state-showers |
      | 51 - 59 | particles produced by final-state-showers |
      | 61 - 69 | particles produced by beam-remnant treatment |
      | 71 - 79 | partons in preparation of hadronization process |
      | 81 - 89 | primary hadrons produced by hadronization process |
      | 91 - 99 | particles produced in decay process, or by Bose-Einstein effects |
    • After a good look at some event listings, I realized for my process that: gh particles have status 21; ttbar, W's and gluons have status 22; and b's from ttbar and partons or leptons from W's have status 23. So here, requesting p.mother() == 24 (i.e., W's) for leptons/partons is the same as requesting p.status() == 23.
  2. GH properties:
    1. 500 GeV:
      1. fourtop500_v4_gh_pt.png
      2. fourtop500_v4_gh_nparticles.png
      3. fourtop500_v4_gh_mass.png
      4. fourtop500_v4_gh_eta.png
    2. 1 TeV:
      1. fourtop1000_v10_gh_eta.png
      2. fourtop1000_v10_gh_mass.png
      3. fourtop1000_v10_gh_nparticles.png
      4. fourtop1000_v10_gh_pt.png
  3. 4top properties:
    1. gh of 500 GeV:
      1. fourtop500_v4_top_pt.png
      2. fourtop500_v4_top_nparticles.png
      3. fourtop500_v4_top_mass.png
      4. fourtop500_v4_top_eta.png
    2. gh of 1 TeV:
      1. fourtop1000_v10_top_eta.png
      2. fourtop1000_v10_top_mass.png
      3. fourtop1000_v10_top_nparticles.png
      4. fourtop1000_v10_top_pt.png

2011-10-02 - Verification plots and first jobs with Ntuples

  1. Modified ttbarAnalyzer.py for Pythia8.
  2. Plots:
    1. ttbar:
      1. ttbar_v15_BR.png
      2. ttbar_v15_accepEfficiency.png
      3. ttbar_v15_deltaR.png
      4. ttbar_v15_top_eta.png
      5. ttbar_v15_top_mass.png
      6. ttbar_v15_top_nparticles.png
      7. ttbar_v15_top_pt.png
      8. ttbar_v15_acceptance.png
      9. ttbar_v15_alljets_eta.png
      10. ttbar_v15_alljets_pt.png
      11. ttbar_v15_leptonjets_lepton_pt.png
      12. ttbar_v15_leptonjets_lepton_eta.png
      13. ttbar_v15_leptonjets_parton_pt.png
      14. ttbar_v15_leptonjets_parton_eta.png
      15. ttbar_v15_leptonjets_deltaR.png
      16. ttbar_v15_dilepton_lepton_pt.png
      17. ttbar_v15_dilepton_lepton_eta.png
      18. ttbar_v15_dilepton_parton_pt.png
      19. ttbar_v15_dilepton_parton_eta.png
      20. ttbar_v15_dilepton_deltaR.png
  3. First steps with Analyzer and NTuples:
    1. Copy NTuple Prescription from Francisco's W'tb Analysis from: https://twiki.cern.ch/twiki/bin/view/CMS/ExoticaWptb
      1. In /uscms_data/d3/algomez/work/
      2. cmsrel CMSSW_4_2_4
      3. cd CMSSW_4_2_4/src
      4. cmsenv
      5. addpkg RecoJets/Configuration   V02-04-17
      6. addpkg RecoVertex/PrimaryVertexProducer                 V01-04-10
      7. addpkg TopQuarkAnalysis/TopPairBSM                      tlbsm_42x_v8_003
      8. addpkg CondFormats/JetMETObjects
      9. cvs co -r V00-02-07 -d Yumiceva/Top7TeV UserCode/Yumiceva/Top7TeV
      10. cvs co -d Yumiceva/TreeAnalyzer UserCode/Yumiceva/TreeAnalyzer
      11. scramv1 b -j4
      12. mkdir 4Top
      13. cp Yumiceva/* 4Top
    2. Check if TreeAnalyzer runs fine:
      1. cd /UserCode/Yumiceva/TreeAnalyzer/test
      2. Change runAnalysis.C with my info.
        void runAnalysis(TString sample="all",bool NoGUI=false)
        {
          TString desdir = "/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/src/Yumiceva/TreeAnalyzer/test/";
          TProof *p = TProof::Open("lite://", desdir ,desdir);
        
          //p->AddDynamicPath("");
          p->Exec("gSystem->Load(\"/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/lib/slc5_amd64_gcc434/libYumicevaTop7TeV.so\")");
          p->Exec("gSystem->Load(\"/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/lib/slc5_amd64_gcc434/libCondFormatsJetMETObjects.so\")");
          p->AddIncludePath("/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/src/");
        ....
                 
      3. root -l loadLibraries.C
      4. .x runAnalysis.C("MC",1). runAnalysis takes options ("A",B): A can be "MC" or "data"; B can be 0 or 1 (1 = no GUI).
        • Log file:
           +++ Starting PROOF-Lite with 8 workers +++
          Opening connections to workers: OK (8 workers)                 
          Setting up worker servers: OK (8 workers)                 
          PROOF set to parallel mode (8 workers)
          22:08:45  3695 Wrk-0.3 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:45  3699 Wrk-0.5 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:44  3689 Wrk-0.0 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:45  3697 Wrk-0.4 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:45  3701 Wrk-0.6 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:44  3691 Wrk-0.1 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:44  3693 Wrk-0.2 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          22:08:45  3703 Wrk-0.7 | Info in <TProofServLite::Setup>: fWorkDir: /uscms/home/algomez/.proof
          (int)0
          (int)0
          (int)0
          (int)0
          (int)0
          (int)0
          (int)0
          (int)0
          (int)0
          22:08:47  3695 Wrk-0.3 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:47  3695 Wrk-0.3 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:47  3699 Wrk-0.5 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:47  3699 Wrk-0.5 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:48  3689 Wrk-0.0 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:48  3689 Wrk-0.0 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:48  3691 Wrk-0.1 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:48  3691 Wrk-0.1 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:48  3697 Wrk-0.4 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:48  3697 Wrk-0.4 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:48  3693 Wrk-0.2 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:48  3693 Wrk-0.2 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:48  3701 Wrk-0.6 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:48  3701 Wrk-0.6 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
          22:08:48  3703 Wrk-0.7 | Error in <TFile::TFile>: file /query-result.root does not exist
          22:08:48  3703 Wrk-0.7 | Info in <TProofServLite::HandleArchive>: file cannot be open (/query-result.root)
           
          Info in <TProofLite::SetQueryRunning>: starting query: 1
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <Analyzer::Begin>: Histogram names will have suffix: Wprime_800
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=Wprime_800
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_Wprime_800.root (7 workers still sending)   
          Output file: results_Wprime_800.root (5 workers still sending)   
          Warning in <TClass::TClass>: no dictionary for class pair<string,TH1*> is available
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 2
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: Wprime_1000
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=Wprime_1000
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_Wprime_1000.root(8 workers still sending)   
          Output file: results_Wprime_1000.root(7 workers still sending)   
          Output file: results_Wprime_1000.root(6 workers still sending)   
          Output file: results_Wprime_1000.root(5 workers still sending)   
          Output file: results_Wprime_1000.root(4 workers still sending)   
          Output file: results_Wprime_1000.root(3 workers still sending)   
          Output file: results_Wprime_1000.root(2 workers still sending)   
          Output file: results_Wprime_1000.root(1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 3
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: Wprime_1200
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=Wprime_1200
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_Wprime_1200.root(7 workers still sending)   
          Output file: results_Wprime_1200.root(6 workers still sending)   
          Output file: results_Wprime_1200.root(4 workers still sending)   
          Output file: results_Wprime_1200.root(3 workers still sending)   
          Output file: results_Wprime_1200.root(2 workers still sending)   
          Output file: results_Wprime_1200.root(1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 4
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: Wprime_1500
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=Wprime_1500
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_Wprime_1500.root(7 workers still sending)   
          Output file: results_Wprime_1500.root(6 workers still sending)   
          Output file: results_Wprime_1500.root(5 workers still sending)   
          Output file: results_Wprime_1500.root(4 workers still sending)   
          Output file: results_Wprime_1500.root(3 workers still sending)   
          Output file: results_Wprime_1500.root(2 workers still sending)   
          Output file: results_Wprime_1500.root(1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 5
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: Wprime_2000
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=Wprime_2000
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_Wprime_2000.root(8 workers still sending)   
          Output file: results_Wprime_2000.root(7 workers still sending)   
          Output file: results_Wprime_2000.root(5 workers still sending)   
          Output file: results_Wprime_2000.root(4 workers still sending)   
          Output file: results_Wprime_2000.root(3 workers still sending)   
          Output file: results_Wprime_2000.root(2 workers still sending)   
          Output file: results_Wprime_2000.root(1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 6
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: ttbar
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=ttbar
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_ttbar.root... | (8 workers still sending)   
          Output file: results_ttbar.root... \ (7 workers still sending)   
          Output file: results_ttbar.root... - (6 workers still sending)   
          Output file: results_ttbar.root... / (5 workers still sending)   
          Output file: results_ttbar.root... | (4 workers still sending)   
          Output file: results_ttbar.root... \ (3 workers still sending)   
          Output file: results_ttbar.root... - (2 workers still sending)   
          Output file: results_ttbar.root... / (1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 7
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: WJets
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=WJets
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_WJets.root... | (8 workers still sending)   
          Output file: results_WJets.root... \ (7 workers still sending)   
          Output file: results_WJets.root... - (6 workers still sending)   
          Output file: results_WJets.root... / (5 workers still sending)   
          Output file: results_WJets.root... | (4 workers still sending)   
          Output file: results_WJets.root... \ (3 workers still sending)   
          Output file: results_WJets.root... - (2 workers still sending)   
          Output file: results_WJets.root... / (1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 8
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: QCD
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=QCD
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_QCD.roots ... | (8 workers still sending)   
          Output file: results_QCD.roots ... \ (7 workers still sending)   
          Output file: results_QCD.roots ... - (6 workers still sending)   
          Output file: results_QCD.roots ... / (5 workers still sending)   
          Output file: results_QCD.roots ... | (4 workers still sending)   
          Output file: results_QCD.roots ... \ (3 workers still sending)   
          Output file: results_QCD.roots ... - (2 workers still sending)   
          Output file: results_QCD.roots ... / (1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 9
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: STsch
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=STsch
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_STsch.root... | (8 workers still sending)   
          Output file: results_STsch.root... \ (7 workers still sending)   
          Output file: results_STsch.root... - (6 workers still sending)   
          Output file: results_STsch.root... / (5 workers still sending)   
          Output file: results_STsch.root... | (4 workers still sending)   
          Output file: results_STsch.root... \ (3 workers still sending)   
          Output file: results_STsch.root... - (2 workers still sending)   
          Output file: results_STsch.root... / (1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 10
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: STtch
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=STtch
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_STtch.root... | (8 workers still sending)   
          Output file: results_STtch.root... \ (7 workers still sending)   
          Output file: results_STtch.root... - (6 workers still sending)   
          Output file: results_STtch.root... / (5 workers still sending)   
          Output file: results_STtch.root... | (4 workers still sending)   
          Output file: results_STtch.root... \ (3 workers still sending)   
          Output file: results_STtch.root... - (2 workers still sending)   
          Output file: results_STtch.root... / (1 workers still sending)   
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 11
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: STtWch
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=STtWch
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_STtWch.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 12
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: STsch_bar
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=STsch_bar
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_STsch_bar.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 13
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: STtch_bar
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=STtch_bar
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_STtch_bar.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 14
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: STtWch_bar
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=STtWch_bar
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_STtWch_bar.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 15
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: WW
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=WW
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_WW.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 16
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: WZ
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=WZ
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_WZ.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged                                                         
           
          Info in <TProofLite::SetQueryRunning>: starting query: 17
          Info in <TProofQueryResult::SetRunning>: nwrks: 8
          Info in <ACLiC>: unmodified script has already been compiled and loaded
          Info in <Analyzer::Begin>: Histogram names will have suffix: ZJets
          Info in <Analyzer::Begin>: This sample is treated as MC
          Info in <Analyzer::Begin>: starting with process option: sample=ZJets
          Looking up for exact location of files: OK (1 files)                 
          Looking up for exact location of files: OK (1 files)                 
          Info in <TPacketizerAdaptive::TPacketizerAdaptive>: Setting max number of workers per node to 8
          Validating files: OK (1 files)                 
          Info in <TPacketizerAdaptive::InitStats>: fraction of remote files 1.000000
          Output file: results_ZJets.root
          Info in <Analyzer::Terminate>: Analyzer done.
          Lite-0: all output objects have been merged 
                        
        1. Results: results_STsch_bar.root, results_WW.root, results_Wprime_800.root, results_STtWch.root, results_WZ.root, results_ZJets.root, results_STtWch_bar.root, results_Wprime_1000.root, results_ttbar.root, results_STtch.root, results_Wprime_1200.root, results_QCD.root, results_STtch_bar.root, results_Wprime_1500.root, results_STsch.root, results_WJets.root, results_Wprime_2000.root.
    3. Modified runAnalysis.C in my 4Top directory:
      1. cd 4Top/TreeAnalyzer/test/
      2. Changed runAnalysis.C (for testing purposes) to run only on the ttbar ntuple:
        void runAnalysis(TString sample="all",bool NoGUI=false)
        {
          // open a PROOF-Lite session in the analysis test directory
          TString desdir = "/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/src/4Top/TreeAnalyzer/test/";
          TProof *p = TProof::Open("lite://", desdir, desdir);

          // load the ntuple libraries on the workers and add the CMSSW include path for ACLiC
          //p->AddDynamicPath("");
          p->Exec("gSystem->Load(\"/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/lib/slc5_amd64_gcc434/libYumicevaTop7TeV.so\")");
          p->Exec("gSystem->Load(\"/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/lib/slc5_amd64_gcc434/libCondFormatsJetMETObjects.so\")");
          p->AddIncludePath("/uscms/home/algomez/nobackup/work/CMSSW_4_2_4/src/");

          p->Archive(" ",desdir);

          if (NoGUI) p->SetBit(TProof::kUsingSessionGui);

          // for testing, only the ttbar ntuple is processed with the Analyzer selector
          if (sample=="MC"||sample=="ttbar"||sample=="all")
            {
              TDSet *mc_ttbar = new TDSet("top","*","/PATNtupleMaker");
              mc_ttbar->Add("/uscms_data/d3/ttmuj/Documents/NtupleMaker/MC/v8_1/TTbar_Mu.root");
              mc_ttbar->Process("Analyzer.C+","sample=ttbar");
            }
        }
                 
      3. Check that the change works:
        1. root -l loadLibraries.C and then .x runAnalysis.C("MC",1); this produced only results_ttbar.root, as intended (a sketch for adding the remaining samples follows in the next item).
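        2. The fragment above only processes the ttbar ntuple; the other samples follow the same TDSet pattern inside runAnalysis.C. A minimal sketch for one more sample (the WJets block and its ntuple path are placeholders, not checked against the actual file locations):
          if (sample=="MC"||sample=="WJets"||sample=="all")
            {
              // same pattern as the ttbar block: chain the 'top' tree under /PATNtupleMaker
              TDSet *mc_wjets = new TDSet("top","*","/PATNtupleMaker");
              mc_wjets->Add("/uscms_data/d3/ttmuj/Documents/NtupleMaker/MC/v8_1/WJets_Mu.root"); // placeholder path
              // the sample option sets the histogram suffix; the output appears as results_WJets.root (cf. the log above)
              mc_wjets->Process("Analyzer.C+","sample=WJets");
            }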

2011-10-01 - GEN-SIM-RECO for new LHE files and Pythia8

  1. From previous work: two LHE files with 10k events each and no cuts (the modified fourtop500_v4.lhe and fourtop1000_v10.lhe, both in /uscms_data/d3/algomez/files/fourtop)
  2. Looked for a GenProduction fragment that works with Pythia8; found DYToEE_M_50To130_Tune4C_7TeV_pythia8_cff.py in Configuration/GenProduction/python/
    1. In /uscms_data/d3/algomez/pythia8153/CMSSW_4_2_5/src/
    2. cvs co Configuration/GenProduction/python/DYToEE_M_50To130_Tune4C_7TeV_pythia8_cff.py
    3. scramv1 b -j4
    4. cmsDriver.py Configuration/GenProduction/python/DYToEE_M_50To130_Tune4C_7TeV_pythia8_cff.py --filein=file:fourtop500_v4.lhe -s GEN,FASTSIM,HLT:GRun --pileup=FlatDist10_2011EarlyData_50ns --conditions START42_V11::All --beamspot Realistic7TeVCollision --eventcontent=RECOSIM --datatier GEN-SIM-DIGI-RECO -n 10000 --no_exec (same configuration as before, only with a different input file).
    5. Based on the previous pythia8ex3_cfg.py, modified the new file DYToEE_M_50To130_Tune4C_7TeV_pythia8_cff_py_GEN_FASTSIM_HLT_PU.py (only process.generator was changed). Checked with 10 events and it looks fine (a quick sanity-check macro is sketched at the end of this entry).
  3. Submit the jobs to condor:
    1. mkdir fourtop1000 and mkdir fourtop500 in /uscms_data/d3/algomez/pythia8153/CMSSW_4_2_5/src/
    2. cd fourtop500/
    3. mv ../DYToEE_M_50To130_Tune4C_7TeV_pythia8_cff_py_GEN_FASTSIM_HLT_PU.py fourtop500.py
    4. ln -s /uscms/home/algomez/nobackup/files/fourtop/fourtop500_v4.lhe
    5. Copied previous condor files, modified them and ran.
      1. fourtop500.condor
      2. fourtop500.sh
    6. Same process for fourtop1000.
    7. For checking purposes, ran the same process for ttbar_v12 and Pythia8.
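  4. A quick sanity check on the FastSim output before submitting the full condor jobs is to open the produced file in ROOT and count the entries in the Events tree. A minimal sketch (the file name below is a placeholder for whatever the configuration actually writes out):
      // checkEvents.C -- count events in a FastSim/EDM output file
      #include "TFile.h"
      #include "TTree.h"

      void checkEvents(const char *fname = "fourtop500_GEN_FASTSIM_HLT_PU.root")  // placeholder name
      {
        TFile *f = TFile::Open(fname);
        if (!f || f->IsZombie()) { printf("Could not open %s\n", fname); return; }
        // EDM output files store the event data in a TTree called "Events"
        TTree *events = (TTree*)f->Get("Events");
        printf("%s: %lld events\n", fname, events ? events->GetEntries() : 0LL);
        f->Close();
      }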

Previous Work:

-- AlejandroGomez - 02-Oct-2011
