Vanderbilt SUSY Analyses Page

Madgraph

VBF sBottom Signal Samples

ACCRE (Vanderbilt)

Paste the script below into a setup.sh file and execute it on ACCRE. It generates VBF sbottom gen-level events for the masses specified below. Two caveats about the script as written: the active 'sed' line reduces the requested events from 1M to 100 for a quick test (comment it out and uncomment the 'cp' line above it to produce the full 1M events), and runs 02 through 05 are disabled by the quoted here-document (the ": <<'END'" ... "END" pair); delete those two marker lines to run all mass points.
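For example, a variant of the 'sed' line that requests 10k events instead (a sketch, assuming the template run_card requests 1000000 events, as it does in the script below):

sed -e 's:1000000:10000:g' /home/delannas/Public/run_card.dat > $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/run_card.dat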

if   [[ `hostname` = *vmps* ]]; then
  source /cvmfs/cms.cern.ch/cmsset_default.sh
  source /cvmfs/cms.cern.ch/crab/crab.sh
  printf "\n\nLOADED CMS ENVIRONMENT FOR ACCRE \n\n"
elif [[ `hostname` = *fnal* ]]; then
  source /uscmst1/prod/sw/cms/shrc prod
  source /uscmst1/prod/grid/CRAB/crab.sh
  source /uscmst1/prod/grid/gLite_SL5.sh
  printf "\n\nLOADED CMS ENVIRONMENT FOR LPC/FNAL \n\n"
elif [[ `hostname` = *cern* ]]; then
  source /afs/cern.ch/cms/cmsset_default.sh
  printf "\n\nLOADED CMS ENVIRONMENT FOR LXPLUS/CERN \n\n"
fi

printf "\n\n"; #printf %*s $COLUMNS | tr ' ' '-'; printf "\n\n"
printf "%sCMSSW_3_8_7 \n"
printf "\n\n"; #printf %*s $COLUMNS | tr ' ' '-'; printf "\n\n"
   export SCRAM_ARCH=slc5_ia32_gcc434
   printf "%s  SCRAM_ARCH = $SCRAM_ARCH \n\n"
   scramv1 project CMSSW CMSSW_3_8_7
   cd CMSSW_3_8_7/src
   cmsenv

printf "\n\n DOWNLOAD AND UNCOMPRESS MADGRAPH 2.0.0 \n\n"
   wget https://launchpad.net/mg5amcnlo/2.0/2.0.0/+download/MG5_aMC_v2.0.2.tar.gz
   tar -xvf MG5_aMC_v2.0.2.tar.gz
   cd $CMSSW_BASE/src/MG5_aMC_v2_0_2/

printf "\n\n DOWNLOAD AND COMPILE MADANALYSIS & PYTHIA-PGS \n\n"
   echo ' set auto_update 0'                                           >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/setup.mg5
   # echo ' install MadAnalysis '                                        >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/setup.mg5
   # echo ' install pythia-pgs '                                         >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/setup.mg5
   echo ' exit '                                                       >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/setup.mg5
   ./bin/mg5_aMC bin/setup.mg5 | tee $CMSSW_BASE/src/MG5_aMC_v2_0_2/mg5-compile.log

printf "\n\n SET PARTICLE DEFINITIONS AND GENERATE SBOTTOM PROCESSES \n\n"
   echo ' import model mssm '                                          >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' define p = d u s c d~ u~ s~ c~ g '                           >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' define j = p '                                               >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' define l+ = e+ mu+ ta+ '                                     >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' define l- = e- mu- ta- '                                     >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' define vl = ve vm vt '                                       >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' define vl~ = ve~ vm~ vt~ '                                   >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' generate p p > b1 b1~ j j QCD=0, b1 > b n1, b1~ > b~ n1 '    >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' output VBF_sbottom '                                         >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   echo ' exit '                                                       >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/VBF_sbottom.mg5
   ./bin/mg5_aMC bin/VBF_sbottom.mg5

# Full 1M-event production: install the run card unchanged.
# \cp -vpf /home/delannas/Public/run_card.dat                            $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/run_card.dat
# Quick test: the 'sed' below reduces the requested events from 1M to 100.
sed -e 's:1000000:100:g' /home/delannas/Public/run_card.dat          > $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/run_card.dat

echo ' launch VBF_sbottom '                                         >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/launch_VBF_sbottom.mg5
echo ' exit '                                                       >> $CMSSW_BASE/src/MG5_aMC_v2_0_2/bin/launch_VBF_sbottom.mg5

printf "\n\n run_01 => mass(sbottom)=15 GeV, mass(LSP)=10 GeV \n"
printf "Choose \"1 / parton  :  Madevent\" when Madgraph prompts for \"Which programs do you want to run?\" \n"
printf "Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n"
\cp -vpf /home/delannas/Public/sbottom1.dat                            $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/param_card.dat
./bin/mg5_aMC bin/launch_VBF_sbottom.mg5

: <<'END'
printf "\n\n run_02 => mass(sbottom)=50 GeV, mass(LSP)=45 GeV \n"
printf "Choose \"1 / parton  :  Madevent\" when Madgraph prompts for \"Which programs do you want to run?\" \n"
printf "Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n"
\cp -vpf /home/delannas/Public/sbottom_50_N1_45.dat                    $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/param_card.dat
./bin/mg5_aMC bin/launch_VBF_sbottom.mg5

printf "\n\n run_03 => mass(sbottom)=100 GeV, mass(LSP)=95 GeV \n"
printf "Choose \"1 / parton  :  Madevent\" when Madgraph prompts for \"Which programs do you want to run?\" \n"
printf "Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n"
\cp -vpf /home/delannas/Public/sbottom_100_N1_95.dat                   $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/param_card.dat
./bin/mg5_aMC bin/launch_VBF_sbottom.mg5

printf "\n\n run_04 => mass(sbottom)=150 GeV, mass(LSP)=145 GeV \n"
printf "Choose \"1 / parton  :  Madevent\" when Madgraph prompts for \"Which programs do you want to run?\" \n"
printf "Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n"
\cp -vpf /home/delannas/Public/sbottom_150_N1_145.dat                  $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/param_card.dat
./bin/mg5_aMC bin/launch_VBF_sbottom.mg5

printf "\n\n run_05 => mass(sbottom)=200 GeV, mass(LSP)=195 GeV \n"
printf "Choose \"1 / parton  :  Madevent\" when Madgraph prompts for \"Which programs do you want to run?\" \n"
printf "Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n"
\cp -vpf /home/delannas/Public/sbottom_200_N1_195.dat                  $CMSSW_BASE/src/MG5_aMC_v2_0_2/VBF_sbottom/Cards/param_card.dat
./bin/mg5_aMC bin/launch_VBF_sbottom.mg5

END

LPC

Paste the script below into a setup.csh file (note that it is written in csh, not sh) and execute it on LPC. This will submit 1M VBF sbottom gen-level events for the masses specified below. If you wish to change the number of events generated, uncomment the 'sed' line (its output overwrites the card installed by the 'cp' line) and set the replacement value to the desired number of events.

setenv SCRAM_ARCH slc5_ia32_gcc434
echo '\n\n SCRAM_ARCH = ' $SCRAM_ARCH

scramv1 project CMSSW CMSSW_3_8_7
echo '\n\n CMSSW_3_8_7'
cd CMSSW_3_8_7/src
cmsenv

echo '\n\n COPY AND UNCOMPRESS MADGRAPH 1.5.13 \n\n'
cp -vp /uscms/home/delannoy/public/MadGraph5_v1.5.13.tar.gz .
tar -xvf MadGraph5_v1.5.13.tar.gz
cd $CMSSW_BASE/src/MadGraph5_v1_5_13/

echo '\n\n DOWNLOAD AND COMPILE MADANALYSIS & PYTHIA-PGS \n\n'
#echo 'Type "n" to bypass an update to a new version when prompted by MADGRAPH  \n\n\n\n'
echo ' set auto_update 0'                                           >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/setup.mg5
echo ' install MadAnalysis '                                        >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/setup.mg5
echo ' install pythia-pgs '                                         >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/setup.mg5
echo ' exit '                                                       >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/setup.mg5
./bin/mg5 bin/setup.mg5 | tee $CMSSW_BASE/src/MadGraph5_v1_5_13/mg5-compile.log

echo '\n\n SET PARTICLE DEFINITIONS AND GENERATE SBOTTOM PROCESSES \n\n'
echo ' import model mssm '                                          >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' define p = d u s c d~ u~ s~ c~ g '                           >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' define j = p '                                               >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' define l+ = e+ mu+ ta+ '                                     >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' define l- = e- mu- ta- '                                     >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' define vl = ve vm vt '                                       >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' define vl~ = ve~ vm~ vt~ '                                   >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' generate p p > b1 b1~ j j QCD=0, b1 > b n1, b1~ > b~ n1 '    >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' output VBF_sbottom '                                         >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
echo ' exit '                                                       >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/VBF_sbottom.mg5
./bin/mg5 bin/VBF_sbottom.mg5

# Full 1M-event production: install the run card unchanged.
cp -vp /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/run_card.dat                     $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/run_card.dat
# Quick test: uncomment to reduce the requested events from 1M to 100.
# sed -e 's:1000000:100:g' /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/run_card.dat > $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/run_card.dat

echo ' launch VBF_sbottom ' >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/launch_VBF_sbottom.mg5
echo ' exit '               >> $CMSSW_BASE/src/MadGraph5_v1_5_13/bin/launch_VBF_sbottom.mg5

echo '\n\n run_01 => mass(sbottom)=15 GeV, mass(LSP)=10 GeV \n'
echo 'Choose "1 / parton  :  Madevent" when Madgraph prompts for "Which programs do you want to run?" \n'
echo 'Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n'
cp -vp /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/sbottom1.dat                     $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/param_card.dat
./bin/mg5 bin/launch_VBF_sbottom.mg5

echo '\n\n run_02 => mass(sbottom)=50 GeV, mass(LSP)=45 GeV \n'
echo 'Choose "1 / parton  :  Madevent" when Madgraph prompts for "Which programs do you want to run?" \n'
echo 'Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n'
cp -vp /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/sbottom_50_N1_45.dat             $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/param_card.dat
./bin/mg5 bin/launch_VBF_sbottom.mg5

echo '\n\n run_03 => mass(sbottom)=100 GeV, mass(LSP)=95 GeV \n'
echo 'Choose "1 / parton  :  Madevent" when Madgraph prompts for "Which programs do you want to run?" \n'
echo 'Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n'
cp -vp /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/sbottom_100_N1_95.dat            $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/param_card.dat
./bin/mg5 bin/launch_VBF_sbottom.mg5

echo '\n\n run_04 => mass(sbottom)=150 GeV, mass(LSP)=145 GeV \n'
echo 'Choose "1 / parton  :  Madevent" when Madgraph prompts for "Which programs do you want to run?" \n'
echo 'Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n'
cp -vp /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/sbottom_150_N1_145.dat           $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/param_card.dat
./bin/mg5 bin/launch_VBF_sbottom.mg5

echo '\n\n run_05 => mass(sbottom)=200 GeV, mass(LSP)=195 GeV \n'
echo 'Choose "1 / parton  :  Madevent" when Madgraph prompts for "Which programs do you want to run?" \n'
echo 'Press RETURN when Madgraph prompts the user to edit the card files \n\n\n\n'
cp -vp /uscms_data/d2/freddy06/CMSSW_3_8_7/src/MadGraph5_v1_5_13/sbottom_200_N1_195.dat           $CMSSW_BASE/src/MadGraph5_v1_5_13/VBF_sbottom/Cards/param_card.dat
./bin/mg5 bin/launch_VBF_sbottom.mg5


Repository

Please collect scripts and cards here:

https://redmine.accre.vanderbilt.edu/projects/vbfsusypheno12/repository

To quick-install Madgraph, execute (even if you use csh)

curl https://raw.github.com/PerilousApricot/vandy-recipes/master/installers/getMadgraph.sh | bash

MadAnalysis will be used to carry out the actual analysis (apply cuts), while pythia and pgs will be used for hadronization and detector simulation respectively. Now we are ready to generate events! You first need to specify the appropriate physics model. In this example, we are going to consider the minimal supersymmetric extension of the standard model (mssm). Next, make appropriate definitions to make the generation of events simpler.

For example, with the definitions below, the incoming partons from the protons that will contribute to the Feynman diagrams will be "d u s c d~ u~ s~ c~ g". The b quark has been omitted because of its negligible parton distribution function compared to the other quarks and the gluon. Also, anytime "l-" or "l+" is specified in the subsequent commands, all combinations of electrons, muons, and taus will be considered. Once the model and the definitions are specified, we can generate the process of interest. In this example, we will generate proton-proton collisions resulting in dark matter pair production through vector boson fusion (in this model/scenario, the lightest neutralino is the dark matter candidate), which results in two neutralinos and two jets.

By adding QCD=0, we are limiting the Feynman diagrams to pure electroweak production (since there can be QCD processes that also give two WIMPs and two jets). At this point, all the information needed to generate events and calculate a cross-section (essentially all the possible Feynman diagrams) has been defined and stored in memory. Now we want to create a subdirectory which contains all this information so that we can always go back and generate these types of events without having to go through the above process again.

[delannoy@cmslpc36 MadGraph5_v1_5_12]$ vi bin/VBF_WimpWimp.mg5
   import model mssm
   define p = d u s c d~ u~ s~ c~ g
   define j = p
   define l+ = e+ mu+ ta+
   define l- = e- mu- ta-
   define vl = ve vm vt
   define vl~ = ve~ vm~ vt~
   generate p p > n1 n1 j j $$ w+ w- z a h1 h2 h3 / j ul ur dl dr cl cr sl sr t1 t2 b1 b2 go QCD=0
   output VBF_WimpWimp
[delannoy@cmslpc36 MadGraph5_v1_5_12]$ ./bin/mg5 bin/VBF_WimpWimp.mg5

[delannoy@cmslpc36 MadGraph5_v1_5_12]$ vi cards.csh
   cd VBF_WimpWimp/Cards/
   cp /uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_000.dat ./
   cp /uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_025.dat ./
   cp /uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_050.dat ./
   cp /uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_075.dat ./

Once you exit the Madgraph interface, you should see a directory called "VBF_WimpWimp". cd into this directory ... The next step is to modify the default input parameter files which contain the values of the center of mass energy, masses, branching fractions, etc. The susy masses and branching fractions are specified in the file "Cards/param_card.dat". Some theoretical expertise is required to modify the mass mixing matrices, masses, etc. ... however, some things are simpler to implement/modify. For example, to force charginos to decay to a stau and a neutrino, first find the following line in the param_card.dat file:

DECAY 1000024 XXXX #  wch1

This line specifies the width (XXXX) for the susy particle with pdgId 1000024 (the chargino). Add the following line immediately below it:

#          BR         NDA      ID1       ID2
     1.00000000E+00    2    -1000015        16   # BR(~chi_1+ -> ~tau_1+  nu_tau)

This line tells MadGraph that the chargino decays to "2" particles with pdgIds of "-1000015" (stau) and "16" (tau neutrino) 100 percent of the time. Similarly, if you then want the stau to decay to a tau and the lightest neutralino, add the following line:

#          BR         NDA      ID1       ID2
     1.00000000E+00    2     1000022        15   # BR(~tau_1 -> ~chi_10  tau-)

right below the following line:

DECAY 1000015 XXX #  wsl3

NOTE: in some cases the decay branching ratios are predefined, while in other cases the user has to include the branching ratios by hand, as described above. In this example, we don't have to worry about branching ratios because the dark matter candidate is stable. Before generating events and calculating cross-sections, we need to choose an appropriate input param_card file (the default param_card usually needs to be replaced, unless we are generating standard model processes). For the purpose of this example, let's use the following param_card.dat files (stored in the dcache area at LPC):

/uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_000.dat
/uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_025.dat
/uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_050.dat
/uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/vbfDM_075.dat

Use your chosen file to replace the default card in

VBF_WimpWimp/Cards/param_card.dat

The final step before generating events and calculating a cross-section is to tell Madgraph the center of mass energy we want to consider, how many events we want to generate, and whether we want to apply any cuts. The file "Cards/run_card.dat" contains all this information. For the purpose of this example, let's use the following run_card.dat file:

/uscms_data/d2/freddy06/CMSSW_3_8_6/src/MadGraph/MadGraph5_v1_5_5/VBF_WimpWimp/Cards/run_card.dat

replace the default file in

VBF_WimpWimp/Cards/run_card.dat

with the file above. There are three important parts (for the purpose of this example) to this file:

1)    4000     = ebeam1  ! beam 1 total energy in GeV
       4000     = ebeam2  ! beam 2 total energy in GeV
2)   1000 = nevents ! Number of unweighted events requested
3)    4.2 = deltaeta ! minimum rapidity for two jets in the WBF case

Line 1) specifies the beam energy (in GeV) for each proton. Line 2) specifies the number of events to be generated. For the purpose of a cross-section calculation, 1k or 10k should be sufficient. Line 3) is an important requirement in order to select VBF processes --> "deltaeta" specifies the minimum difference in pseudorapidity between the two highest pt jets. In this case, it is set to 4.2. Once the run_card.dat and param_card.dat files in "VBF_WimpWimp/Cards" have been replaced with the appropriate ones, then we are ready to generate events and calculate a cross-section. From the "MadGraph5_v1_5_5" directory, start the MadGraph interface:

./bin/mg5

Launch the generation of events for this process (VBF dark matter pair production):

mg5> launch VBF_WimpWimp

At this point, Madgraph will ask you what programs you want to run:

Which programs do you want to run?
  0 / auto    : running existing card
  1 / parton  :  Madevent
  2 / pythia  : MadEvent + Pythia.
  3 / pgs     : MadEvent + Pythia + PGS.
 [0, 1, 2, 3, auto, parton, pythia, pgs][60s to answer]

For the purpose of a cross-section calculation, parton-level events are sufficient (option "1"), so in this example choose "1". MadGraph will then ask whether the user wants to edit one of the card files:

  1 / param   : param_card.dat (be carefull about parameter consistency, especially widths)
  2 / run     : run_card.dat
  9 / plot    : plot_card.dat
 you can also
   - enter the path to a valid card or banner.
   - use the 'set' command to modify a parameter directly.
     The set option works only for param_card and run_card.
     Type 'help set' for more information on this command.
 [0, done, 1, param, 2, run, 9, plot, enter path][60s to answer]

Since the correct card files are already being used, just press "enter". At this point, the generation begins! The amount of time it takes to finish running depends on the type of process (how many Feynman diagrams it has to consider) and whether the user is running on a single machine, a cluster, or multicore. At the end of the generation process, the following output should appear:

  === Results Summary for run: run_01 tag: tag_1 ===

     Cross-section :   XXX +- YYY pb
     Nb of events :  1000

Creating Plots for parton level
End Plots for parton level
store_events
Storing parton level results
End Parton
quit

Exit the MadGraph interface:

mg5> exit

The output cross-section from MadGraph is "XXX" pb. The relevant output has been stored in the "VBF_WimpWimp/Events/run_X" directory, where the "X" in "run_X" is a number which represents how many times you have run the VBF_WimpWimp code. If this is the first run, then the directory will be called "run_01". Within "VBF_WimpWimp/Events/run_01" there will be a txt file (e.g. "run_01_tag_1_banner.txt") which contains all the information about that particular run, including beam energies, number of events, cross-section, etc. One can always go back to this file to obtain any important information. For example, at the end of this txt file, you can find the following line:

#  Integrated weight (pb)  :       ....
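A quick way to pull the cross-section out of the banner file (a sketch; the exact file name depends on the run and tag):

grep "Integrated weight" VBF_WimpWimp/Events/run_01/run_01_tag_1_banner.txt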

Running MG on the cluster

Fortunately, it's very easy to run MG on the cluster at ACCRE. In your configuration directory, change Cards/me5_configuration.txt to look like this (in progress):

# Allow/Forbid the automatic opening of the web browser (on the status page)
# when launching MadEvent [True/False]

automatic_html_opening = False

# Default Running mode
# 0: single machine / 1: cluster / 2: multicore

run_mode = 2

# Cluster Type [pbs|sge|condor|lsf|ge] Use for cluster run only
# And cluster queue

cluster_type = pbs
cluster_queue = all

Then, once that's done, run the following command:

VBF_N1N1/bin/generate_events --cluster

Boom boom, MG will run on the cluster.

Dark Matter Scenarios

There are three main dark matter scenarios:

1. pure wino dark matter (the above example)
2. bino dark matter (neutralino1 is mostly "Z-like" and contains smaller contributions from Wino or Higgsino)
3. higgsino dark matter (neutralino1 is mostly "Higgs-like" and contains smaller contributions from Bino or Wino)

An important question/study is to determine how large are the VBF dark matter production cross-sections for these three scenarios. Furthermore, we need to determine the cross-sections at both 8 TeV and 14 TeV. Below is a summary of the required studies and the lead on each study:

1. 80% bino dark matter scenario at 14 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Paul)
2. 80% bino dark matter scenario at 8 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Paul)
3. 80% higgsino dark matter scenario (same as 20% Bino case) at 14 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Andrew)
4. 80% higgsino dark matter scenario (same as 20% Bino case) at 8 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Andrew)
5. 60% bino dark matter scenario at 14 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Eduardo)
6. 60% bino dark matter scenario at 8 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Eduardo)
7. 40% bino dark matter scenario at 14 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Eduardo)
8. 40% bino dark matter scenario at 8 TeV. Calculate cross-section for neutralino1 mass between 110 and 310 GeV. (Eduardo)
9. wino dark matter scenario at 14 TeV. Calculate cross-section for neutralino1 mass between 164 and 350 GeV. (Andres)
10. wino dark matter scenario at 8 TeV. Calculate cross-section for neutralino1 mass between 164 and 350 GeV. (Alfredo)

In order to carry out each of these studies, one can simply follow the same instructions from the above sections, EXCEPT:

1. change the param_card.dat file in "VBF_N1N1/Cards" to one that has the correct neutralino1 mass and correct Bino/Wino/Higgsino percentage.
2. modify the run_card.dat file in "VBF_N1N1/Cards" in order to specify the correct center of mass energy.

param_card files have already been created for "1" above. One just needs to replace "VBF_N1N1/Cards/param_card.dat" with the appropriate file. The files are located in:

/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/ParamCards/VBF_*

For example, the file called "VBF_N1=150_BinoPercent=80" represents the 80% bino dark matter scenario, where the dark matter mass is 150 GeV. You can copy it from dcache and replace param_card.dat:

[username@cmslpc25 VBF_N1N1]$ /opt/d-cache/dcap/bin/dccp /pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/ParamCards/VBF_N1=150_BinoPercent=80 Cards/
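Note that dccp preserves the original file name, so the copied card still needs to replace the actual param_card.dat (a follow-up step, assuming the copy above succeeded):

mv Cards/VBF_N1=150_BinoPercent=80 Cards/param_card.dat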

In order to change the center of mass energy between 8 TeV and 14 TeV, the following lines need to be changed in the run_card.dat file (4000 GeV per beam gives 8 TeV; 7000 GeV per beam, shown below, gives 14 TeV):

1)    7000     = ebeam1  ! beam 1 total energy in GeV
       7000     = ebeam2  ! beam 2 total energy in GeV
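For example, a hedged sed one-liner to switch an 8 TeV card (4000 GeV per beam) to 14 TeV (7000 GeV per beam); it assumes the ebeam lines follow the format shown in the earlier run_card excerpt:

sed -i 's/4000\([[:space:]]*= ebeam\)/7000\1/' VBF_N1N1/Cards/run_card.dat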

MadAnalysis

Install MadAnalysis at ACCRE

   if   [[ `hostname` = *vmps* ]]; then
       source /cvmfs/cms.cern.ch/cmsset_default.sh
       source /cvmfs/cms.cern.ch/crab/crab.sh
       tty -s && printf "\n%s\n\n" "LOADED CMS ENVIRONMENT FOR ACCRE"
   elif [[ `hostname` = *fnal* ]]; then
       source /uscmst1/prod/sw/cms/shrc prod
       source /uscmst1/prod/grid/CRAB/crab.sh
       source /uscmst1/prod/grid/gLite_SL5.sh
       tty -s && printf "\n%s\n\n" "LOADED CMS ENVIRONMENT FOR LPC/FNAL"
   elif [[ `hostname` = *cern* ]]; then
       source /afs/cern.ch/cms/cmsset_default.sh
       # source /afs/cern.ch/cms/ccs/wm/scripts/Crab/crab.sh
       tty -s && printf "\n%s\n\n" "LOADED CMS ENVIRONMENT FOR LXPLUS/CERN"
   elif [[ `hostname` = *W530* ]]; then
       export DISPLAY=:0.0
       X :0 -multiwindow -hostintitle #>& /dev/null &
       tty -s && printf "\n%s\n\n" "LOADED X ENVIRONMENT FOR CYGWIN"
   fi

# change to CMSSW_5_3_3 ??
printf "\n\n%*s\n" $COLUMNS | tr ' ' '-'; printf "%s\n" "CMSSW_5_3_5"; printf "%*s\n\n" $COLUMNS | tr ' ' '-'
   export SCRAM_ARCH=slc5_amd64_gcc462
   printf "%s  SCRAM_ARCH = $SCRAM_ARCH \n\n"
   scramv1 project CMSSW CMSSW_5_3_5
   cd CMSSW_5_3_5/src
   cmsenv

# change to CERN patch of MGv1.5
printf "\n\n%*s\n" $COLUMNS | tr ' ' '-'; printf "%s\n" "DOWNLOAD AND UNCOMPRESS MADGRAPH 2.0.0"; printf "%*s\n\n" $COLUMNS | tr ' ' '-'
   wget --no-check-certificate https://launchpad.net/mg5amcnlo/2.0/2.0.0/+download/MG5_aMC_v2.0.2.tar.gz
   tar -xvf MG5_aMC_v2.0.2.tar.gz
   cd $CMSSW_BASE/src/MG5_aMC_v2_0_2/

printf "\n\n%*s\n" $COLUMNS | tr ' ' '-'; printf "%s\n" "DOWNLOAD AND INSTALL MADANALYSIS"; printf "%*s\n\n" $COLUMNS | tr ' ' '-'
   wget --no-check-certificate https://launchpad.net/madanalysis5/trunk/v1.1.10/+download/MadAnalysis5_v1.1.10_patch2.tgz
   tar -xvf MadAnalysis5_v1.1.10_patch2.tgz

setpkgs -a python_2.7.1

# Verify that numpy is importable with this python. (Avoid writing the test
# to a file named numpy.py: running "python numpy.py" with "import numpy"
# inside would import the script itself rather than the real numpy package.)
python -c 'import numpy'

# export LD_LIBRARY_PATH=$ROOTSYS/lib:$PYTHONDIR/lib:$LD_LIBRARY_PATH
export   PYTHONDIR=/usr/local/python/2.7.1/opteron/sysgcc/nonet/include/python2.7/
export   LD_LIBRARY_PATH=$ROOTSYS/lib:$PYTHONDIR:$LD_LIBRARY_PATH
export   PYTHONPATH=$ROOTSYS/lib:$PYTHONPATH

cd madanalysis5/
wget --no-check-certificate https://adelannoy.com/CMS/RA2TAU/VBF/vbf_DM-vbf_sbottom/MadAnalysis/vbf_sbottom_015_N1_010-vbfDM_000.ma5 -O $CMSSW_BASE/src/MG5_aMC_v2_0_2/madanalysis5/vbf_sbottom_015_N1_010-vbfDM_000.ma5
wget --no-check-certificate https://adelannoy.com/CMS/RA2TAU/VBF/vbf_DM-vbf_sbottom/MadAnalysis/vbf_sbottom_100_N1_095-vbfDM_050.ma5 -O $CMSSW_BASE/src/MG5_aMC_v2_0_2/madanalysis5/vbf_sbottom_100_N1_095-vbfDM_050.ma5
wget --no-check-certificate https://adelannoy.com/CMS/RA2TAU/VBF/vbf_DM-vbf_sbottom/MadAnalysis/vbf_sbottom_200_N1_195-vbfDM_100.ma5 -O $CMSSW_BASE/src/MG5_aMC_v2_0_2/madanalysis5/vbf_sbottom_200_N1_195-vbfDM_100.ma5

./bin/ma5 vbf_sbottom_015_N1_010-vbfDM_000.ma5
./bin/ma5 vbf_sbottom_100_N1_095-vbfDM_050.ma5
./bin/ma5 vbf_sbottom_200_N1_195-vbfDM_100.ma5


Making Plots

The "MadAnalysis" package is used to analyze events in LHE format (e.g. from running Madgraph) or LHCO format. When a madgraph run is finished, it produces a lhe file containing the information for all events generated. This lhe file can be used as an input to MadAnalysis in order to make plots and apply cuts for phenomenological studies. The MadAnalysis package requires

1. python
2. ROOT
3. g++
4. gfortran
5. pdflatex, latex
6. dvipdf

If you want to run MadAnalysis on your personal computer, make sure to install these packages before installing MadAnalysis. If you are running MadAnalysis at LPC using CMSSW_3_8_6 as discussed in the Madgraph sections above, all the necessary packages should be included with CMSSW. To download MadAnalysis:

wget --no-check-certificate https://madanalysis.irmp.ucl.ac.be/raw-attachment/wiki/WikiStart/ma5_v1.1.5.tgz

untar the downloaded file

tar -xvzf ma5_v1.1.5.tgz

This will create several directories needed to run MadAnalysis (e.g. "bin", "madanalysis", etc). To run MadAnalysis for the first time, use the following command:

./bin/ma5

At first, MadAnalysis will check that all mandatory packages are installed (python, ROOT, and g++). If the mandatory packages are not installed, MadAnalysis will exit and the user will have to install the necessary packages. If all goes well, then it will check for other optional packages such as pdflatex, fastjet, etc. Finally, it will look for the MadAnalysis library ... if the library is found, then the user is ready to analyze events. Exit the MadAnalysis interface:

ma5> exit

The best way to get started with MadAnalysis is to start with a script that works, understand the commands within the script, and modify the script to suit the user's needs. Let's first start with a script that creates VBF related plots. The script can be downloaded from here:

The first 4 lines of the script are:

# set directory where running "./bin/ma5"; set lumi; define the signal significance
set main.currentdir = /home/gurrola/Desktop
set main.lumi = 100
set main.SBratio = 'S/sqrt(S+B)'

The first line is a comment summarizing the intent of the next three lines. At the beginning of every MadAnalysis script, the user needs to specify the path/directory where they will run MadAnalysis:

set main.currentdir = /home/gurrola/Desktop

This is the same path where the ma5*.tar file was downloaded. The user should modify this line accordingly. The third line specifies the luminosity (in units of inverse femtobarn) that will be used to calculate the expected signal and background rates. The fourth line

set main.SBratio = 'S/sqrt(S+B)'

specifies the definition of "signal significance." The user can replace "S/sqrt(S+B)" with any analytical formula, expressed as a valid Python expression, which indicates how the signal over background ratio must be calculated. When implementing this formula, the symbols related to the signal and background number of events are S and B, respectively. If SBratio is not specified in the script, the signal over background ratio is computed according to S/B.
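For example, to use the simpler S/sqrt(B) figure of merit instead (any valid Python expression in S and B works):

set main.SBratio = 'S/sqrt(B)'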

The next set of commands within the script are the following:

# import samples --> change the path to the LHE file
import samples/signal.lhe as sample_1
import samples/bkg.lhe as sample_2

These commands tell MadAnalysis the location of the lhe files the user would like to analyse (in the directory "samples") and the user-specified name of each dataset (e.g. signal.lhe is given the name "sample_1"). Once these lhe files (datasets) are imported, MadAnalysis is ready to analyse them. These samples do not come by default with MadAnalysis. The samples need to be produced by running MadGraph.

Create the "samples" directory in your current directory (where ma5*tar was downloaded)

mkdir samples

and copy the *lhe.gz files produced by MadGraph into the "samples" directory. Then unzip these files:

gunzip signal.lhe.gz
gunzip bkg.lhe.gz

and rename them using the same naming convention as in the script (or alternatively change the names in the script accordingly).
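A minimal sketch of these steps, assuming the LHE files come from the run_01 directories of two MadGraph processes (the background process name here is a placeholder):

cp VBF_WimpWimp/Events/run_01/unweighted_events.lhe.gz samples/signal.lhe.gz
cp <background_process>/Events/run_01/unweighted_events.lhe.gz samples/bkg.lhe.gz
gunzip samples/signal.lhe.gz samples/bkg.lhe.gz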

The next set of commands

# define bg and signal samples
set sample_1.type = signal
set sample_2.type = background

defines the signal and background datasets. In this case, the signal.lhe file has been defined as "signal", while the bkg.lhe file has been defined as "background."

The next set of commands

# define weights for the samples
set sample_1.weight = 1
set sample_2.weight = 1

# titles for the plots
set sample_1.title = "Signal"
set sample_2.title = "Background"

defines a weight for each sample and the strings used in histogram legends for the dataset under consideration. The possible title choices are either auto (the name of the dataset) or any string in valid TeX form.

The next set of commands

# line styles and colors
set sample_1.linecolor = blue
set sample_1.linestyle = dash-dotted
set sample_1.linewidth = 4
set sample_2.linecolor = black
set sample_2.linestyle = dash
set sample_2.linewidth = 3

defines the colors and styles for the plots. For example, the histograms for the dataset with name "Signal" will have a linecolor of "blue" ... the line will be "dash-dotted" style ... and the line width will be "4". The full set of plotting options can be found in Table 5 of the MadAnalysis user manual.

The remaining set of commands are related to the actual generation of the plot. The line

define jets = j

defines a jet as either a light quark or gluon jet ("j"). The lines

# define which plots to make
plot PT(jets[1])
plot ETA(jets[1])
plot PHI(jets[1])
plot PT(jets[2])
plot ETA(jets[2])
plot PHI(jets[2])
plot DELTAR(jets[1], jets[2])
plot M(jets[1] jets[2])
plot MET
plot sdETA(j[1] j[2])
plot PT(lept[1])
plot ETA(lept[1])
plot PT(lept[2])
plot ETA(lept[2])
plot PT(lept[3])
plot ETA(lept[3])

tells MadAnalysis which quantities the user wants to plot. In this case, "plot PT(jets[1])" tells MadAnalysis to plot the transverse momentum of the leading jet, which is specified as jets[1]. If the user wanted to plot the transverse momentum of all jets, then the command would be "plot PT(jets)". Similarly, "plot ETA(jets[1])" and "plot ETA(jets[2])" tell MadAnalysis to plot the pseudorapidity of the leading and subleading jets. Please note, at this point the user has not specified how to define "leading" and "subleading" jets. By default, MadAnalysis will order objects by their pt, but the user has the flexibility to choose the ordering algorithm (more below). The full list of kinematical observables that can be represented by histograms is given in Table 8 of the MadAnalysis manual. In the case of the relative distance between two particles (denoted by DELTAR), two particle labels are required:

plot DELTAR(<label1>, <label2>)

Key observables in an analysis are in general related to more than one particle. All the observables except DELTAR, which requires exactly two arguments, can take an arbitrary number of arguments. This allows us to combine particles before computing the kinematical distribution to be represented. Hence, the command line

plot M(jets[1] jets[2])

leads to the creation of a histogram showing the distribution of the observable "M" (invariant mass). The observable is computed from the combined four-vector built from the sum of the four-momenta of the particles jets[1] and jets[2]. Before proceeding to the next set of commands in the script, it is first important to understand how MadAnalysis handles the command "plot."

The effect of the command plot is to create a new instance of the class "selection", which has been designed to handle histograms (and cuts, which will be discussed later). The labeling of the selection objects is handled internally by MadAnalysis. The first time the command "plot" is issued in a session of MadAnalysis 5, the label "selection[1]" is assigned to the corresponding histogram. The second occurrence of the command "plot" leads to the creation of the object selection[2], and so on. Therefore, the effect of the above lines of code is to create "selection[1]", "selection[2]", "selection[3]", ..., "selection[16]" (one for PT(jets[1]), one for ETA(jets[1]), and so on). Objects of the selection class have several attributes, which are shown in Table 6 of the MadAnalysis manual. In this example script, we focus on selection objects related to histograms. The case of cuts will be discussed later. The number of bins, the value of the lowest bin of a histogram and that of its highest bin are stored in the attributes "nbins", "xmin" and "xmax", respectively. Their values are fixed at the time the command plot is issued in the interpreter. They can however be further modified with the command set, as for any other attribute of an object, by typing in the command interface, e.g.,

set selection[<i>].xmax = 100

This command allows us to set the value of the highest bin of the histogram associated with the object selection[i] to 100. For the full list of attributes, please see Table 6 of the MadAnalysis manual. For the purpose of this example, we merely describe the attributes related to the commands used in the script. For example, the following lines of code from the script

# plot parameters
set selection[1].xmax = 1000
set selection[1].xmin = 0
set selection[1].nbins = 50
set selection[1].logY = true
set selection[1].logX = false
set selection[1].rank = PTordering
set selection[1].stacking_method = normalize2one
set selection[1].titleX = "p_{T}[j_{1}] (GeV)"

refer to attributes related to the first plot (pt of the leading jet). The following lines

set selection[1].xmax = 1000
set selection[1].xmin = 0
set selection[1].nbins = 50

set the x-axis histogram range to 0 --> 1000 with 50 bins. The following lines

set selection[1].logY = true
set selection[1].logX = false

set the x-axis to linear scale and the y-axis to log scale. The following line

set selection[1].rank = PTordering

tells MadAnalysis that the objects used for plot #1 (in this case, jets) will be ordered by pt. Therefore, jets[1] will refer to the jet with the highest transverse momentum. Other ordering options are

ETAordering
ETordering
Eordering
Pordering
PXordering
PYordering
PZordering

The following line

set selection[1].stacking_method = normalize2one

allows the user to change the stacking method employed. The use of "normalize2one" means that the normalization of each curve is set to one and all the histograms for signal and background datasets are drawn superimposed. The other allowed choices are stack and superimpose. In the first case, the curves are all stacked, whilst in the second case, they are superimposed. Finally, the line

set selection[1].titleX = "p_{T}[j_{1}] (GeV)"

allows the user to change the title of the x-axis. The remaining set of commands in the script are the following:

set selection[2].xmax = 8
set selection[2].xmin = -8
set selection[2].nbins = 160
set selection[2].logY = false
set selection[2].logX = false
set selection[2].rank = PTordering
set selection[2].stacking_method = normalize2one
set selection[2].titleX = "#eta[j_{1}]"
set selection[3].xmax = 3.2
set selection[3].xmin = -3.2
set selection[3].nbins = 64
set selection[3].logY = false
set selection[3].logX = false
set selection[3].rank = PTordering
set selection[3].stacking_method = normalize2one
set selection[3].titleX = "#phi[j_{1}]"
set selection[4].xmax = 500
set selection[4].xmin = 0
set selection[4].nbins = 100
set selection[4].logY = true
set selection[4].logX = false
set selection[4].rank = PTordering
set selection[4].stacking_method = normalize2one
set selection[4].titleX = "p_{T}[j_{2}] (GeV)"
set selection[5].xmax = 8
set selection[5].xmin = -8
set selection[5].nbins = 160
set selection[5].logY = false
set selection[5].logX = false
set selection[5].rank = PTordering
set selection[5].stacking_method = normalize2one
set selection[5].titleX = "#eta[j_{2}]"
set selection[6].xmax = 3.2
set selection[6].xmin = -3.2
set selection[6].nbins = 64
set selection[6].logY = false
set selection[6].logX = false
set selection[6].rank = PTordering
set selection[6].stacking_method = normalize2one
set selection[6].titleX = "#phi[j_{2}]"
set selection[7].xmax = 15
set selection[7].xmin = 0
set selection[7].nbins = 75
set selection[7].logY = false
set selection[7].logX = false
set selection[7].rank = PTordering
set selection[7].stacking_method = normalize2one
set selection[7].titleX = "#Delta#eta[j_{1},j_{2}]"
set selection[8].xmax = 8000
set selection[8].xmin = 0
set selection[8].nbins = 160
set selection[8].logY = false
set selection[8].logX = false
set selection[8].rank = PTordering
set selection[8].stacking_method = normalize2one
set selection[8].titleX = "M[j_{1},j_{2}] (GeV)"
set selection[9].xmax = 1000
set selection[9].xmin = 0
set selection[9].nbins = 100
set selection[9].logY = true
set selection[9].logX = false
set selection[9].rank = PTordering
set selection[9].stacking_method = normalize2one
set selection[9].titleX = "#slash{E}_{T} (GeV)"
set selection[10].stacking_method = normalize2one
set selection[10].titleX = "#Delta#phi(j_{1},j_{2})"
set selection[11].xmax = 1000
set selection[11].xmin = 0
set selection[11].nbins = 200
set selection[11].logY = true
set selection[11].logX = false
set selection[11].rank = PTordering
set selection[11].stacking_method = normalize2one
set selection[11].titleX = "p_{T}[l_{1}] (GeV)"
set selection[12].xmax = 4
set selection[12].xmin = -4
set selection[12].nbins = 80
set selection[12].logY = false
set selection[12].logX = false
set selection[12].rank = PTordering
set selection[12].stacking_method = normalize2one
set selection[12].titleX = "#eta[l_{1}]"
set selection[13].xmax = 1000
set selection[13].xmin = 0
set selection[13].nbins = 200
set selection[13].logY = true
set selection[13].logX = false
set selection[13].rank = PTordering
set selection[13].stacking_method = normalize2one
set selection[13].titleX = "p_{T}[l_{2}] (GeV)"
set selection[14].xmax = 4
set selection[14].xmin = -4
set selection[14].nbins = 80
set selection[14].logY = false
set selection[14].logX = false
set selection[14].rank = PTordering
set selection[14].stacking_method = normalize2one
set selection[14].titleX = "#eta[l_{2}]"
set selection[15].xmax = 1000
set selection[15].xmin = 0
set selection[15].nbins = 200
set selection[15].logY = true
set selection[15].logX = false
set selection[15].rank = PTordering
set selection[15].stacking_method = normalize2one
set selection[15].titleX = "p_{T}[l_{3}] (GeV)"
set selection[16].xmax = 4
set selection[16].xmin = -4
set selection[16].nbins = 80
set selection[16].logY = false
set selection[16].logX = false
set selection[16].rank = PTordering
set selection[16].stacking_method = normalize2one
set selection[16].titleX = "#eta[l_{3}]"

They are very similar to the commands used for "selection[1]" except that they refer to a different plot. For example, selection[2] refers to ETA(jets[1]), the pseudorapidity of the leading jet.
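As a small follow-up example, to draw the signal and background curves stacked with their absolute normalizations instead of each normalized to one, one would write (for any given plot):

set selection[1].stacking_method = stack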

To run the script, type the following command:

./bin/ma5 plots.ma5

The commands within the script will run sequentially. Once the selection is defined, it has to be executed through a job to be run by MadAnalysis. This is done through the command submit which takes as an argument the name of a directory which will be created,

ma5> submit <dirname>

The created directory contains a series of C++ source and header files that are necessary for MadAnalysis to properly run. The compilation, linking to the external static library of MadAnalysis, and the execution of the resulting code are handled by MadAnalysis. The screen output indicates the status of these different tasks and various information such as the detected format of the event samples or the number of processed events. In case anything does not go as smoothly as it should, MadAnalysis prints warning and/or error messages, and in the worst case the program exits. If all goes well, the user should see a message similar to the one below:

   Checking SampleAnalyzer output...
   Extracting data from the output files...
   Preparing an HTML report...
     ** Preparing data for the report...
     ** Saving all plots as image files...
     ** Computing cut efficiencies...
     ** Generating the HTML report...
   To open this HTML report, please type 'open'.
   Preparing a PDF report...
     ** Preparing data for the report...
     ** Saving all plots as image files...
     ** Computing cut efficiencies...
     ** Generating the PDF report...
   To open this PDF report, please type 'open <dirname>/PDF'.
   Preparing a DVI report...
     ** Preparing data for the report...
     ** Saving all plots as image files...
     ** Computing cut efficiencies...
     ** Generating the DVI report...
     ** Converting the DVI report to a PDF report.
   To open this PDF report, please type 'open <dirname>/DVI'.
   Well done ! Elapsed time = 45 seconds 

MadAnalysis has now created a directory with name "dirname" which contains all the plots and information. For example, if the user types:

ma5> open

An HTML webpage will automatically open displaying the requested plots. The user can exit the MadAnalysis interface with

ma5> exit

The PNGs for the plots are located in the directories "dirname/HTML" and "dirname/PDF". The user can always go back to the MadAnalysis interface and display the HTML or the PDF report as follows:

./bin/ma5
ma5> open <dirname>/HTML
ma5> open <dirname>/PDF

For more details on the use of MadAnalysis, please see the user's manual.

Applying Cuts

In addition to creating plots, MadAnalysis also allows the user to apply cuts, calculate selection efficiencies, and calculate expected signal and background rates. The best way to get started with applying cuts with MadAnalysis is to use a script that works, understand the commands within the script, and modify the script to suit the user's needs. Let's first start with the same example script used in the previous section:

plots.ma5

The first 179 lines of code are similar to the commands discussed in the section above. The important commands in terms of applying cuts begin on line 180. To apply cuts in MadAnalysis, the user will need to use the "select" or "reject" command. The use of "select" and "reject" is as follows:

select <condition>
reject <condition>

In the case of "select", the event or object will be kept if the "condition" is true, and the event of object will be "thrown away" if the "condition" is false. Therefore, lines 180-181 of the plots.ma5 script

select PT(jets[1]) > 30 and ETA(jets[1]) > -5 and ETA(jets[1]) < 5
select PT(jets[2]) > 30 and ETA(jets[2]) > -5 and ETA(jets[2]) < 5

will tell MadAnalysis to keep an event if the "leading jet has pt greater than 30 GeV, AND the leading jet has eta greater than -5, AND the leading jet has eta less than 5." It will also tell MadAnalysis to keep an event if the "subleading jet has pt greater than 30 GeV, AND the subleading jet has eta greater than -5, AND the subleading jet has eta less than 5." Note that, as was the case in the sections above, the definition of "leading" and "subleading" is by default based on pt, but the user has the flexibility to choose the ordering algorithm. As was the case with the use of the command "plot", the effects of the commands above are to create instances of the "selection" class with special properties. Therefore, the first line above ("select PT(jets[1]) > 30 ...") creates the object selection[1]. The second line creates the object selection[2]. Contrary to the command "plot", which is related to the creation of histograms, the commands "select" and "reject" lead to the production of tables of cut efficiencies. Consequently, the only attribute (for our purposes) of the selection class which is relevant is "rank". The ordering used to determine the definition of "leading" and "subleading" when applying cuts can be specified as

set selection[1].rank = PTordering
set selection[2].rank = PTordering

The next line of code in the script

select DELTAR(jets[1], jets[2]) > 4.2

selects events where the deltaR between the two leading jets is greater than 4.2. The next command

select M(jets[1] jets[2]) > 750

applies a dijet mass cut (using the two leading jets) of greater than 750 GeV; that is, "select M(jets[1] jets[2]) > 750" keeps events where the invariant mass of the two leading jets is greater than 750 GeV.
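As a hedged illustration of the companion "reject" command (this cut is not part of the original script), one could instead veto events with low missing transverse energy, reusing the MET observable plotted earlier:

reject MET < 100

This throws away any event whose missing transverse energy is below 100 GeV; a "select MET > 100" line would have the same effect.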

To run the plots.ma5 script, type the following command:

./bin/ma5 plots.ma5

The commands within the script will run sequentially. Once the selection is defined, it has to be executed through a job to be run by MadAnalysis. This is done through the command submit which takes as an argument the name of a directory which will be created,

ma5> submit <dirname>

The created directory contains a series of C++ source and header files that are necessary for MadAnalysis to properly run. The compilation, linking to the external static library of MadAnalysis, and the execution of the resulting code are handled by MadAnalysis. The screen output indicates the status of these different tasks and various information such as the detected format of the event samples or the number of processed events. In case anything does not go as smoothly as it should, MadAnalysis prints warning and/or error messages, and in the worst case the program exits. If all goes well, the user should see a message similar to the one below:

   Checking SampleAnalyzer output...
   Extracting data from the output files...
   Preparing an HTML report...
     ** Preparing data for the report...
     ** Saving all plots as image files...
     ** Computing cut efficiencies...
     ** Generating the HTML report...
   To open this HTML report, please type 'open'.
   Preparing a PDF report...
     ** Preparing data for the report...
     ** Saving all plots as image files...
     ** Computing cut efficiencies...
     ** Generating the PDF report...
   To open this PDF report, please type 'open <dirname>/PDF'.
   Preparing a DVI report...
     ** Preparing data for the report...
     ** Saving all plots as image files...
     ** Computing cut efficiencies...
     ** Generating the DVI report...
     ** Converting the DVI report to a PDF report.
   To open this PDF report, please type 'open <dirname>/DVI'.
   Well done ! Elapsed time = 45 seconds 

MadAnalysis has now created a directory with name "dirname" which contains all the plots, cut efficiencies, and other information. For example, if the user types:

ma5> open

An HTML webpage will automatically open displaying the requested plots. The user can exit the MadAnalysis interface with

ma5> exit

The PNGs for the plots are located in the directories "dirname/HTML" and "dirname/PDF". The user can always go back to the MadAnalysis interface and display the HTML or the PDF report as follows:

./bin/ma5
ma5> open <dirname>/HTML
ma5> open <dirname>/PDF

For more details on the use of MadAnalysis, please see the user's manual.

Example Script for the VBF Dark Matter Analysis

There is an example MadAnalysis script to carry out the VBF dark matter analysis in the dcache area at LPC:

/pnfs/cms/WAX/11/store/user/freddy06/gurrola/MadAnalysis/PlottingScripts/VBFscript_darkMatterAnalysis.ma5

The samples that are used as inputs are located in:

/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/TTBar_fl_0j/ttbar_fl_ptj30_etmiss50_8TeV.lhe.gz
/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/TTBar_sl_0j/ttbar_sl_ptj30_etmiss50_8TeV.lhe.gz
/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/WpTolnu_2j/WpTolnu_2j_ptj30_etmiss50_8TeV.lhe.gz
/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/ZToNuNu_2j/ZToNuNu_2j_ptj30_etmiss50_8TeV.lhe.gz
/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/VBF_N1N1/VBF_N1N1_mass50_ptj30_etmiss50_8TeV.lhe.gz
/pnfs/cms/WAX/11/store/user/freddy06/gurrola/Madgraph/VBF_N1N1/VBF_N1N1_mass100_ptj30_etmiss50_8TeV.lhe.gz

Create the "samples/VBFDarkMatterStudy" directory (since this is the directory name used in the script) in your current directory (where ma5*tar was downloaded)

mkdir samples
cd samples
mkdir VBFDarkMatterStudy

and copy the *lhe.gz files from dcache to the "samples/VBFDarkMatterStudy" directory. Then unzip these files and rename them using the same naming convention as in the script (or alternatively change the names in the script accordingly). To run the VBFscript_darkMatterAnalysis.ma5 script, type the following command:

./bin/ma5 VBFscript_darkMatterAnalysis.ma5

Example Script for Classic RA2Tau Analysis

There is an example MadAnalysis script to make plots for the "classic" ra2tau analysis in the dcache area at LPC:

/pnfs/cms/WAX/11/store/user/freddy06/gurrola/MadAnalysis/PlottingScripts/classicRA2plots.ma5

The samples that are used as inputs are located in:

/pnfs/cms/WAX/11/store/user/freddy06/gurrola/SMS_LHEfiles/T2gluinoneutralinoorchargino_1000_100_0.75.lhe

Installation for 5_2_3_patch3

If you are working on an FNAL machine, first prepare your environment:

source /uscmst1/prod/sw/cms/cshrc prod

In order to use the code, you need to create a CMSSW_5_2_3_patch3 working area:

setenv SCRAM_ARCH slc5_amd64_gcc462
scramv1 project CMSSW CMSSW_5_2_3_patch3
cd CMSSW_5_2_3_patch3/src
source /uscmst1/prod/grid/gLite_SL5.csh
cmsenv
source /uscmst1/prod/grid/CRAB/crab.csh

You will need to check out some specific packages. In order to do so, first log in to cvs as an anonymous user:

cmscvsroot CMSSW
cvs login

You will be prompted to enter a password. Use the following password: 98passwd. From cvs, check out the following packages:

cvs co -r V00-03-07-01   CommonTools/ParticleFlow
cvs co -r V04-05-08      JetMETCorrections/Type1MET
cvs co -r CMSSW_5_2_3_patch3 PhysicsTools/CandUtils
cvs up -r 1.3 PhysicsTools/CandUtils/src/EventShapeVariables.cc
cvs co -r V08-08-25-01   PhysicsTools/PatAlgos 
cvs co -r b5_2_X_cvMEtCorr_2012Apr10 PhysicsTools/PatUtils
cvs co -r CMSSW_5_2_3_patch3 RecoMET/METAlgorithms
cvs up -r 1.2 RecoMET/METAlgorithms/interface/SigInputObj.h
cvs co -r V15-01-05      RecoParticleFlow/PFProducer 
cvs co -r V06-07-09      TopQuarkAnalysis/TopObjectResolutions
cvs co -d SHarper/HEEPAnalyzer UserCode/SHarper/HEEPAnalyzer
cvs co -r V01-04-17 RecoTauTag/RecoTau
cvs co -r V01-04-03 RecoTauTag/Configuration
cvs co -r V00-04-01 CondFormats/EgammaObjects
cvs up -r 1.53 PhysicsTools/PatAlgos/python/tools/tauTools.py
cvs up -r 1.12 PhysicsTools/PatAlgos/python/producersLayer1/tauProducer_cff.py
cvs up -r 1.15 PhysicsTools/PatAlgos/python/recoLayer0/tauDiscriminators_cff.py
cvs co -d HighMassAnalysis/Skimming -r for52x_05072012 UserCode/AlfredoGurrola/HighMassAnalysis/Skimming
cvs co -d HighMassAnalysis/Configuration -r for52x UserCode/AlfredoGurrola/HighMassAnalysis/Configuration
cvs co -d HighMassAnalysis/Analysis -r for52x_06192012 UserCode/AlfredoGurrola/HighMassAnalysis/Analysis

Compile:

scram build -c
scram b -j8

Producing PatTuples

Running Interactively: Testing PatTuple Producers

An example configuration file for creating patTuples exists in:

HighMassAnalysis/Configuration/test/Data_TauTauSkim/hiMassTau_patProd.py 

The very first thing the user should check is that everything runs successfully interactively. To run a quick test, first make the following modifications to hiMassTau_patProd.py:

1) Change the following lines:

process.maxEvents = cms.untracked.PSet(
    input = cms.untracked.int32( 100 )
)

to the ones below:

process.maxEvents = cms.untracked.PSet(
    input = cms.untracked.int32( 500 )
)

Changing 100 to 500 means that you will run over ONLY 500 events (set -1 to run over all events). Next, open the file:

HighMassAnalysis/Configuration/python/hiMassSetup_cfi.py 

This file only exists in pre-52X versions; in versions >= 52X, the parameters previously defined in hiMassSetup_cfi.py are defined in the main configuration file, hiMassTau_patProd.py. This file defines the type of data sample that you want to create. If you want to run over a real collision data sample, then you MUST have the following variables defined:

signal = False
data = True

If you want to create patTuples beginning from a Monte Carlo RECO sample, then make the following definitions:

signal = False
data = False

NOTE: In the High Mass Tau group, we do NOT skim our signal samples. Therefore, if you do NOT want to skim a Monte Carlo sample, set:

signal = True

This will ensure that no skimming criteria are applied. Finally, one must define the type of skimming (a combined sketch of these settings follows the list below).

1) Muon + Tau (loose mutau pair)

channel = "mutau"

2) Electron + Tau (loose etau pair)

channel = "etau"

3) Tau + Tau (loose ditau pair)

channel = "tautau"

4) Electron + Muon (loose emu pair)

channel = "emu"

5) Muon + Tau + Tau (loose ditau pair + muon)

channel = "mutautau"

6) Electron + Tau + Tau (loose ditau pair + electron)

channel = "electautau"

7) SUSY (skim for collision data based on HLT_PFMHT150)

channel = "susy"

Once those modifications have been made, then you can test that everything works properly by running interactively:

cmsRun hiMassTau_patProd.py

Please look at all the output to make sure that no errors exist. Note that the default configuration file runs over a TTJets MC sample located at FNAL. One might want to replace the input PoolSource to point at the appropriate sample.
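
For example, the PoolSource in the configuration can be redefined along the following lines (a sketch only; the file path below is a placeholder, not a real sample):

process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring(
        'file:/path/to/the/appropriate/RECO/sample.root'  # placeholder: substitute your sample
    )
)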

Creating PatTuples via CRAB and Writing out to Dcache

First, the user needs to verify that they indeed have a dcache area at FNAL:

ls /pnfs/cms/WAX/11/store/user/<USERNAME>

If your store/user area does not exist, please follow the instructions in the following link to request one:

http://www.uscms.org/uscms_at_work//software_computing/tier2/store_user.shtml

Alternatively, allowed users can write out to the RA2Tau dcache area at FNAL:

ls /pnfs/cms/WAX/11/store/user/ra2tau

If the proper /store/user area exists, then create the following crab.cfg file:

[CRAB]
jobtype                 = cmssw
scheduler               = condor

[CMSSW]
datasetpath             = /ZPrime500TauTau_Tauola_GenSimRaw/eluiggi-ZPrime500TauTau_Tauola_GenSimReco-e775361b37e05a960708195f2a24a820/USER
dbs_url                 = http://cmsdbsprod.cern.ch/cms_dbs_ph_analysis_01/servlet/DBSServlet
use_parent              = 0
pset                    = hiMassTau_patProd.py
total_number_of_events  = -1
number_of_jobs          = 50
output_file             = skimPat.root

[USER]
ui_working_dir          = SkimPat
return_data             = 0
copy_data               = 1
check_user_remote_dir   = 0
storage_element         = cmssrm.fnal.gov
storage_path            = /srm/managerv2?SFN=/11
user_remote_dir         = /store/user/<USERNAME>
publish_data            = 1
publish_data_name       = SkimPat
dbs_url_for_publication = https://cmsdbsprod.cern.ch:8443/cms_dbs_ph_analysis_01_writer/servlet/DBSServlet
srm_version             = srmv2
#additional_input_files  = Jec10V3.db

[GRID]
rb                      = CERN
proxy_server            = fg-myproxy.fnal.gov
se_white_list           = fnal
virtual_organization    = cms
retry_count             = 0

This configuration file is designed to create Monte Carlo patTuples. Make sure the appropriate USERNAME is replaced in the configuration above. NOTE: for versions >= 52X, the crab parameter

additional_input_files

is not needed and should be removed. To create your crab jobs:

crab -create -cfg crab.cfg

To submit your crab jobs:

crab -submit -c <CRAB_UI_NAME>

NOTE: The appropriate directory name must be substituted above.

To check the status of your jobs and retrieve the output:

crab -status -c <CRAB_UI_NAME>
crab -getoutput -c <CRAB_UI_NAME>

Notice that the above configuration is set up such that output files are published to DBS so that they may be accessed via CRAB at a later time. To publish your dataset:

crab -publish -c <CRAB_UI_NAME>

Running the EDAnalyzer

First, make sure the version of the EDAnalyzer is up to date:

cvs co -d HighMassAnalysis/Analysis -r for52x_06192012 UserCode/AlfredoGurrola/HighMassAnalysis/Analysis
cd HighMassAnalysis/Analysis
scramv1 b clean
scram build -c
scramv1 b

Make sure to run the "cvs co" command above from CMSSW_5_2_3_patch3/src. An example configuration file called "NoCuts.py" exists in the test/SusyJetsMetTaus directory. This file has several hundred configurable parameters; however, the default version of "NoCuts.py" defines the parameters such that no event level selection is applied. This means that NO events are thrown away. The only object level requirements are:

  • Matching the reco/pat taus to generator level visible hadronic taus
  • Matching the reco/pat muons to generator level muons
  • Matching the reco/pat electrons to generator level electrons
  • reco/pat jets must have pt > 10 (CMS recommends that all analyses use jets above 10 GeV in pt)
These basic object level requirements are imposed so that the histograms reflect the "true" object distributions. To run the EDAnalyzer:

cd test/SusyJetsMetTaus
cmsRun NoCuts.py

The default python file uses as input a stau pair signal sample (see discussion of signal samples below) which contains 1k events:

process.source = cms.Source("PoolSource",
    skipEvents = cms.untracked.uint32(0),
    fileNames = cms.untracked.vstring(
'file:/uscms_data/d2/freddy06/CMSSW_5_2_3_patch3/src/HighMassAnalysis/Configuration/test/Data_TauTauSkim/StauPair_StauMass100_LSPMass25_skimPat.root'
)
)

The Python snippet above corresponds to lines 19-24 of NoCuts.py. The EDAnalyzer produces a root file called "analysis.root" which contains a few hundred histograms. A few of the important histograms are listed below:

  • Events_0 --> contains the number of events processed/analyzed (in bin 1 of the histogram) and the number of events which passed the event level selection (in bin 2 of the histogram)
  • NVertices_0 --> the number of reconstructed primary vertices
  • NGenTau_0 --> the number of generator level hadronically decaying tau leptons
  • GenTauPt_0 --> the visible generator level pt distribution of hadronically decaying tau leptons
  • NTau1_0 --> the number of reco/pat taus passing the "tau1" object level selections (in NoCuts.py, the object level selections are matching to a gen tau)
    the EDAnalyzer defines two types of taus: (1) taus passing a first set of object level selections; (2) taus passing a second set of object level selections. "Tau1" and "Tau2" object level selections are usually used to define e.g. a "loosely defined tau" and a "tightly defined tau"
  • TauJet1Pt_0 --> the reco/pat tau pt distribution for those candidates passing the "tau1" object level selections
  • NMuon1_0 --> the number of reco/pat muons passing the "muon1" object level selections (in NoCuts.py, the object level selections are matching to a gen muon)
    the EDAnalyzer defines two types of muons: (1) muons passing a first set of object level selections; (2) muons passing a second set of object level selections. "Muon1" and "Muon2" object level selections are usually used to define e.g. a "loosely identified muon" and a "tightly identified muon"
  • Muon1Pt_0 --> the reco/pat muon pt distribution for those candidates passing the "muon1" object level selections
  • NJet_0 --> the number of reco/pat jets passing the object level selections (in NoCuts.py, the object level selections are pt > 10)
  • JetPt_0 --> the reco/pat jet pt distribution for those candidates passing the jet object level selections (in NoCuts.py, the object level selections are pt > 10)
  • NBJet_0 --> the number of jets tagged as b-jets using the track counting high efficiency "medium" working point and having pt > 20 and |eta| < 2.4
  • Met_0 --> missing transverse energy
  • MHT_0 --> estimate of the missing transverse energy: transverse momentum of the vector sum of jets (in NoCuts.py, only jets with pt > 10 are considered)
  • HT_0 --> estimate of visible energy: scalar sum of the transverse momentum of jets (in NoCuts.py, only jets with pt > 10 are considered)
  • DiTauReconstructableMass_0 --> invariant mass of all combinations of tau pairs (only taus passing object level selections are considered)
  • DiMuonReconstructableMass_0 --> invariant mass of all combinations of muon pairs (only muons passing object level selections are considered)
  • DiElectronReconstructableMass_0 --> invariant mass of all combinations of electron pairs (only electrons passing object level selections are considered)
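
As a quick way to inspect the output, here is a minimal PyROOT sketch that reads a couple of the histograms listed above (an assumption: the histograms sit at the top level of analysis.root; if the EDAnalyzer books them inside a TFileService subdirectory, prepend that directory to the names):

import ROOT

# Open the EDAnalyzer output file.
f = ROOT.TFile.Open("analysis.root")

# Events_0: bin 1 = events processed/analyzed, bin 2 = events passing the event level selection.
events = f.Get("Events_0")
n_processed = events.GetBinContent(1)
n_passed = events.GetBinContent(2)
print("event selection efficiency: %.3f" % (n_passed / n_processed))

# Draw the visible gen-level tau pt spectrum and save it as an image.
gen_tau_pt = f.Get("GenTauPt_0")
canvas = ROOT.TCanvas()
gen_tau_pt.Draw()
canvas.SaveAs("GenTauPt_0.png")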

Instructions on making your own Signal Samples

Grab the required software:

cmsrel CMSSW_5_2_2
cd CMSSW_5_2_2/src

Get required packages and change the Pythia6Interface so that it spits out the LHE file:

cmsenv
cvs co -r CMSSW_5_2_2 GeneratorInterface/Pythia6Interface/
cvs co -d UserCode/Scans/ UserCode/crsilk/Scans/
mv UserCode/Scans/setup/Pythia6Hadronizer* GeneratorInterface/Pythia6Interface/plugins/

Compile:

scram b -j 4

How To Run

To create the signal samples, three files are needed: (1) an SLHA file (containing the SUSY masses, parameters, branching fractions, etc.); (2) an LHE file (containing the information for the generated events, i.e. the PYTHIA list of gen particles per event and their properties; the list of gen particles can be found here); and (3) a configuration file to go from the gen-level LHE file to AODSIM.

In the directory 'UserCode/Scans' there is a file called exampleScan.cfg. This (as the name indicates) is an example configuration file for createScan.py. To run:

python createScan.py exampleScan.cfg

Hopefully the variable names are self-explanatory; if not, the comments in exampleScan.cfg explain how each one works. Running createScan.py will produce a directory whose name is set by the 'model_tag' variable in the configuration file. Within the model_tag directory there are three subdirectories: SLHA, LHE, and AODSIM. You first need to create the SLHA files. To do this:

cd model_tag/SLHA
python createSLHAs.py

This should create all the SLHA files and put them in the directory 'files' located within SLHA. The next step is to create the LHE files. The file 'createLHEs.py' creates the LHE files, but it first needs to be modified to include the correct center of mass energy. To do this, open the file and change 'energy = ENERGY' --> 'energy = 8000.0'. Then, to create the LHE files:

cd ../LHE
python createLHEs.py $JOBNUMBER

Here $JOBNUMBER is a number between 0 and int((# of scan points)/(points per file)) + 1. For example, if you do an "ls -lhrt" in the 'SLHA/files' directory, you will see the following list of files:

T2bqqp_450_50_0.25.slha
T2bqqp_450_50_0.5.slha
T2bqqp_450_50_0.75.slha
T2bqqp_450_100_0.25.slha
T2bqqp_450_100_0.5.slha
T2bqqp_450_150_0.25.slha
T2bqqp_450_150_0.5.slha
T2bqqp_450_200_0.25.slha
T2bqqp_450_200_0.5.slha
T2bqqp_500_50_0.25.slha
T2bqqp_500_50_0.5.slha
T2bqqp_500_50_0.75.slha
T2bqqp_500_100_0.25.slha
T2bqqp_500_100_0.5.slha
T2bqqp_500_100_0.75.slha
T2bqqp_500_150_0.25.slha
T2bqqp_500_150_0.5.slha
T2bqqp_500_200_0.25.slha
T2bqqp_500_200_0.5.slha

If you want to create the LHE file for T2bqqp_450_50_0.75.slha, this corresponds to the third file in the list, which means $JOBNUMBER should be 2:

cd ../LHE
python createLHEs.py 2

The next step is to create an AODSIM sample from the LHE files. To do this, cd to the AODSIM directory and modify the main configuration file: 'LHEToAODSIM_cfg.py'. Replace the following line:

process.source = cms.Source("UserCode.Scans.T2bqqp_source_cff")

with

process.source = cms.Source("LHESource",
    fileNames = cms.untracked.vstring(),
)
process.source.fileNames.extend([
    'file:/uscms_data/d2/freddy06/CMSSW_5_2_2/src/UserCode/Scans/T2bqqp/LHE/T2bqqp_450_100_0.5To450_150_0.25.lhe',
])

The file path above for T2bqqp_450_100_0.5To450_150_0.25.lhe should be changed to the appropriate one. The LHE file used above should have been written to the directory specified by the 'output_directory' variable in your exampleScan.cfg configuration file.

cmsRun LHEToAODSIM_cfg.py

An AODSIM root file called "GEN-fragment_GEN_FASTSIM_HLT_PU.root" is created.

Modifying exampleScan.cfg for a User Defined Signal

To define a specific signal process, the file "exampleScan.cfg" in the previous section should be modified. This section outlines how this file should be modified in order to generate events with stau pair production, where each of the staus decays to a tau and a neutralino LSP. First, change "model_tag" to whatever name you want to give your signal process:

model_tag = T2stau

Then, define the number of events the user wants to generate per scan point:

events_per_point = 1000

Define the number of scan points per AODSIM root file:

points_per_file = 1

Define the center of mass energy:

energy = 8000.0

Define the SUSY particles that will be involved in the user-defined signal process. For stau pair production where each stau decays to a tau and a neutralino LSP (pp-->stau1+stau1-->tau+Chi_1^0+tau+Chi_1^0):

involved_susy_pdgIds = 1000015 1000022

1000015 is the pdgId for "stau1", while 1000022 is the pdgId for Chi_1^0. The full list of pdgId's can be found here.

Define the possible decays of the non-stable SUSY particles. In this case, since the neutralino LSP is stable, only the stau1 decay needs to be defined. Since the stau decays to the neutralino LSP plus a tau, the decay is defined in the following way:

decay_of_1000015: 1.00000000E+00  1000022 15

1.00000000E+00 is the user defined branching ratio. 1000022 is the pdgId of the first daughter particle (neutralino LSP in this case). 15 is the pdgId of the second daughter particle (the tau). This code is designed to produce a SUSY "scan" with varying values of the SUSY masses. Therefore, the user can specify which SUSY masses will be varied:

scan_parameter_names = M1000015 M1000022

M1000015 represents the mass of the stau1, while M1000022 represents the mass of the neutralino LSP. The minimum and maximum masses for the SUSY particles can be set:

scan_parameter_mins = 50 25
scan_parameter_maxs = 200 175

as well as the step sizes used to generate the scan points:

scan_parameter_steps = 25 25

In this example, the minimum value for M1000015 is 50, the maximum value is 200, and the step size is 25. Therefore, the stau1 masses will be 50, 75, 100, ..., 200. The masses of the SUSY particles can be functions of the scan parameters. In this case, we will set the SUSY masses exactly equal to the parameters:

mass_definition_of_1000015 = M1000015
mass_definition_of_1000022 = M1000022

A certain subset of the generated scan points can be thrown away by defining cuts:

#IMPORTANT: it throws out point if statement is true!!
cut_1 = M1000022 - M1000015 >= 0
cut_2 = False

In this case, generated scan points are ONLY KEPT if the neutralino LSP mass (M1000022) is LESS THAN the stau1 mass (M1000015). Next, the allowed production "subprocesses" must be defined for PYTHIA. In this example, we want to generate direct stau1 pairs, so the correct PYTHIA subprocess is 207:

allowed_subprocesses = 207

The PYTHIA subprocesses are defined in tables 2, 3, and 4 of the following paper: here. For example, Table 4 states that slepton pair production corresponds to subprocesses 201-214, and Table 2 shows that stau1 pair production is subprocess 207. The final modification that must be made to exampleScan.cfg is to define the directory where the LHE files will be stored:

output_directory = /uscms_data/d2/freddy06/CMSSW_5_2_2/src/UserCode/Scans/T2stau/LHE
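
As a sanity check on the scan grid defined above, the following standalone Python sketch (hypothetical, not part of the scan package) enumerates the points that survive cut_1:

# Scan grid from the parameters above: M1000015 in [50, 200] and
# M1000022 in [25, 175], both in steps of 25.
stau_masses = range(50, 201, 25)
lsp_masses = range(25, 176, 25)

# cut_1 throws out a point if M1000022 - M1000015 >= 0,
# i.e. only points with the LSP lighter than the stau1 are kept.
points = [(m_stau, m_lsp)
          for m_stau in stau_masses
          for m_lsp in lsp_masses
          if not (m_lsp - m_stau >= 0)]
print("%d scan points kept" % len(points))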

Example Signal Configuration Files

Configuration files have been created for the following signal processes:

  • pp-->stau1+stau1-->(tau+chi10)+(tau+chi10) - cfg file
  • pp-->stau1+snu_tau-->(tau+chi10)+(stau1+W)-->(tau+chi10)+(tau+chi10+lepton/tau+neutrino) - cfg file
  • pp-->stop1+stop1-->(top+chi20)+(top+chi20)-->(top+tau+tau+chi10)+(top+tau+tau+chi10) - cfg file
  • pp-->sbottom1+sbottom1-->(bottom+chi20)+(bottom+chi20)-->(bottom+tau+tau+chi10)+(bottom+tau+tau+chi10) - cfg file
  • pp-->gluino+gluino-->(qqbar+chi20)+(qqbar+chi20)-->(qqbar+higgs+chi10)+(qqbar+higgs+chi10) - cfg file

Example Signal AODSIM/PatTuples

  • pp-->stau1+stau1-->(tau+chi10)+(tau+chi10) - /uscms_data/d2/lpcjm/willhf/forFreddy/StauPair_StauMass100_LSPMass25_*root (~ 1k events)
  • pp-->stau1+snu_tau-->(tau+chi10)+(stau1+W)-->(tau+chi10)+(tau+chi10+lepton/tau+neutrino) - /uscms_data/d2/lpcjm/willhf/forFreddy/StauSnu_StauMass100_SnuMass100_LSPMass25_*.root (~ 1k events)
  • pp-->stop1+stop1-->(top+chi20)+(top+chi20)-->(top+tau+tau+chi10)+(top+tau+tau+chi10) - /uscms_data/d2/lpcjm/willhf/forFreddy/StopPair_StopMass500_LSPMass100_*root (~ 1k events)
  • pp-->sbottom1+sbottom1-->(bottom+chi20)+(bottom+chi20)-->(bottom+tau+tau+chi10)+(bottom+tau+tau+chi10) - /uscms_data/d2/lpcjm/willhf/forFreddy/SbottomPair_SbottomMass500_LSPMass100_*root (~ 1k events)
  • pp-->gluino+gluino-->(qqbar+chi20)+(qqbar+chi20)-->(qqbar+higgs+chi10)+(qqbar+higgs+chi10) - /uscms_data/d2/lpcjm/willhf/forFreddy/GluinoPairToNeutralinoToHiggs_GluinoMass500_LSPMass100_*root (~ 1k events)

SUSY Stop/Sbottom Pair Production Cross Sections at 7 TeV

Interactions: stops/sbottoms
Mass      Nominal (pb)      Uncertainty
100 GeV 415.828 16.49%
105 GeV 331.346 16.4131%
110 GeV 266.339 16.3833%
115 GeV 215.66 16.1196%
120 GeV 175.302 16.1302%
125 GeV 143.073 15.9623%
130 GeV 118.022 16.2372%
135 GeV 97.6805 15.9443%
140 GeV 81.2429 15.8428%
145 GeV 67.9725 15.791%
150 GeV 57.1202 15.6968%
155 GeV 48.1982 15.6716%
160 GeV 40.928 15.6929%
165 GeV 34.8756 15.618%
170 GeV 29.8407 15.4257%
175 GeV 25.6264 15.4276%
180 GeV 22.1161 15.2942%
185 GeV 19.1814 15.2858%
190 GeV 16.6419 15.218%
195 GeV 14.4617 15.4529%
200 GeV 12.6437 15.1791%
205 GeV 11.0729 15.2081%
210 GeV 9.68853 15.1238%
215 GeV 8.52455 15.069%
220 GeV 7.51002 15.0524%
225 GeV 6.63265 15.0579%
230 GeV 5.87402 15.0089%
235 GeV 5.20441 14.977%
240 GeV 4.63075 14.956%
245 GeV 4.12699 14.8652%
250 GeV 3.67931 14.8885%
255 GeV 3.28152 14.8787%
260 GeV 2.93988 14.9244%
265 GeV 2.63333 14.7395%
270 GeV 2.36221 14.6963%
275 GeV 2.12618 14.7896%
280 GeV 1.91469 14.677%
285 GeV 1.71927 14.7479%
290 GeV 1.55401 14.7401%
295 GeV 1.40844 14.6955%
300 GeV 1.27307 14.6571%
305 GeV 1.15249 14.936%
310 GeV 1.04712 14.7962%
315 GeV 0.950222 14.613%
320 GeV 0.863828 14.6464%
325 GeV 0.786674 14.6423%
330 GeV 0.716705 14.6433%
335 GeV 0.653654 14.5953%
340 GeV 0.597143 14.6164%
345 GeV 0.545465 14.6234%
350 GeV 0.499108 14.5889%
355 GeV 0.456647 14.5074%
360 GeV 0.418776 14.5105%
365 GeV 0.383775 14.4933%
370 GeV 0.351945 14.4644%
375 GeV 0.323002 14.3823%
380 GeV 0.297147 14.2705%
385 GeV 0.27332 14.4732%
390 GeV 0.251444 14.3651%
395 GeV 0.232271 14.3933%
400 GeV 0.214149 14.4971%
405 GeV 0.198068 14.5886%
410 GeV 0.182913 14.6479%
415 GeV 0.168801 14.7922%
420 GeV 0.156704 14.8368%
425 GeV 0.144628 14.936%
430 GeV 0.134507 15.0065%
435 GeV 0.124392 15.0804%
440 GeV 0.115348 15.1762%
445 GeV 0.107245 15.2485%
450 GeV 0.0992878 15.3598%
455 GeV 0.0922374 15.4593%
460 GeV 0.0857745 15.5334%
465 GeV 0.0797201 15.6494%
470 GeV 0.074147 15.7169%
475 GeV 0.06902 15.8284%
480 GeV 0.0642831 15.93%
485 GeV 0.0599245 15.9919%
490 GeV 0.0558958 16.1214%
495 GeV 0.0520416 16.1967%
500 GeV 0.0486111 16.2961%
505 GeV 0.0453899 16.4142%
510 GeV 0.0424444 16.4723%
515 GeV 0.0396183 16.5823%
520 GeV 0.0370806 16.6772%
525 GeV 0.0346581 16.779%
530 GeV 0.032434 16.8941%
535 GeV 0.0303045 16.979%
540 GeV 0.0283853 17.09%
545 GeV 0.0266518 17.1611%
550 GeV 0.0249376 17.2882%
555 GeV 0.0233198 17.4061%
560 GeV 0.0218991 17.4857%
565 GeV 0.0204807 17.602%
570 GeV 0.0192652 17.7247%
575 GeV 0.0180582 17.815%
580 GeV 0.0169375 17.8905%
585 GeV 0.0159235 17.9941%
590 GeV 0.0149107 18.1352%
595 GeV 0.0139931 18.2184%
600 GeV 0.0131832 18.3415%
605 GeV 0.012371 18.4562%
610 GeV 0.0116553 18.5271%
615 GeV 0.0109457 18.6577%
620 GeV 0.0103307 18.7359%
625 GeV 0.00974176 18.8389%
630 GeV 0.00917932 18.9953%
635 GeV 0.00864832 19.081%
640 GeV 0.00815037 19.1935%
645 GeV 0.00768851 19.2821%
650 GeV 0.00725254 19.4046%
655 GeV 0.00684582 19.5179%
660 GeV 0.00645651 19.5913%
665 GeV 0.00609033 19.7352%
670 GeV 0.00574425 19.8541%
675 GeV 0.0054262 19.9319%
680 GeV 0.00512151 20.0595%
685 GeV 0.00483639 20.1696%
690 GeV 0.00456873 20.269%
695 GeV 0.00431485 20.397%
700 GeV 0.00407922 20.543%
705 GeV 0.00385437 20.7088%
710 GeV 0.00363411 21.024%
715 GeV 0.00343542 21.3248%
720 GeV 0.00325313 21.4323%
725 GeV 0.0030734 21.7289%
730 GeV 0.00289954 21.8929%
735 GeV 0.0027426 22.2368%
740 GeV 0.00259042 22.357%
745 GeV 0.00245409 22.7108%
750 GeV 0.00232201 22.8421%
755 GeV 0.00220039 22.9585%
760 GeV 0.00207964 23.0442%
765 GeV 0.00197349 23.3438%
770 GeV 0.00186178 23.4758%
775 GeV 0.00176796 23.8014%
780 GeV 0.00167315 24.187%
785 GeV 0.00158305 24.2273%
790 GeV 0.00150278 24.2264%
795 GeV 0.00142308 24.2236%
800 GeV 0.00134929 24.6211%
805 GeV 0.00127968 24.6118%
810 GeV 0.00121664 24.9464%
815 GeV 0.001146 25.0668%
820 GeV 0.00109168 25.5931%
825 GeV 0.00103191 25.6051%
830 GeV 0.00097957 25.8401%
835 GeV 0.000929598 26.0089%
840 GeV 0.000882133 26.2825%
845 GeV 0.000836883 26.5091%
850 GeV 0.000794034 26.7155%
855 GeV 0.000753828 26.9611%
860 GeV 0.000715947 27.1655%
865 GeV 0.000679601 27.492%
870 GeV 0.000645279 27.6113%
875 GeV 0.000613128 27.9303%
880 GeV 0.000582599 28.1075%
885 GeV 0.000553763 28.4493%
890 GeV 0.000526242 28.6668%
895 GeV 0.000499869 28.8264%
900 GeV 0.000475078 29.1309%
905 GeV 0.000451249 29.4369%
910 GeV 0.000429159 29.5663%
915 GeV 0.000407424 29.8668%
920 GeV 0.000387417 30.0411%
925 GeV 0.000367999 30.2853%
930 GeV 0.000350566 30.5219%
935 GeV 0.00033313 30.7917%
940 GeV 0.000316901 31.064%
945 GeV 0.000301022 31.0908%
950 GeV 0.000286345 31.4967%
955 GeV 0.000272071 31.7112%
960 GeV 0.000259103 31.8937%
965 GeV 0.000246539 32.2745%
970 GeV 0.000234421 32.4596%
975 GeV 0.000223443 32.5352%
980 GeV 0.000213111 32.9153%
985 GeV 0.000202769 33.2562%
990 GeV 0.00019276 33.3546%
995 GeV 0.000183492 33.7537%
1000 GeV 0.000175283 34.0129%

SUSY Gluino Pair Production Cross Sections at 7 TeV

Interactions: gg
Mass      Nominal (pb)      Uncertainty
200 GeV 686.71 15.38%
210 GeV 525.375 15.3228%
220 GeV 405.592 15.3082%
230 GeV 316.138 15.2381%
240 GeV 248.588 15.4234%
250 GeV 197.147 15.3524%
260 GeV 157.381 15.2956%
270 GeV 126.242 15.4013%
280 GeV 102.061 15.321%
290 GeV 82.9324 15.2954%
300 GeV 67.7727 15.2727%
310 GeV 55.6423 15.2442%
320 GeV 45.8633 15.3339%
330 GeV 37.9996 15.3152%
340 GeV 31.6144 15.2607%
350 GeV 26.3811 15.152%
360 GeV 22.1078 15.1235%
370 GeV 18.6183 15.007%
380 GeV 15.7739 15.1758%
390 GeV 13.3569 15.0473%
400 GeV 11.312 15.3031%
410 GeV 9.65419 15.4624%
420 GeV 8.26498 15.6264%
430 GeV 7.07342 15.8679%
440 GeV 6.06777 16.0938%
450 GeV 5.23576 16.2317%
460 GeV 4.53029 16.3641%
470 GeV 3.93168 16.52%
480 GeV 3.41682 16.7332%
490 GeV 2.96401 16.9943%
500 GeV 2.57471 17.2337%
510 GeV 2.24603 17.324%
520 GeV 1.96757 17.3505%
530 GeV 1.72285 17.456%
540 GeV 1.50818 17.5648%
550 GeV 1.32839 17.6444%
560 GeV 1.16682 17.8153%
570 GeV 1.03113 18.0364%
580 GeV 0.9041 18.356%
590 GeV 0.798946 18.6058%
600 GeV 0.708282 18.798%
610 GeV 0.628668 18.9101%
620 GeV 0.557783 19.0453%
630 GeV 0.494469 19.3372%
640 GeV 0.437997 19.583%
650 GeV 0.389416 19.5909%
660 GeV 0.346669 19.5948%
670 GeV 0.309309 19.7875%
680 GeV 0.276228 20.0413%
690 GeV 0.247284 20.1767%
700 GeV 0.220801 20.3607%
710 GeV 0.197476 20.6845%
720 GeV 0.176358 21.0244%
730 GeV 0.157216 21.3505%
740 GeV 0.14118 21.573%
750 GeV 0.126669 21.544%
760 GeV 0.113848 21.8976%
770 GeV 0.102336 22.2753%
780 GeV 0.0919813 22.6291%
790 GeV 0.0825767 22.9014%
800 GeV 0.0742372 23.1465%
810 GeV 0.0669329 23.4873%
820 GeV 0.0603161 23.8363%
830 GeV 0.0542539 24.1319%
840 GeV 0.0488994 24.4629%
850 GeV 0.0441988 24.8589%
860 GeV 0.039913 25.2481%
870 GeV 0.0360318 25.5958%
880 GeV 0.0325317 25.8004%
890 GeV 0.0294761 26.2469%
900 GeV 0.0266846 26.6141%
910 GeV 0.0241501 27.0897%
920 GeV 0.0218999 27.4649%
930 GeV 0.0197574 27.5332%
940 GeV 0.017902 27.6969%
950 GeV 0.016256 28.2727%
960 GeV 0.0147608 28.4955%
970 GeV 0.0133436 28.6031%
980 GeV 0.0120925 29.053%
990 GeV 0.0109694 29.3447%
1000 GeV 0.010005 29.9658%
1010 GeV 0.00910196 30.4554%
1020 GeV 0.00827399 30.8501%
1030 GeV 0.00750995 31.1014%
1040 GeV 0.00681815 31.3751%
1050 GeV 0.00620925 31.7883%
1060 GeV 0.00565936 32.1766%
1070 GeV 0.00515679 32.6654%
1080 GeV 0.00469765 33.1567%
1090 GeV 0.00427254 33.4758%
1100 GeV 0.0038872 33.876%
1110 GeV 0.00354097 34.0442%
1120 GeV 0.00323058 34.5328%
1130 GeV 0.00294578 35.1255%
1140 GeV 0.00268518 35.5241%
1150 GeV 0.0024525 36.0255%
1160 GeV 0.00224185 36.3938%
1170 GeV 0.00204818 36.9158%
1180 GeV 0.00186502 37.5041%
1190 GeV 0.00170481 38.1101%
1200 GeV 0.00155387 38.5875%
1210 GeV 0.00141393 38.4438%
1220 GeV 0.00128925 38.7928%
1230 GeV 0.00117793 39.261%
1240 GeV 0.00107739 39.8584%
1250 GeV 0.000984752 40.3122%
1260 GeV 0.000899159 40.763%
1270 GeV 0.000821609 41.263%
1280 GeV 0.000751039 41.7855%
1290 GeV 0.000686521 42.3056%
1300 GeV 0.000627546 42.6905%
1310 GeV 0.000573612 43.0767%
1320 GeV 0.0005242 43.5575%
1330 GeV 0.00047993 44.0233%
1340 GeV 0.000438376 44.5677%
1350 GeV 0.000400158 45.0371%
1360 GeV 0.000365803 45.5623%
1370 GeV 0.000334723 45.9639%
1380 GeV 0.000306614 46.41%
1390 GeV 0.000280438 46.9459%
1400 GeV 0.000256511 47.3506%
1410 GeV 0.000234156 47.9423%
1420 GeV 0.000213943 48.4201%
1430 GeV 0.000195577 49.0402%
1440 GeV 0.00017871 49.2777%
1450 GeV 0.000163235 50.0097%
1460 GeV 0.000149244 50.3599%
1470 GeV 0.000137 51.0621%
1480 GeV 0.000125074 51.4265%
1490 GeV 0.000114059 51.7895%
1500 GeV 0.000104 52.2002%
1510 GeV 9.51779e-05 52.8091%
1520 GeV 8.71976e-05 53.3689%
1530 GeV 7.96653e-05 53.8866%
1540 GeV 7.28352e-05 54.3822%
1550 GeV 6.66422e-05 54.984%
1560 GeV 6.10247e-05 55.5827%
1570 GeV 5.57513e-05 56.1334%
1580 GeV 5.08628e-05 56.6013%
1590 GeV 4.64415e-05 57.0641%
1600 GeV 4.24235e-05 57.5142%
1610 GeV 3.87584e-05 57.9813%
1620 GeV 3.54176e-05 58.5134%
1630 GeV 3.23165e-05 59.2289%
1640 GeV 2.95128e-05 59.9254%
1650 GeV 2.69595e-05 60.4233%
1660 GeV 2.46327e-05 60.7883%
1670 GeV 2.25482e-05 61.3548%
1680 GeV 2.06525e-05 62.012%
1690 GeV 1.88292e-05 62.4697%
1700 GeV 1.71141e-05 62.8347%
1710 GeV 1.56124e-05 63.3687%
1720 GeV 1.42946e-05 64.0297%
1730 GeV 1.30559e-05 64.62%
1740 GeV 1.19224e-05 65.1976%
1750 GeV 1.09117e-05 65.8462%
1760 GeV 9.90654e-06 66.2128%
1770 GeV 9.0191e-06 66.7496%
1780 GeV 8.21824e-06 67.2244%
1790 GeV 7.50189e-06 67.7902%
1800 GeV 6.84682e-06 68.257%
1810 GeV 6.22294e-06 68.7407%
1820 GeV 5.65933e-06 69.2895%
1830 GeV 5.15833e-06 69.8641%
1840 GeV 4.70566e-06 70.3636%
1850 GeV 4.29375e-06 70.8532%
1860 GeV 3.91302e-06 71.485%
1870 GeV 3.55463e-06 72.0262%
1880 GeV 3.22902e-06 72.4824%
1890 GeV 2.93885e-06 73.1105%
1900 GeV 2.67615e-06 73.5061%
1910 GeV 2.43258e-06 73.9748%
1920 GeV 2.21505e-06 74.5532%
1930 GeV 2.01346e-06 75.0526%
1940 GeV 1.82314e-06 75.45%
1950 GeV 1.65608e-06 75.9728%
1960 GeV 1.5106e-06 76.577%
1970 GeV 1.36562e-06 76.978%
1980 GeV 1.24072e-06 77.4705%
1990 GeV 1.12625e-06 77.9387%
2000 GeV 1.02497e-06 78.4708%

SUSY Stop Pair Production Cross Sections at 8 TeV

Interactions: gg + ffbar
Mass      Nominal (pb)      Uncertainty
500 GeV 0.04386 %

SUSY Sbottom Pair Production Cross Sections at 8 TeV

Interactions: gg + bb + ffbar
Mass      Nominal (pb)      Uncertainty
500 GeV 0.04332 %

SUSY Gluino Pair Production Cross Sections at 8 TeV

Interactions: gg + qqbar
Mass      Nominal (pb)      Uncertainty
500 GeV 1.996 %
750 GeV 0.1044 %
1000 GeV 0.008983 %

SUSY Stau Pair Production Cross Sections at 8 TeV

Interactions: qqbar
Mass      Nominal (pb)      Uncertainty
75 GeV 0.1073

SUSY Stau+Snu Production Cross Sections at 8 TeV

Interactions: qqbar
Mass (stau, snu)      Nominal (pb)      Uncertainty
100 GeV, 100 GeV 0.03668
100 GeV, 150 GeV 0.01605

Connecting to the LPC from a Linux or Mac OSX PC

http://uscms.org/uscms_at_work/physics/computing/getstarted/uaf.shtml

To connect to the LPC cluster you need to have kerberos and openssh with GSS support installed on your system. These are already included in Scientific Linux and Mac OS X. In addition, you will need to get the krb5.conf file for Fermilab:

http://security.fnal.gov/krb5.conf

and save it to your home directory. Then run

sudo cp krb5.conf /etc/krb5.conf 

Also make the following edit to ~/.ssh/config on your local machine:

GSSAPIAuthentication yes
GSSAPIDelegateCredentials yes

To connect to the LPC cluster:

Get an addressless and forwardable kerberos ticket for the FNAL.GOV kerberos realm:

/usr/kerberos/bin/kinit -A -f user@FNAL.GOV

or, on Mac OS X Leopard:

/usr/kerberos/bin/kinit -A -f user@FNAL.GOV

and on Snow Leopard:

/usr/kerberos/bin/kinit -a -f user@FNAL.GOV
You will be prompted for your kerberos password in the FNAL.GOV realm.

To verify that you have an addressless and forwardable kerberos ticket:

klist -a -f

Mac OS X 10.7 and later

In the Applications folder, open the Utilities folder. Then, open Keychain Access.

From the Keychain Access menu, select Ticket Viewer.

Click Add Identity. In the "Identity:" field, enter your Network ID username in the format username@FNAL.GOV . Then enter your passphrase in the "Password:" field. Note: FNAL.GOV must be in all capital letters.

Click Continue to get your initial Kerberos ticket.

To make this your default Kerberos identity, click Set as Default.

Connect to the SL5 cluster:

ssh username@cmslpc-sl5.fnal.gov 

VBF light stop search

Proposal

I've been thinking about a small physics project for Bradley. There's a new analysis I want to start at CMS: the 8 TeV search for a light stop (m(stop) ~ m(top)) with VBF.

Of course the main part of the analysis is understanding the ttbar background (both QCD and EWK production) with the data itself. Therefore, one place Bradley could start is to work with Andres on estimating the ttbar background in the context of the analyses that Andres is working on (Susy Higgs and VBF Dark Matter). The BG estimation methodology/philosophy they will develop together should be largely applicable to a future VBF Stop analysis.

Bradley would be able to go through the full-fledged analysis workflow: (1) MC generation, (2) running CRAB, (3) creating patTuples, (4) skimming and analyzing the data for control samples, (5) understanding the plots, (6) systematics, (7) triggers, etc. He would be able to do these things without actually doing an entire analysis. Other advantages include continuing to utilize our expertise with taus and b-jets, utilizing Andrew's experience and expertise with top, and also piggybacking on our recent success with Higgs and VBF.

- AGurrola

Instructions for using PBS scheduler with CRAB

Instructions are here: http://cms.accre.vanderbilt.edu/info/runjobs.php

- AMelo

B2G PAT recipe

The recipe lives here:

https://twiki.cern.ch/twiki/bin/view/CMS/B2GTopLikeBSM53X#Version_3_53x_post_Moriond_versi

- AMelo

TLBSM dataset @ vandy

/T_t-channel_TuneZ2star_8TeV-powheg-tauola/StoreResults-Summer12_DR53X-PU_S10_START53_V7A-v1_TLBSM_53x_v3-99bd99199697666ff01397dad5652e9e/USER

- AMelo

Susy/Exo Tau patTuple recipe

https://twiki.cern.ch/twiki/bin/viewauth/CMS/SUSYJetsMETTausAnalysis2012#Installation_for_5_3_7

- AGurrola

Running ED Analyzer on LPC Using CRAB

In order to be able to store large outputs, change directory to your data area on LPC (e.g. "/uscms_data/d3/brack228/") and then follow the steps in the link above.

(Note: data areas are not backed up in any way, so be careful)

In addition, to avoid errors while using CRAB, append the following line to the ".profile" file in your home directory:

source /uscmst1/prod/sw/cms/shrc prod

(Note: ".profile" may not exist yet, in which case just create it and save it to your home directoy)

SUSY Tau datasets

Here are the datasets we grabbed. They are now all at Vandy:

/TTJets_scaleup_TuneZ2star_8TeV-madgraph-tauola/Summer12_DR53X-PU_S10_START53_V7A-v1/AODSIM
/TTJets_matchingup_TuneZ2star_8TeV-madgraph-tauola/Summer12_DR53X-PU_S10_START53_V7A-v1/AODSIM
/TTJets_scaledown_TuneZ2star_8TeV-madgraph-tauola/Summer12_DR53X-PU_S10_START53_V7A-v1/AODSIM
/TTJets_matchingdown_TuneZ2star_8TeV-madgraph-tauola/Summer12_DR53X-PU_S10_START53_V7A-v1/AODSIM

- PSheldon

Bradley, can you test the HighMassAnalysis framework that Andres is using for his studies? Hopefully it compiles straight out of the box. If so, then you can interactively run over a small subset of events to make sure it runs successfully and you can see an output root file that contains some histograms. If running interactively works, then you can submit crab jobs to run over a high statistics dataset. I created and stored some patTuples at Vanderbilt that can be used to test the analysis framework. Here is one:

/ZJetsToNuNu_50_HT_100_TuneZ2Star_8TeV_madgraph/gurrola-ZJetsToNuNu_50_HT_100_Pat-8a1da05381c9cc2b284a32eaeaa05cc5/USER

Andres, do you have time to send Bradley your updated code (.cc and .h file along with an example configuration file that works on MC)?

Bradley, once you have these .cc and .h files, then you can replace the default ones in:

HighMassAnalysis/Analysis/src/*cc
HighMassAnalysis/Analysis/interface/*h

and recompile. Then you can use the python configuration file from Andres to run over some events.

- AGurrola

Here are the latest .cc & .h files:

https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/HiMassTauAnalysis.cc

https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/HiMassTauAnalysis.h

As Alfredo mentioned, these should replace the default ones in: HighMassAnalysis/Analysis/src/*cc HighMassAnalysis/Analysis/interface/*h

You can use this .py config file:

https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/Cut.py

You can play around and change whatever you like in the .py file.

- ADelannoy

Setup CMSSW_5_3_7 environment in LPC

See here for instructions on how to set up kerberos access to the LPC cluster at FNAL.

[delannas@vpac11 ~]$ kinit delannoy@FNAL.GOV
   Password for delannoy@FNAL.GOV:
[delannas@vpac11 ~]$ ssh -X delannoy@cmslpc-sl5.fnal.gov
[delannoy@cmslpc39 ~]$ echo $SHELL
   /bin/tcsh
[delannoy@cmslpc39 ~/VBF]$ pwd
   /uscms/home/delannoy/VBF
[delannoy@cmslpc39 ~/VBF]$ vi ~/.cshrc
   source /uscmst1/prod/sw/cms/cshrc prod
   source /uscmst1/prod/grid/gLite_SL5.csh
   source /uscmst1/prod/grid/CRAB/crab.csh
   setenv CVSROOT ":pserver:anonymous@cmssw.cvs.cern.ch:/local/reps/CMSSW"
[delannoy@cmslpc39 ~/VBF]$ source ~/.cshrc
[delannoy@cmslpc39 ~/VBF]$ cp -p /uscms/home/delannoy/public/2013-10-15-PhysicsPlotter_VBF.zip ./
[delannoy@cmslpc39 ~/VBF]$ unzip 2013-10-15-PhysicsPlotter_VBF.zip -d PhysicsPlotter_VBF/
[delannoy@cmslpc39 ~/VBF]$ vi setup_CMSSW_5_3_7.csh
   setenv SCRAM_ARCH slc5_amd64_gcc462
   echo $SCRAM_ARCH
   scramv1 project CMSSW CMSSW_5_3_7_patch4
   cd CMSSW_5_3_7_patch4/src
   cmsenv
[delannoy@cmslpc39 ~/VBF]$ source setup_CMSSW_5_3_7.csh

[delannoy@cmslpc39 src]$ cvs login
   Logging in to :pserver:anonymous@cmssw.cvs.cern.ch:2401/local/reps/CMSSW
   CVS password: 98passwd
[delannoy@cmslpc39 src]$ vi cvs_pkgs.csh
   addpkg DataFormats/PatCandidates V06-05-06-05
   addpkg PhysicsTools/PatAlgos     V08-09-51
   addpkg DataFormats/StdDictionaries V00-02-14
   addpkg FWCore/GuiBrowsers V00-00-70
   addpkg RecoParticleFlow/PFProducer V15-02-06
   addpkg RecoTauTag/RecoTau V01-04-23
   addpkg RecoTauTag/Configuration V01-04-10
   addpkg CondFormats/EgammaObjects V00-04-00
   addpkg JetMETCorrections/Type1MET V04-06-09
   addpkg PhysicsTools/PatUtils V03-09-23
   addpkg CommonTools/ParticleFlow V00-03-16                              
   addpkg CommonTools/RecoAlgos V00-03-23      
   addpkg CommonTools/RecoUtils V00-00-13  
   addpkg DataFormats/ParticleFlowCandidate V15-03-03
   addpkg DataFormats/TrackReco V10-02-02      
   addpkg DataFormats/VertexReco V02-00-04      
   cvs co -d SHarper/HEEPAnalyzer UserCode/SHarper/HEEPAnalyzer
   cvs co -d HighMassAnalysis/Skimming -r for537_02182013 UserCode/AlfredoGurrola/HighMassAnalysis/Skimming
   cvs co -d HighMassAnalysis/Configuration -r for537_02182013 UserCode/AlfredoGurrola/HighMassAnalysis/Configuration
   cvs co -d HighMassAnalysis/Analysis -r forVBFSusy_07222012 UserCode/AlfredoGurrola/HighMassAnalysis/Analysis
[delannoy@cmslpc39 src]$ source cvs_pkgs.csh | tee cvs_pkgs.log
[delannoy@cmslpc39 src]$ scram build -c
[delannoy@cmslpc39 src]$ scram b -j8 | tee scram.log

[delannoy@cmslpc39 src]$ vi update_code.csh
   rm -v $CMSSW_BASE/src/HighMassAnalysis/Analysis/src/HiMassTauAnalysis.cc
   cp -vp /uscms/home/delannoy/public/HiMassTauAnalysis.cc $CMSSW_BASE/src/HighMassAnalysis/Analysis/src/HiMassTauAnalysis.cc

   rm -v $CMSSW_BASE/src/HighMassAnalysis/Analysis/interface/HiMassTauAnalysis.h
   cp -vp /uscms/home/delannoy/public/HiMassTauAnalysis.h $CMSSW_BASE/src/HighMassAnalysis/Analysis/interface/HiMassTauAnalysis.h

   rm -v $CMSSW_BASE/src/HighMassAnalysis/Analysis/BuildFile.xml
   cp -vp /uscms/home/delannoy/public/BuildFile.xml $CMSSW_BASE/src/HighMassAnalysis/Analysis/BuildFile.xml

   cp -rvp /uscms/home/delannoy/public/VBFSusy/ $CMSSW_BASE/src/HighMassAnalysis/Analysis/test/
[delannoy@cmslpc39 src]$ source update_code.csh
[delannoy@cmslpc39 src]$ scram b -j8 | tee scram_code.log

HiMassTauAnalysis.cc contains the new analysis code. You will find a 'Data/' and an 'MC/' directory inside 'HighMassAnalysis/Analysis/test/VBFSusy/DiMuon/'. If you look inside any specific MC or Data directory, you will find the updated 'Cut.py' file and a 'crab_cut.cfg' which specifies parameters like the dataset path, total number of events, output filename, etc.

[delannoy@cmslpc39 src]$ cd $CMSSW_BASE/src/HighMassAnalysis/Analysis/test/VBFSusy/DiMuon/MC/
[delannoy@cmslpc39 MC]$ ll -h ZmumuOSCR/DYToMuMu/*py ZmumuOSCR/DYToMuMu/*cfg
   -rw-r--r-- 1 delannoy us_cms 73K Sep  4 15:46 ZmumuOSCR/DYToMuMu/Cut.py
   -rw-r--r-- 1 delannoy us_cms 785 Sep  4 15:46 ZmumuOSCR/DYToMuMu/crab_cut.cfg

In order to submit, retrieve, and merge the jobs, you can use the csh scripts in the 'MC/ZmumuOSCR/' and 'Data/' directories:

[delannoy@cmslpc39 Data]$ pwd
   /uscms/home/delannoy/VBF/CMSSW_5_3_7_patch4/src/HighMassAnalysis/Analysis/test/VBFSusy/DiMuon/Data
[delannoy@cmslpc39 Data]$ ./submitJobs.csh
...
[delannoy@cmslpc39 Data]$ ./getOutput.csh
...
[delannoy@cmslpc39 Data]$ ./addRootFiles.csh

[delannoy@cmslpc39 ZmumuOSCR]$ pwd
   /uscms/home/delannoy/VBF/CMSSW_5_3_7_patch4/src/HighMassAnalysis/Analysis/test/VBFSusy/DiMuon/MC/ZmumuOSCR
[delannoy@cmslpc39 ZmumuOSCR]$ ./submitJobs.csh
...
[delannoy@cmslpc39 ZmumuOSCR]$ ./getOutput.csh
...
[delannoy@cmslpc39 ZmumuOSCR]$ ./addRootFiles.csh

Setup CMSSW_5_3_7 environment in ACCRE

[adelanno@vanderbilt-pc1 ~]$ ssh -XY delannas@vmps09.accre.vanderbilt.edu
[delannas@vmps09 ~]$ echo $SHELL
   /bin/bash
[delannas@vmps09 VBF]$ pwd
   /home/delannas/VBF
[delannas@vmps09 VBF]$ vi ~/.bashrc 
   source /gpfs21/grid/grid-app/cmssoft/cms/cmsset_default.sh
   export CVSROOT=":pserver:anonymous@cmssw.cvs.cern.ch:/local/reps/CMSSW"
   export VO_CMS_SW_DIR="/cvmfs/cms.cern.ch/"
[delannas@vmps09 VBF]$ source ~/.bashrc 

[delannas@vmps09 VBF]$ vi setup_CMSSW_5_3_7.sh
   #https://twiki.cern.ch/twiki/bin/viewauth/CMS/SUSYJetsMETTausAnalysis2012#Installation_for_5_3_7
   export SCRAM_ARCH=slc5_amd64_gcc462
   echo $SCRAM_ARCH
   scramv1 project CMSSW CMSSW_5_3_7_patch4
   cd CMSSW_5_3_7_patch4/src
   cmsenv
[delannas@vmps09 VBF]$ source setup_CMSSW_5_3_7.sh 

[delannas@vmps09 src]$ cvs login
   Logging in to :pserver:anonymous@cmscvs.cern.ch:2401/cvs/CMSSW
   CVS password: 98passwd
[delannas@vmps09 src]$ vi cvs_pkgs.sh
   addpkg DataFormats/PatCandidates V06-05-06-05
   addpkg PhysicsTools/PatAlgos     V08-09-51
   addpkg DataFormats/StdDictionaries V00-02-14
   addpkg FWCore/GuiBrowsers V00-00-70
   addpkg RecoParticleFlow/PFProducer V15-02-06
   addpkg RecoTauTag/RecoTau V01-04-23
   addpkg RecoTauTag/Configuration V01-04-10
   addpkg CondFormats/EgammaObjects V00-04-00
   addpkg JetMETCorrections/Type1MET V04-06-09
   addpkg PhysicsTools/PatUtils V03-09-23
   addpkg CommonTools/ParticleFlow V00-03-16                              
   addpkg CommonTools/RecoAlgos V00-03-23      
   addpkg CommonTools/RecoUtils V00-00-13  
   addpkg DataFormats/ParticleFlowCandidate V15-03-03
   addpkg DataFormats/TrackReco V10-02-02      
   addpkg DataFormats/VertexReco V02-00-04      
   cvs co -d SHarper/HEEPAnalyzer UserCode/SHarper/HEEPAnalyzer
   cvs co -d HighMassAnalysis/Skimming -r for537_02182013 UserCode/AlfredoGurrola/HighMassAnalysis/Skimming
   cvs co -d HighMassAnalysis/Configuration -r for537_02182013 UserCode/AlfredoGurrola/HighMassAnalysis/Configuration
   cvs co -d HighMassAnalysis/Analysis -r forVBFSusy_07222012 UserCode/AlfredoGurrola/HighMassAnalysis/Analysis
[delannas@vmps09 src]$ source cvs_pkgs.sh | tee cvs_pkgs.log
[delannas@vmps09 src]$ scram build -c
[delannas@vmps09 src]$ scram b -j8 | tee scram.log

[delannas@vmps09 src]$ vi update_code.sh
   cd $CMSSW_BASE/src/HighMassAnalysis/Analysis/src/
   rm -v HiMassTauAnalysis.cc
   wget -S https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/HiMassTauAnalysis.cc
   cd $CMSSW_BASE/src/HighMassAnalysis/Analysis/interface/
   rm -v HiMassTauAnalysis.h
   wget -S https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/HiMassTauAnalysis.h
   cd $CMSSW_BASE/src/HighMassAnalysis/Analysis/test/
   mkdir VBF_light_stop/
   cd $CMSSW_BASE/src/HighMassAnalysis/Analysis/test/VBF_light_stop/
   wget -S https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/Cut.py
   wget -S https://adelannoy.com/CMS/RA2TAU/ZJets-Estimate/Gurrola/No_Cuts.py
   cd $CMSSW_BASE/src/
[delannas@vmps09 src]$ source update_code.sh
[delannas@vmps09 src]$ scram b -j8 | tee scram_code.log
[delannas@vmps09 src]$ cd HighMassAnalysis/Analysis/test/VBF_light_stop/
[delannas@vmps09 VBF_light_stop]$ cmsRun No_Cuts.py

Setup CMSSW_5_3_11 environment in ACCRE for VBF re-PatTuples

[adelanno@vanderbilt-pc1 ~]$ ssh -XY delannas@vmps09.accre.vanderbilt.edu
[delannas@vmps09 ~]$ echo $SHELL
   /bin/bash
[delannas@vmps09 VBF]$ pwd
   /home/delannas/VBF
[delannas@vmps09 VBF]$ vi ~/.bashrc 
   source /gpfs21/grid/grid-app/cmssoft/cms/cmsset_default.sh
   export CVSROOT=":pserver:anonymous@cmssw.cvs.cern.ch:/local/reps/CMSSW"
   export VO_CMS_SW_DIR="/cvmfs/cms.cern.ch/"
[delannas@vmps09 VBF]$ source ~/.bashrc 

[delannas@vmps09 VBF]$ vi setup_CMSSW_5_3_11.sh
   #https://twiki.cern.ch/twiki/bin/viewauth/CMS/SUSYJetsMETTausAnalysis2012#Installation_for_5_3_7
   export SCRAM_ARCH=slc5_amd64_gcc462
   echo "SCRAM_ARCH = " $SCRAM_ARCH
   scramv1 project CMSSW CMSSW_5_3_11
   echo "CMSSW_5_3_11"
   cd CMSSW_5_3_11/src
   cmsenv
[delannas@vmps09 VBF]$ source setup_CMSSW_5_3_11.sh 

[delannas@vmps09 src]$ cvs login
   Logging in to :pserver:anonymous@cmscvs.cern.ch:2401/cvs/CMSSW
   CVS password: 98passwd
[delannas@vmps09 src]$ vi cvs_pkgs.sh
   addpkg  CommonTools/RecoAlgos                           V00-03-23
   addpkg  CommonTools/RecoUtils                           V00-01-04
   addpkg  CondFormats/EgammaObjects                       V00-04-00
   addpkg  DPGAnalysis/SiStripTools                        V00-11-17
   addpkg  DPGAnalysis/Skims                               V01-00-11-01
   addpkg  DataFormats/METReco                             V03-03-11-01
   addpkg  DataFormats/ParticleFlowCandidate               V15-03-04-01
   addpkg  DataFormats/PatCandidates                       V06-05-06-12
   addpkg  DataFormats/StdDictionaries                     V00-02-13-01
   addpkg  DataFormats/TrackReco                           V10-02-02-01
   addpkg  DataFormats/TrackerCommon                       V00-00-08
   addpkg  DataFormats/VertexReco                          V02-00-04-01
   addpkg  FWCore/GuiBrowsers                              V00-00-70
   cvs co -d       HighMassAnalysis/Configuration -r       for537_02182013 UserCode/AlfredoGurrola/HighMassAnalysis/Configuration
   cvs co -d       HighMassAnalysis/Skimming -r            for537_02182013 UserCode/AlfredoGurrola/HighMassAnalysis/Skimming
   addpkg  JetMETCorrections/Type1MET                      V04-06-09-02
   addpkg  PhysicsTools/PatAlgos                           V08-09-62
   addpkg  PhysicsTools/PatUtils                           V03-09-28
   addpkg  RecoBTag/Configuration                          V00-07-05
   addpkg  RecoBTag/ImpactParameter                        V01-04-09-01
   addpkg  RecoBTag/SecondaryVertex                        V01-10-06
   addpkg  RecoBTag/SoftLepton                             V05-09-11
   addpkg  RecoBTau/JetTagComputer                         V02-03-02
   addpkg  RecoLocalTracker/SubCollectionProducers         V01-09-05
   addpkg  RecoMET/METAnalyzers                            V00-00-08
   addpkg  RecoMET/METFilters                              V00-00-13-01
   addpkg  RecoMET/METProducers                            V03-03-12-02
   addpkg  RecoParticleFlow/PFProducer                     V15-02-06
   addpkg  RecoTauTag/Configuration                        V01-04-13
   addpkg  RecoTauTag/RecoTau                              V01-04-25
   cvs co -d       SHarper/HEEPAnalyzer -r                 V00-09-03       UserCode/SHarper/HEEPAnalyzer
   cvs co -d       CMGTools/External -r                    V00-03-04       UserCode/CMG/CMGTools/External
[delannas@vmps09 src]$ source cvs_pkgs.sh | tee cvs_pkgs.log
[delannas@vmps09 src]$ scram build -c
[delannas@vmps09 src]$ scram b -j8 | tee scram.log

[delannas@vmps09 src]$ vi update_py_files.sh
   cd $CMSSW_BASE/src/HighMassAnalysis/Configuration/test/Data_TauTauSkim/
   rm -v hiMassTau_patProd.py
   wget -S https://adelannoy.com/CMS/RA2TAU/VBF-re-PATTuples/hiMassTau_patProd.py
   cd $CMSSW_BASE/src/HighMassAnalysis/Configuration/python/
   rm -v patTupleEventContentForHiMassTau_cff.py
   wget -S https://adelannoy.com/CMS/RA2TAU/VBF-re-PATTuples/patTupleEventContentForHiMassTau_cff.py
   cd $CMSSW_BASE/src/JetMETCorrections/Type1MET/python/
   rm -v pfMETsysShiftCorrections_cfi.py
   wget -S https://adelannoy.com/CMS/RA2TAU/VBF-re-PATTuples/pfMETsysShiftCorrections_cfi.py
   cd $CMSSW_BASE/src/
[delannas@vmps09 src]$ source update_py_files.sh
[delannas@vmps09 src]$ scram build -c
[delannas@vmps09 src]$ scram b -j8 | tee scram.log

Limits with the Higgs Combination Tool

1. Log in to lxplus5.

2. Set the proper architecture:

export SCRAM_ARCH=slc5_amd64_gcc472

or:

setenv SCRAM_ARCH slc5_amd64_gcc472

depending on your shell configuration.

3. Set up the CMSSW release CMSSW_6_1_1 and source the CMS environment in the src directory.

4. Get the tool from GIT:

git clone https://github.com/cms-analysis/HiggsAnalysis-CombinedLimit.git HiggsAnalysis/CombinedLimit

5. Install the tool:

   cd HiggsAnalysis/CombinedLimit
   git pull origin master
   git checkout V03-05-00
   scramv1 b clean
   scramv1 b

6. Detailed information can be found at:

https://twiki.cern.ch/twiki/bin/viewauth/CMS/SWGuideHiggsAnalysisCombinedLimit

7. Please run the simple example for a counting experiment to verify that the tool works:

https://twiki.cern.ch/twiki/bin/viewauth/CMS/SWGuideHiggsAnalysisCombinedLimit#A_simple_counting_experiment

In order to do that, put this information in a text file, e.g. "limit_card_test.txt":

#===================================================================
# Simple counting experiment, with one signal and a few background processes 
# Simplified version of the 35/pb H->WW analysis for mH = 160 GeV

imax 1  number of channels
jmax 3  number of backgrounds
kmax 5  number of nuisance parameters (sources of systematic uncertainties)

# we have just one channel, in which we observe 0 events
bin 1
observation 0

bin              1     1     1     1
process         ggH  qqWW  ggWW  others
process          0     1     2     3
rate           1.47  0.63  0.06  0.22

lumi    lnN    1.11    -   1.11    -    lumi affects both signal and gg->WW (mc-driven). lnN = lognormal
xs_ggH  lnN    1.16    -     -     -    gg->H cross section + signal efficiency + other minor ones.
WW_norm gmN 4    -   0.16    -     -    WW estimate of 0.64 comes from sidebands: 4 events in sideband times 0.16 (=> ~50% statistical uncertainty)
xs_ggWW lnN      -     -   1.50    -    50% uncertainty on gg->WW cross section
bg_others lnN    -     -     -   1.30   30% uncertainty on the rest of the backgrounds

#===================================================================

Run the tool as:

combine -M Asymptotic limit_card_test.txt

You will see the following output on the screen:

//===================================================================

>>> including systematics
>>> method used to compute upper limit is Asymptotic
>>> random number generator seed is 123456
Warning: Bin 1 starts with a digit. Will call it 'bin1' but this may break shapes.
Warning: Bin 1 starts with a digit. Will call it 'bin1' but this may break shapes.
Warning: Bin 1 starts with a digit. Will call it 'bin1' but this may break shapes.
Warning: Bin 1 starts with a digit. Will call it 'bin1' but this may break shapes.
Warning: Bin 1 starts with a digit. Will call it 'bin1' but this may break shapes.
Computing limit starting from observation
Will compute both limit(s) using minimizer Minuit2 with strategy 0 and tolerance 0.01
Median for expected limits: 2.32031
Sigma  for expected limits: 1.18385
Restricting r to positive values.

Make global fit of real data
NLL at global minimum of data: 6.18036 (r = 2.16544e-07)

Make global fit of asimov data
NLL at global minimum of asimov: 6.15916 (r = 3.41342e-06)
At r = 1.200000: q_mu = 3.42894 q_A  = 1.55220 CLsb = 0.02280 CLb  = 0.22567 CLs  = 0.10104
At r = 2.400000: q_mu = 6.68355 q_A  = 4.00453 CLsb = 0.00379 CLb  = 0.25163 CLs  = 0.01505
At r = 1.200000: q_mu = 3.42894 q_A  = 1.55220 CLsb = 0.02280 CLb  = 0.22567 CLs  = 0.10104
At r = 1.794661: q_mu = 5.06194 q_A  = 2.73054 CLsb = 0.00919 CLb  = 0.24027 CLs  = 0.03825
At r = 1.544518: q_mu = 4.38002 q_A  = 2.22248 CLsb = 0.01340 CLb  = 0.23465 CLs  = 0.05711
At r = 1.660897: q_mu = 4.69817 q_A  = 2.45705 CLsb = 0.01123 CLb  = 0.23734 CLs  = 0.04733
At r = 1.610419: q_mu = 4.56037 q_A  = 2.35489 CLsb = 0.01212 CLb  = 0.23619 CLs  = 0.05133
At r = 1.633600: q_mu = 4.62369 q_A  = 2.40173 CLsb = 0.01171 CLb  = 0.23673 CLs  = 0.04945
At r = 1.623486: q_mu = 4.59607 q_A  = 2.38128 CLsb = 0.01189 CLb  = 0.23649 CLs  = 0.05026
At r = 1.628119: q_mu = 4.60872 q_A  = 2.39064 CLsb = 0.01180 CLb  = 0.23660 CLs  = 0.04989

 -- Asymptotic -- 
Observed Limit: r < 1.6281
Expected  2.5%: r < 0.9653
Expected 16.0%: r < 1.4364
Expected 50.0%: r < 2.3203
Expected 84.0%: r < 3.9666
Expected 97.5%: r < 6.5972

//===================================================================

the parameter "r" is the signal strength "number of signal events / number of expected signal events". The r value for "Expected 50%" is the mean expected limit on SignalEvents/ExpectedSignalEvents. To get the limit on the cross-section, multiply the r value for "Expected 50%" by the theoretical cross-section for that point.

8. If there are multiple datacards to be combined into one limit, first produce a combined datacard:

combineCards.py card1.txt card2.txt ... > combined.txt

then run as usual:

combine -M Asymptotic combined.txt

9. To calculate the significance:

combine -M ProfileLikelihood --signif combined.txt
Topic attachments:

  • gluinoPairToNeutralinoToHiggs.cfg (2.1 K, 2012-06-23, AlfredoGurrola)
  • plots.ma5 (5.9 K, 2017-06-01, AlfredoGurrola)
  • sbottomPair.cfg (1.9 K, 2012-06-19, AlfredoGurrola)
  • stauPair.cfg (2.1 K, 2012-06-19, AlfredoGurrola)
  • stauSnu.cfg (1.9 K, 2012-06-23, AlfredoGurrola)
  • stopPair.cfg (1.9 K, 2012-06-19, AlfredoGurrola)