MelAnalysis tutorial

Installation

Create a Workdir directory.

Create a setup.sh file inside it with the following lines:

    export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
    source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
    lsetup rucio
    lsetup panda
    rcSetup Base,2.4.20
    rc find_packages
    rc compile

Download the MelAnalysis package from SVN:

    svn co svn+ssh://svn.cern.ch/reps/atlasinst/Institutes/Bern/SusyAnalysis/MelAnalysis/trunk

or

    rc checkout_pkg atlasinst/Institutes/Bern/SusyAnalysis/MelAnalysis/trunk/

Source the setup.sh file.
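
For example, assuming setup.sh sits directly inside Workdir as described above:

    cd Workdir
    source setup.sh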

Description

This package follows the same structure as the other SUSYAnalysis packages. It contains five folders and one file:

  1. scripts: the Python run scripts are stored here, together with a folder, called SHandlers, that contains the soft links to the grid files we want to run on. To add links to the list, just run the MakeSampleHandler script (see also the worked example after this list):

    python MakeSampleHandler.py --inputDS <name of the container>

    Keep in mind that the soft links point directly to the files contained in the container at the time you create the link, so if a container is updated on the grid you will need to remove the link to that dataset from the SHandlers directory and recreate it. There are two Python run scripts: Run.py and GridRun.py; the first runs locally, the second on the grid.
  2. data: the ilumicalcfile.root, the GRL file, the prw.root and the configuration file used by SUSYTools (containing the object definition, cuts, etc.) are saved in the data directory. These files might be out of date. Moreover, the GRL is no longer applied in this code, so that it can instead be applied directly in the analysis code that runs locally. Please check that the files fulfil the latest recommendations.
  3. cmt: the Makefile of the package is in cmt; the PACKAGE_DEP line inside the file lists all the dependencies on the SUSYAnalysis packages
  4. Root: here the .cxx files of the main package classes are saved, as well as the LinkDef.h file that, together with the Makefile, allows the compilation of the package
  5. MelAnalysis: here the headers of the main package classes are saved
  6. README is the not-really-up-to-date readme file of the package :p
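
For example, to create the soft links in SHandlers for a given grid container (the container name below is just a placeholder, not a real dataset):

    python MakeSampleHandler.py --inputDS user.someuser.MyContainer_v1/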

Run the code

The Python run scripts can be used like this:

    python Run.py/GridRun.py --options

The options are the following:

  • --submitDir: name of the directory to store the output
  • --inputDS: input dataset container name
  • --driver: choices=("direct", "prooflite", "grid") to choose whether you want to run locally with the direct driver, locally in parallel, or on the grid; if the inputDS is on the grid, you can still choose to run locally as well (via FAX)
  • --nevents: number of events to process for all the datasets
  • --skip-events: skip the first n events
  • --overwrite: overwrite previous output folder
  • -s/--syst: compute systematic variations
  • -t/--truth: add truth information
  • -m/--onlytruth: only truth information (dedicated to custom skimming of the n-tuples)
  • -l/--onelepton: one lepton skim
  • -d/--debug: activate DEBUG mode (useful when running locally)
  • -o/--optimization: adds optimization variables
  • -n/--neutrinification: adds the neutrinification of one lepton
  • --leptonization: adds leptonization
  • --isData: to be used in case you are running on data
  • --isAtlfast: to be used if you are running over ATLFAST datasets
  • --processID: to be used if you are running over signals, to retrieve the correct cross section
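
As an illustration, a local test run and a grid submission could look like the lines below (the submitDir names and dataset placeholders are hypothetical; check the scripts' option parsing for the exact spelling):

    python Run.py --submitDir localTest --inputDS <container linked in SHandlers> --driver direct --nevents 1000 --debug
    python GridRun.py --submitDir gridProd --inputDS <grid container name> --driver grid --syst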

Classes: MyxAODAnalysis and OutTree

The output of the MelAnalysis code is an n-tuple containing all the information useful for the analysis. Each n-tuple stores a histogram containing the weighted number of events (an n-tuple is generated for each input file, so the total weight refers to that particular file) and a tree. For each systematic variation, and for the nominal case, we have dedicated branches in the tree for all the interesting variables.

The filling of all the branches is handled by the OutTree class. In the MelAnalysis initialization we define a mem_leaker object of OutTree type and set all the properties that remain unchanged across systematics. The mem_leaker creates the branches of the tree with dedicated names depending on the list of systematics we are running on. In the execute step, for each event, a loop calls the mem_leaker->process function for each systematic (and for the nominal). At each step we pass all the systematic-dependent information to the mem_leaker with a structure of type myAnalysisCollections (called myCollections). This includes the object containers, since the number or the characteristics of the leptons/photons/jets may change depending on the systematic variation considered. The process function performs the full cut-and-count preselection (with the required skimming) and returns an integer different from 0 if the event under study passed the selection. In MelAnalysis we have a counter that adds up the mem_leaker results over all systematic variations. If the event passes the requirements for at least one of the systematic variations, the branch values for that event are saved. A schematic sketch of this flow is given below.
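
The sketch below is a minimal, purely illustrative C++ outline of this flow; apart from OutTree, process, myAnalysisCollections, myCollections and mem_leaker, the names, fields and signatures are assumptions and the real code in Root/ will differ:

    // Illustrative sketch only: fields and signatures are assumptions, not the actual MelAnalysis code.
    #include <string>
    #include <vector>

    struct myAnalysisCollections {
      std::string systName;   // which variation these inputs belong to ("" used for nominal in this sketch)
      // ... calibrated electrons, muons, jets, photons and event weights for this variation ...
    };

    class OutTree {
    public:
      // In the real class this runs the cut-and-count preselection for one variation,
      // fills the corresponding branches, and returns non-zero if the event passes.
      // Stubbed here so that the sketch is self-contained.
      int process(const myAnalysisCollections& /*collections*/) { return 0; }
      // Hypothetical hook that commits the buffered branch values for the event.
      void saveEvent() {}
    };

    // Schematic of the per-event logic in the execute step:
    void executeEvent(OutTree* mem_leaker, const std::vector<std::string>& systList) {
      int nPassed = 0;                              // adds up the mem_leaker results over nominal + variations
      for (const std::string& syst : systList) {
        myAnalysisCollections myCollections;
        myCollections.systName = syst;
        // ... retrieve/calibrate the objects for this variation and fill myCollections ...
        nPassed += mem_leaker->process(myCollections);
      }
      if (nPassed > 0) mem_leaker->saveEvent();     // keep the event if at least one variation passed
    }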

List of ntuple productions:

March-02-2016: ProductioN31 (p24<249*, rcSetup SUSY,2.3.41, SUSYTools-00-07-29, MuonEfficiencyCorrections-03-02-05, LooseLLH for baseline electrons)

-- MariaElenaStramaglia - 2016-03-02
