CoEPPNtupGen is a C++ package which incorporates all the necessary ATLAS tools to produce derived ntuples that can be easily analysed.


  • CoEPPNtupGen - header files
  • config - base xml configuration stubs
  • misc - miscellaneous files
  • root - configuration ROOT files (e.g. histograms for pileup reweighting)
  • scripts - command line tools (added to path in …)
  • share - python scripts designed to be modified for custom use (helpful for setting up sample configurations)
  • src - source files

Latest News


Follow these steps to install:
  1. Follow the instructions to set up the CoEPP suite here
  2. Set up athena (the release is not critical, although 17.0.2 is tested; in principle the package can be compiled and run against any recent version of ROOT compiled against python, without an athena install). If you haven't set up athena before, look here.
  3. In the CoEPPDir, check out the LATEST tag of CoEPPNtupGen (you should check which tag is latest in SVN):
    • svn co svn+ssh:// CoEPPNtupGen
  4. Check out dependencies and build the package. This can all be done by issuing these commands:
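The actual command listing appears to be missing from this page. As a purely hypothetical sketch (the real build targets and checkout mechanism may well differ), a checkout-and-build sequence from the package top directory would look something like:

```shell
# Hypothetical sketch -- the actual CoEPPNtupGen build commands may differ.
cd CoEPPNtupGen
make checkout   # assumed target: checks out the SVN dependencies
make            # build the package and the runAnalysis executable
```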

Note: It is highly recommended that you set up ssh keys (see here) before issuing the make command, to avoid entering your password twice for every checkout. Also note that the checkout WILL NOT WORK if your local username and lxplus username are different and you have not configured this in .ssh/config. This file should have an entry something like this:

Host svn
User wdavey
GSSAPIAuthentication yes
GSSAPIDelegateCredentials yes
Protocol 2
ForwardX11 no
where User is your lxplus username.


The code is run with the executable:
./runAnalysis [options] XML_CONFIG
where XML_CONFIG is a mandatory xml config file. Use ./runAnalysis --help for a list of command line options.


As an example we will run over a skimmed D3PD. We will use skims from release 17 D3PDs with the new 'tau' TTree (as opposed to the old 'tauPerf' TTree). Either use the skim you made in CoEPPGridTools#Skimming_Example, or use dq2-get to retrieve user.wdavey.Skim.Ztautau.r17default.TESSkim_v3/ from the grid (dq2 instructions here).

Single Local File Example

To run over a single file execute this command:
./runAnalysis --isMC --files <file> run-example.xml
This should produce an ntuple called CoEPPNtup.example.root.

Note: <file> can be a comma-separated list of input files. This is typically used for grid running; there are more convenient ways to configure running over multiple files locally.
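For instance (the skim file names here are placeholders):

```shell
# Placeholder file names -- substitute your own input files.
./runAnalysis --isMC --files skim.01.root,skim.02.root run-example.xml
```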

Multiple Local File Example

The easiest way to run over multiple files locally is to generate an xml file with the list of input files. To do this, first make sure you have set up CoEPPGridTools. Then issue the following command:
genInputXML -o input.files.xml -c -t tau PATH/TO/FILES/*.root*
where the -c and -t tau options enable checking of the tau TTree in the input files. Note that PATH/TO/FILES/*.root* may look something like this: /lustre/user/wedavey/data/Skims/TestTESSkims/user.wdavey.Skim.Ztautau.r17default.TESSkim_v3.111120200140/*.root*
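As far as this page shows, the generated input file is simply a list of <In> elements, one per input file. A minimal hypothetical Python sketch of that step is below; it is a stand-in for genInputXML, not its real implementation (the <In FileName=... Lumi=.../> element format is taken from the run-config snippet on this page, and the real tool additionally performs TTree checks that this sketch omits):

```python
import glob

def write_input_xml(pattern, outfile):
    """Write a simple input-file XML fragment: one <In> element per file.

    Hypothetical stand-in for genInputXML; the <In FileName=... Lumi=.../>
    element format follows the run-config snippet on this page. No TTree
    checking is performed here.
    """
    files = sorted(glob.glob(pattern))
    with open(outfile, "w") as f:
        for path in files:
            f.write('<In FileName="%s" Lumi="1."/>\n' % path)
    return len(files)

# Example: write_input_xml("PATH/TO/FILES/*.root*", "input.files.xml")
```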

Now you must include this list of files in the XML_CONFIG file. First, import the new file into the main config by making sure the top of your run-example.xml contains a line:

<!ENTITY input_files SYSTEM "input.files.xml"> 
that points to this new file.

Then you need to include this in the main config body. Do this by replacing the existing input line:

<In FileName="PATH/TO/FILE/file.root" Lumi="1."/>

with the entity reference &input_files;.
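Putting the two pieces together, a minimal sketch of the config skeleton looks like the fragment below. Only the ENTITY declaration and the &input_files; reference are taken from this page; the root element name and surrounding structure are placeholders, so keep whatever your run-example.xml actually uses:

```xml
<?xml version="1.0"?>
<!-- Placeholder skeleton: root element "Config" is assumed, not from this page. -->
<!DOCTYPE Config [
  <!ENTITY input_files SYSTEM "input.files.xml">
]>
<Config>
  &input_files;
</Config>
```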

Now simply execute the new config like this:

./runAnalysis run-example.xml

Full Local Analysis

It would be possible to manually create input file configs for each of the datasets you wish to run over; however, a couple of scripts make this task much easier. If you want to try this whole example, you can download the full TES analysis skim user.wdavey.Skim.r17default.TESSkim_v2/ (~80GB).

Generating Input File Config with share/

(Unfortunately this script is a little more complicated, since there are both 'tau' and 'tauPerf' D3PDs; this should be simplified in the future.)

To configure share/ you should:

  • set indir to the path where you downloaded the skims.
  • make an output dir and set it to outdir (here it's just called runTES).
  • comment out or include all the datasets you want to use.
Then from the CoEPPNtupGen top dir execute the script:
python share/
This should generate all the input files in the outdir.

Generating Run XML with share/

Again, this script is complicated by the fact that at the moment we have ZtautauAlpgen and Embedded samples with the old tauPerf tree. If you prefer, you can simplify it by removing these. The most important things to configure are:

  • indir - should be the dir where your input file configs are
  • outdir - this is where your output ntuples will be saved (fine to have it the same as the config dir)
  • runtag - the run scripts will have the form run--.xml (useful if you have multiple configurations you are running)
  • Then comment out or include the datasets so they match your input dataset configs.
Again, from the CoEPPNtupGen top dir execute the script:
python share/
This should generate all the run config files.

Executing Run XML

If you have access to a PBS queue, you can easily launch all the jobs at once. First edit the batch script and change the following lines to perform the appropriate athena configuration on your platform:

## Athena Config 
source .bashrc
source setAth
Then launch the jobs like this:
launchPBS run.TES.*.xml


-- WillDavey - 20-Nov-2011
