ElectroweakBosons at UMass Titan Cluster

Setting up the code

The framework is built from a set of base packages and scripts that are already compiled and available from a common location (/data/mbellomo/ElectroweakBosons for SLC5 and ~mbellomo/shared/ElectroweakBosons for SLC6).

To build your local environment, execute the first line on SLC5 or the second line on SLC6:

source /data/mbellomo/localSetupElectroweakBosons.sh
source ~mbellomo/shared/localSetupElectroweakBosons.sh

This will create a directory called "ElectroweakBosons" and link the base packages/scripts.

From "ElectroweakBosons" directory, to setup do :

source scripts/setup_umass.sh 

Note that this assumes the "setupATLAS" command is available in your path. If not, see UMassCluster#Account_Setup_to_be_done_only_on
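A quick way to check from the shell, for example (the actual setup procedure is described on the linked page):

type setupATLAS >/dev/null 2>&1 && echo "setupATLAS is available" || echo "setupATLAS not found, see the UMassCluster page"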

The above setup with setup_umass.sh needs to be done every time you create a new terminal session.

Create your own analysis package and tools

The framework is organised in modular "packages". To create your own package, where you can develop your analysis, do:

smultiframe_new_package.sh AnalysisGoal

where "AnalysisGoal" should be a sensible name related to the analysis project you have in mind.

Please note that this name has to be unique within the UMass setup, so check with others. You can also verify that a library with the same name is not already present in "/data/mbellomo/ElectroweakBosons/SFrame/lib"; for instance, a test package called "AnalysisTest" is compiled into the library "libAnalysisTest.so".
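For example, one (unofficial) way to check for a clash from the shell, assuming the lib<PackageName>.so naming convention shown above:

ls /data/mbellomo/ElectroweakBosons/SFrame/lib | grep -i AnalysisGoal

If this prints anything, pick another name.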

To create the skeleton of an analysis tool do:

cd AnalysisGoal
smultiframe_create_tool.py -n MyTool

This will create a skeleton class called "MyTool", together with a skeleton configuration file called "MyTool_config.xml" (see AnalysisGoal/config).
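After this step the package should contain at least the following files (paths as referenced on this page; the generated skeleton may contain additional files):

AnalysisGoal/config/MyTool_config.xml   (tool configuration)
AnalysisGoal/src/MyTool.cxx             (tool implementation)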

From "AnalysisGoal" directory, to compile the code do:

make

Running locally

The configuration file specifies the input files and the tool settings. Have a look at MyTool_config.xml. A suitable input file is, for example:

"root://titan.physics.umass.edu//atlas/common/D3PD/mc12_8TeV/EXOTICS_D3PD/mc12_8TeV.182302.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m1000.merge.NTUP_COMMON.e2227_a188_a219_r3549_p1575_tid01341771_00/NTUP_COMMON.01341771._000001.root.3"

Add it to MyTool_config.xml in place of "ADD_SENSIBLE_PATH_TO_INPUT_FILE". Now you are ready to run!
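For example, the substitution can be done from the shell (a sketch, assuming the placeholder string appears verbatim in the generated file):

INPUT="root://titan.physics.umass.edu//atlas/common/D3PD/mc12_8TeV/EXOTICS_D3PD/mc12_8TeV.182302.MadGraphPythia8_AU2CTEQ6L1_RSG_hh_bbbb_m1000.merge.NTUP_COMMON.e2227_a188_a219_r3549_p1575_tid01341771_00/NTUP_COMMON.01341771._000001.root.3"
sed -i "s|ADD_SENSIBLE_PATH_TO_INPUT_FILE|$INPUT|" config/MyTool_config.xml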

To run locally do:

sframe_main config/MyTool_config.xml

An output file will be produced in the directory from which you run this command.

Check its content and look into AnalysisGoal/src/MyTool.cxx to understand how it works. Each analysis tool derives from the AnalysisBase/AnalysisToolBase class, which provides the basic functions (e.g. book and fill for histograms).

Running on Titan cluster with PROOF

To run with PROOF on the Titan cluster we make use of the PROOF on Demand (PoD) utility. For more details see http://pod.gsi.de/

PoD allows you to create a PROOF cluster on demand, tailored to specific analysis needs.

To get this up and running you need to start a PoD server locally, then activate a given number of workers on the cluster and finally, when they are ready, run your job on them. This is simpler than it sounds: PoD is already supported by the ATLAS setup, and the framework, being based on SFrame, has built-in support for PROOF.
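In practice, a full PROOF session boils down to the handful of commands described in the subsections below, roughly:

pod-server start                        # start the PoD server (if not already running)
pod-submit -r pbs -n 5 -q long          # request workers from the PBS batch system
pod-info -n                             # check that the workers are up
sframe_main config/MyTool_config.xml    # run the job (with the xml switched to "PROOF")
pod-server stop                         # shut down the PROOF cluster when done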

Prepare the PROOF environment (needed only once!)

Start and stop the PoD server once, so that the PoD configuration directory gets created:

pod-server start
pod-server stop

These commands can be given from any location; they create the directory $HOME/.PoD. Add to this directory a file called "user_xpd.cf" containing these lines:

xpd.rootd allow
xpd.putenv ROOTPROOFLIBDIR=/data/mbellomo/ElectroweakBosons/SFrame/lib

ROOTPROOFLIBDIR points to the location of the compiled libraries. With this variable set, PROOF loads the pre-compiled libraries instead of re-compiling the code on each worker.
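For example, the file can be created in one go from the shell (the two configuration lines are exactly those listed above):

cat > $HOME/.PoD/user_xpd.cf << 'EOF'
xpd.rootd allow
xpd.putenv ROOTPROOFLIBDIR=/data/mbellomo/ElectroweakBosons/SFrame/lib
EOF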

Start the PoD server (every time you need to create a PROOF cluster and a PoD server is not running)

To check if a PoD server is running do:

pod-server status

To start-up the server do:

pod-server start

Now "pod-server status" should give you something like:

XPROOFD [639] port: 21001
PoD agent [663] port: 22001
PROOF connection string: mbellomo@titan.physics.umass.edu:21001

Activate the workers of the PROOF cluster (every time you need to create a PROOF cluster)

Workers are activated sending jobs to the PBS batch system of the cluster with a dedicated PoD command:

pod-submit -r pbs -n 5 -q long

This send to "pbs" system the request for "5" workers in the "long" queue.

Note that the workers are kept alive at most for the time allowed by the queue; moreover, if the PROOF cluster is not used for 30 minutes, they are shut down automatically to avoid keeping resources busy unnecessarily. If they are no longer available, they can be re-activated with the above command; multiple calls in the same server session are possible.

Check how many workers are ready with:

pod-info -n

This returns the number of workers available to your job on the PROOF cluster. Note that when "sframe_main" is executed, only the workers available at that moment are used; workers that become available later are not picked up, so check how many workers are ready before starting the job.
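For example, a simple way to wait until all requested workers (5 in the example above) are up before launching the job, assuming pod-info -n prints a plain number:

while [ "$(pod-info -n)" -lt 5 ]; do
    echo "waiting for workers..."
    sleep 30
done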

Run the analysis on the PROOF cluster

To run on the cluster, edit the XML configuration file (e.g. MyTool_config.xml) and change "LOCAL" to "PROOF".
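For example, assuming the run mode appears in the configuration file as the literal string "LOCAL", the switch can be done with:

sed -i 's|LOCAL|PROOF|' config/MyTool_config.xml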

Run as before with:

sframe_main config/MyTool_config.xml

The job is sent to the available workers, the input is split across them, and the outputs are merged and sent back to the local directory.

Shut down the PROOF cluster

To shut down the workers do:

pod-server stop

The workers are also shut down automatically after 30 minutes of inactivity.

More info

An old tutorial is available locally as ElectroweakBosons/doc/tutorial.pdf. An updated version of the user guide is available in the package as ElectroweakBosons/trunk/doc/userguide.pdf; see the SVN link for the latest version.
