DPCC and how to use it for ATLAS analysis work

Logging in
Logging in to DPCC:

ssh grid.dpcc.uta.edu

or any of the other gateway nodes. master and atlas are alternative entry points, but should not be used for running jobs; catalog is reserved mainly for xrootd transactions from outside the cluster.

Setting up your account

Some useful items to have in your account:

  • A .bashrc file with the following entries added:
export PATH=/data73/atlas/python/2.5.2/bin:/data71/atlas/root/5.20/bin:$PATH
export LD_LIBRARY_PATH=/data73/atlas/python/2.5.2/lib:/data71/atlas/root/5.20/lib/root:$LD_LIBRARY_PATH
export PYTHONPATH=/data71/atlas/root/5.20/lib/root:$PYTHONPATH
export LCG_GFAL_INFOSYS=lcg-bdii.cern.ch:2170
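
These lines prepend the cluster's Python and ROOT installations to your search paths; the shell scans PATH left to right, so the entries added first shadow the system versions. A quick check of the mechanism, using the first path from the .bashrc lines above:

```shell
# Prepend a directory to PATH, as the .bashrc lines above do.
export PATH=/data73/atlas/python/2.5.2/bin:$PATH

# The first entry wins lookups, so it should be the one we just added.
echo "$PATH" | cut -d: -f1   # prints /data73/atlas/python/2.5.2/bin
```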

Setting up ssh keys between the nodes is also convenient.
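
The key setup can be sketched as follows, assuming (as is typical for Rocks clusters like DPCC) that home areas are shared across the nodes, so authorizing your own key once is enough:

```shell
# Generate a passphrase-less key pair in a scratch directory first,
# so nothing already in ~/.ssh is clobbered by accident.
KEYDIR=$(mktemp -d)
ssh-keygen -q -t rsa -N "" -f "$KEYDIR/id_rsa"

# With shared home areas, installing the key once authorizes it on all nodes:
#   mkdir -p ~/.ssh
#   cat "$KEYDIR/id_rsa.pub" >> ~/.ssh/authorized_keys
#   chmod 600 ~/.ssh/authorized_keys

ls "$KEYDIR"   # id_rsa  id_rsa.pub
```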

Using Athena
Now that your account is ready, you can enable Athena and test that it works. Follow these instructions when you want to set up Athena after logging in.

These instructions are for Athena release 14.2.25, but they are easily adapted to other releases.

Create a directory in your home area, as follows:

mkdir -p ~/Athena/14.2.25
cd ~/Athena/14.2.25
mkdir -p run

cat << 'EOF' > setup.sh
RELEASE=14.2.25     # edit for other releases
source /data71/atlas/Releases/${RELEASE}/cmtsite/setup.sh -tag=${RELEASE},runtime
export CMTPATH=`pwd`:$CMTPATH
export TestArea=`pwd`
source /data71/atlas/Releases/${RELEASE}/AtlasProduction/${RELEASE}/AtlasProductionRunTime/cmt/setup.sh
EOF

This created a directory named after the Athena release, a run directory inside it, and a setup script. Because the script evaluates `pwd` only when it is sourced, it is portable: you can make a directory anywhere, for any release newer than 13, copy the script into it, and adjust the release number to set up for Athena.

Run the script as follows:

source setup.sh

and when it completes, you can run a test job:

cd run; get_files HelloWorldOptions.py
athena HelloWorldOptions.py

You should see a lot of text scrolling down your screen, ending with:

HelloAlg            FATAL A FATAL error message
HelloAlg             INFO Let the tool MyPublicHelloTool say something:
ToolSvc.HelloTool    INFO my message to the world: hi there!
HelloAlg             INFO Let the tool MyPrivateHelloTool say something:
HelloAlg.HelloTool   INFO my message to the world: secret!
AthenaEventLoopMgr   INFO   ===>>>  end of event 9    <<<===
HelloAlg             INFO endRun()
HistorySvc           INFO Service finalised successfully
HelloAlg             INFO finalize()
EventSelector        INFO finalize
StoreGateSvc         INFO Finalizing StoreGateSvc - package version StoreGate-02-20-04
DetectorStore        INFO Finalizing DetectorStore - package version StoreGate-02-20-04
ToolSvc.finalize()   INFO Removing all tools created by ToolSvc
StatusCodeSvc        INFO initialize
ApplicationMgr       INFO Application Manager Finalized successfully
ApplicationMgr       INFO Application Manager Terminated successfully
Py:Athena            INFO leaving with code 0: "successful run"

This indicates that Athena is working fine, and ready for you to use.
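
That final Py:Athena line doubles as a convenient success marker when jobs run in batch. A minimal sketch of checking a log for it (the log file here is a mock-up containing the sample line above; the file name is illustrative):

```shell
# Mock up a log file ending with Athena's exit line.
cat << 'EOF' > hello.log
ApplicationMgr       INFO Application Manager Terminated successfully
Py:Athena            INFO leaving with code 0: "successful run"
EOF

# grep exits 0 when the marker is present, so this works in scripts too.
grep -q 'leaving with code 0' hello.log && echo "job succeeded"   # prints: job succeeded
```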

Setting up SPyROOT

(This will not work if Athena has already been initialized.)

Simply execute

source /Atlas/SPyRoot/setup_dpcc.sh

and then launch SPyRoot to begin using it.

If you wish to set up SPyRoot to work with ARA, please modify the command to

source /Atlas/SPyRoot/setup_ara_dpcc.sh

Setting up ROOT and using it for class (Physics 5391, Fall 2008)

Make sure your .bashrc is set up as mentioned above. This will tell your command line where ROOT is, and how to launch it.

At the command line, type root and press enter.

We'll now load ROOT files in a way that lets ROOT handle a number of files at the same time.

At the ROOT prompt:

TChain *c1 = new TChain("Truth0")

This creates a TChain, which takes a set of ROOT files, extracts the Truth0 tree (if it exists) from each file, and strings all the Truth0 trees together so they can be processed as a unit. To be precise, c1 is a pointer to the TChain we have created (that's why it's declared as TChain *c1).

To add files to the TChain, we do the following:

c1->Add("/data73/atlas/data/Class/J3/AANT0/*")

This adds all files in the directory /data73/atlas/data/Class/J3/AANT0/ on DPCC.

We can also chain the same files in another TChain that picks up the FullRec0 TTrees:

TChain *c2 = new TChain("FullRec0")
c2->Add("/data73/atlas/data/Class/J3/AANT0/*")

With these steps done, you can check the branch structure and contents of the chain with a call like:

c1->Print()
or look at the values of a variable:

c1->Scan("El_phi","El_phi > 0")

You can also open files individually:

TFile *f1 = new TFile("/data73/atlas/data/Class/J3/AANT0/user08.AldenStradling.mc08.005012.J3_pythia_jetjet.recon.DPD.e323_s400_d99_r474.3.AANT0._00001.root")

You can use the trees (Truth0 and FullRec0) contained in the file directly by name (ROOT makes them available automatically), or you can retrieve them explicitly, as follows:

TTree *t1 = (TTree *) f1->Get("Truth0") 
TTree *t2 = (TTree *) f1->Get("FullRec0") 

DPCC Node List

Gateway Nodes
  • master.local
  • grid.local
  • atlas.local
  • catalog.local

Compute Nodes

These can be accessed by short alias; c0-1, for example, reaches compute-0-1.

  • compute-0-0.local
  • compute-0-1.local
  • compute-0-2.local
  • compute-0-3.local
  • compute-0-4.local
  • compute-0-5.local
  • compute-0-6.local
  • compute-0-7.local
  • compute-0-8.local
  • compute-0-9.local
  • compute-0-10.local
  • compute-0-11.local
  • compute-0-12.local
  • compute-0-13.local
  • compute-0-14.local
  • compute-0-15.local
  • compute-0-16.local
  • compute-0-17.local
  • compute-0-18.local
  • compute-0-20.local
  • compute-0-21.local
  • compute-0-22.local
  • compute-0-23.local
  • compute-0-24.local
  • compute-0-25.local
  • compute-0-26.local
  • compute-0-27.local
  • compute-0-28.local
  • compute-0-29.local
  • compute-0-30.local
  • compute-0-31.local
  • compute-0-32.local
  • compute-0-33.local
  • compute-0-34.local
  • compute-0-36.local
  • compute-0-37.local
  • compute-0-38.local
  • compute-0-39.local
  • compute-0-40.local
  • compute-0-41.local
  • compute-0-42.local
  • compute-0-43.local
  • compute-0-44.local
  • compute-0-45.local
  • compute-0-46.local
  • compute-0-47.local
  • compute-0-48.local
  • compute-0-49.local
  • compute-0-50.local
  • compute-0-51.local
  • compute-0-52.local
  • compute-0-53.local
  • compute-0-54.local
  • compute-0-55.local
  • compute-0-56.local
  • compute-0-57.local
  • compute-0-58.local
  • compute-0-59.local
  • compute-0-60.local
  • compute-0-61.local
  • compute-0-62.local
  • compute-0-63.local
  • compute-0-64.local
  • compute-0-65.local
  • compute-0-66.local
  • compute-0-67.local
  • compute-0-68.local
  • compute-0-69.local
  • compute-0-70.local
  • compute-0-71.local
  • compute-0-72.local
  • compute-0-73.local
  • compute-0-74.local
  • compute-0-75.local
  • compute-0-76.local
  • compute-0-77.local
  • compute-0-78.local
  • compute-0-79.local
  • compute-1-1.local
  • compute-1-3.local

Disk Servers

  • raid1.local
  • raid3.local
  • raid4.local
  • raid5.local
  • raid6.local
  • raid7.local
  • raid8.local
  • raid9.local
  • raid10.local
  • xrdb.local

-- AldenStradling - 26 May 2008

Topic revision: r6 - 2009-03-12 - AldenStradling