Offline Simulation Software CMSSW_4_2_4 - getting started

This document explains how to use the TOTEM offline software.

IMPORTANT Documentation on how to run CMSSW 3_1_1 can be found here

IMPORTANT CMSSW_4_2_4 is not compatible with SLC6. After logging in to lxplus via ssh, you will by default be redirected to nodes running SLC6. In order to use SLC5 on lxplus, login to

For experts

If you know what to do, use the following instructions (if not, read the whole page):

export STAGE_HOST=castorpublic
export STAGE_SVCCLASS=default
export SCRAM_ARCH=slc5_amd64_gcc434
source /afs/
scram project CMSSW CMSSW_4_2_4
svn co svn+ssh:// CMSSW_4_2_4/src/
cd CMSSW_4_2_4
bash src/
scram setup castor
scram tool info castor
cd src
eval `scram runtime -sh`
scram b -j 15

If an error occurs, rerun the following command until everything builds OK (to exit, press Ctrl-C).



scram b -j 4
scram b
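The "rerun until everything is OK" advice above can be wrapped in a small loop. Below is a sketch: the retry helper and the attempt limit are my own additions, not part of the official recipe.

```shell
# Hypothetical helper: rerun a command until it succeeds, up to a given
# number of attempts (press Ctrl-C to abort earlier).
retry() {
    local max=$1; shift
    local n=1
    until "$@"; do
        if [ "$n" -ge "$max" ]; then
            echo "giving up after $n attempts" >&2
            return 1
        fi
        n=$((n + 1))
        echo "retrying ($n/$max)..." >&2
    done
}

# Example: keep rebuilding until the build is clean
# retry 10 scram b -j 4
```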



Let us login via ssh to some remote machine. It might be lxplus, but we will use our own PC (4 processors, idle most of the time).

ssh -X

My account on pctotem31 (grzanka) differs from my normal CERN account (lgrzanka), so I will request a Kerberos ticket in order to have access to the distributed AFS disk space.

kinit lgrzanka


Let us create a temporary directory:

mkdir -p tmp/offlineSWTest
cd tmp/offlineSWTest

When working on SLC4 nodes, this is necessary:

export SCRAM_ARCH=slc5_amd64_gcc434
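Before going further it is worth confirming that the architecture variable is really set; a minimal check (the expected value comes from the line above):

```shell
# Set the SCRAM architecture for SLC5 / gcc 4.3.4 builds and verify it
export SCRAM_ARCH=slc5_amd64_gcc434
echo "SCRAM_ARCH=$SCRAM_ARCH"
```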

We have to load the default set of environment variables necessary to work with CMSSW. The script containing these variables is stored in the directory where the CMSSW framework and all necessary libraries are installed. There are three places where you can find the CMSSW_4_2_4 framework installed:

| *Machine* | *Directory* | *Comments* |
| lxplus, lxbatch, machines with AFS access | /afs/ | CMS installation |
| lxplus, lxbatch, machines with AFS access | /afs/ | TOTEM installation |
| local machines (e.g. pctotem31) | /usr/local/cmssw_slc5/ | TOTEM local installation |

Let us use the default CMS installation; type in bash:

source /afs/

Note the difference in this command vs CMSSW 3_1_1: previously it was source /afs/ (now sw is missing)!

Now we have access to the scram command. We can check that by typing:

scram help


We will compile our software from sources.

First we initialize CMSSW project area:

scram project CMSSW CMSSW_4_2_4

After that, a workspace called CMSSW_4_2_4 is created; this is the default name for the working directory. It is also possible to specify another directory as your workspace by:

scram project -n <your_given_name> CMSSW CMSSW_4_2_4

We can see that after that command the following directory structure emerged:

[pctotem31] /home/grzanka/tmp/offlineSWTest > ls CMSSW_4_2_4/
bin  cfipython   config   doc  include  lib  logs  objs  python  src  test  tmp
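As a quick sanity check, one can verify that the expected subdirectories really exist; a small sketch (the helper name and the directory list checked are my own choice, not an official tool):

```shell
# Hypothetical helper: verify that a scram project area has the expected layout
check_project_area() {
    local area=$1 d
    for d in src lib bin config; do
        if [ ! -d "$area/$d" ]; then
            echo "missing: $area/$d" >&2
            return 1
        fi
    done
    echo "$area looks like a valid project area"
}

# Usage after 'scram project CMSSW CMSSW_4_2_4':
# check_project_area CMSSW_4_2_4
```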

In CMSSW_3_1_1 it was necessary (see this page) to use a special version of Geant4 due to a bug in the library. Now we can safely use the default version of Geant4.

Our sources shall go to the CMSSW_4_2_4/src subdirectory, so let us first check out the code from the TOTEM SVN into that directory. If you have the source code already stored in another place, do not try to make any links from CMSSW_4_2_4/src there, as some strange unexpected problems might appear.

If you are using the same username as on the lxplus machine, you can safely omit the username in the SVN path (use simply svn+ssh://

[pctotem31] /home/grzanka/tmp/offlineSWTest > svn co svn+ssh:// CMSSW_4_2_4/src/
[pctotem31] /home/grzanka/tmp/offlineSWTest > ls CMSSW_4_2_4/src/
Configuration  EventFilter  L1TriggerTotem  SimDataFormats  SimTotem       TotemBackground     TotemDQM        TotemT1T2Validation
DataFormats    Geometry     RecoTotemRP     SimG4CMS       TotemAlignment  TotemCondFormats     TotemRawData        Visualisation
Documentation  IOMC       RecoTotemT1T2   SimG4Core       TotemAnalysis   TotemDatabaseService  TotemRPValidation
[pctotem31] /home/grzanka/tmp/offlineSWTest > cd CMSSW_4_2_4/

Some example configuration files are in CMSSW_4_2_4/src/Configuration/TotemStandardSequences/test/

An old, unsupported CASTOR client is included in CMSSW_4_2_4, so in order to use CASTOR with CMSSW you should edit the file config/toolbox/slc5_amd64_gcc434/tools/selected/castor.xml so that it looks like this:

<tool name="castor" version="">
  <lib name="shift"/>
  <lib name="castorrfio"/>
  <lib name="castorclient"/>
  <lib name="castorcommon"/>
  <environment name="CASTOR_BASE"/>
  <environment name="INCLUDE" default="$CASTOR_BASE/include"/>
  <environment name="LIBDIR" default="$CASTOR_BASE/lib"/>
</tool>


scram setup castor
scram tool info castor
cd src

Now let us start the compilation. Our machine has 4 processors, so we can run the compilation in 8 threads to speed up the process. Still, this will take some time (~10-15 minutes).

scram build (or scram b)
[ scram b -j 8 is another option to run the build, which will use 8 threads to compile the code. Use with caution, as this could slow down the machine for other users. ]
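If you are unsure how many threads to use, the CPU count can be read from the system. A sketch follows; the factor of 2 mirrors the 4-CPU/8-thread choice above, and the scram call is left commented out:

```shell
# Hypothetical: derive the number of build threads from the CPU count
NCPU=$(nproc 2>/dev/null || grep -c ^processor /proc/cpuinfo)
JOBS=$((NCPU * 2))
echo "would build with $JOBS threads"
# scram b -j "$JOBS"
```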

Compilation on SLC4 has not yet been fully tested. Some incompatibilities related to different versions of the GLIBC libraries may appear.

It might happen that after compilation the following (or a similar) error appears:

2b51ae2e8000-2b51ae2ee000 rw-p 2b51ae2e8000 00:00 0 
2b51ae2ee000-2b51af400000 r-xp 00000000 00:19 1495445828                 /afs/
2b51af400000-2b51af603000 rw-p 01112000 00:19 1495445828                 /afs/
2b51af603000-2b51af908000 rw-p 2b51af603000 00:00 0 
2b51af908000-2b51af910000 r-xp 00000000 00:19 1495445830                 /afs/
2b51af910000-2b51af911000 rw-p 00007000 00:19 1495445830                 /afs/
2b51af911000-2b51af951000 r-xp 00000000 00:19 1495445832                 /afs/
2b51af/bin/sh: line 10: 12477 Aborted                 EdmPluginRefresh lib/slc5_amd64_gcc434
gmake[1]: *** [lib/slc5_amd64_gcc434/.edmplugincache] Error 134
gmake[1]: *** Waiting for unfinished jobs....
gmake[1]: Leaving directory `/home/grzanka/tmp/offlineSWTest/CMSSW_4_2_4'
gmake: *** [src] Error 2

This error is related to a problem in the EdmPluginRefresh code. The scram process might be hanging; kill it with Ctrl-C before doing the next steps. First be sure to be in some subdirectory of CMSSW_4_2_4. Then set up the runtime environment:

eval `scram runtime -sh`

Now we go back to the ~/tmp/offlineSWTest directory (you should now be able to run the cmsRun and root commands, which will be useful later):

cd ~/tmp/offlineSWTest
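To make sure the runtime environment really took effect, one can check that the needed commands are on PATH; a minimal sketch (the helper function is hypothetical, not part of CMSSW):

```shell
# Hypothetical helper: check that the given commands are available on PATH
check_commands() {
    local cmd missing=0
    for cmd in "$@"; do
        if ! command -v "$cmd" >/dev/null 2>&1; then
            echo "$cmd not found - re-run: eval \`scram runtime -sh\`" >&2
            missing=1
        fi
    done
    return $missing
}

# Usage:
# check_commands cmsRun root
```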

In case there was an error after the compilation, execute the following command (several times if necessary):

scram b


Let us take an example configuration file:


Check the number of events to generate:

process.maxEvents = cms.untracked.PSet(
    input = cms.untracked.int32(10)
)
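The event count can also be changed non-interactively before running; a sketch using sed (the helper and the config file name in the usage line are illustrative, since the actual path is not shown here):

```shell
# Hypothetical: set the number of generated events in a CMSSW config file
set_max_events() {
    local cfg=$1 n=$2
    # replace the argument of cms.untracked.int32(...) in the file
    sed -i "s/int32([0-9][0-9]*)/int32($n)/" "$cfg"
}

# Usage (file name assumed):
# set_max_events prodRPelastic_cfg.py 100
```

Note that this naive pattern touches every int32(...) in the file; it is fine for a quick test on a simple config, not for complex ones.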

Check the output file:

exec 'process.' + str(process.outpath) + '.fileName = cms.untracked.string("file:prodRPelasticBeta90Energy7TeV.root")'

Check the sequence of modules to execute:

process.p1 = cms.Path(process.generator*

Now we can run that file:

cmsRun CMSSW_4_2_4/src/Configuration/TotemStandardSequences/test/RP/

If everything went OK, then we can find the output file:

[pctotem31] /home/grzanka/tmp/offlineSWTest > ls -l *root
-rw-rw-r-- 1 grzanka grzanka 161956 Sep  8 10:11 prodRPelasticBeta90Energy7TeV.root


Now we can generate some plots.

We can start a ROOT session to see the plots:

[pctotem31] /home/grzanka/tmp/offlineSWTest > root -l
root [0] TBrowser t

-- LeszekGrzanka - 08-Sep-2011

Topic revision: r12 - 2013-07-23 - LeszekGrzanka