LHCbDirac client distribution and environment

This document has been updated following a meeting held on December 5th with Ricardo, Hubert, Joel, Marco Cl and Philippe.


  • First draft (PhC): 18-Nov-2008
  • Updated draft (PhC): 4-Dec-2008
  • Proposal including discussions of Dec 5th (PhC): 8-Dec-2008
  • Added a user guide on how to benefit from it (PhC): 28-Jan-2009
  • Updated user guide (PhC): 09-Mar-2009
  • Added the instructions to create the DIRAC tarball (jc): 12-Mar-2009


The aim is to allow the DIRAC3 distribution mechanism to share the tools used by the LHCb applications, as well as to benefit from the distribution of the LCG clients in the LCG-AA, in order to control their versions and possibly allow tests with more recent (e.g. bugfix) versions. The environment set up for the user should enable him/her to use most middleware commands (mainly Data Management) as well as their Python bindings.

The Core Software tools are:

  • install_project : resolves the dependencies of CMT projects, then downloads and installs the packaged tarballs.
  • SetupProject : uses CMT to set up the run-time environment of a project.
  • getpack : checks out tagged versions or the head of individual packages into a user area (created with SetupProject or a dedicated script).

DIRAC packages

DIRAC is already split into packages (Framework, WMS, DMS etc.) with well-defined dependencies. The packages share a common base directory DIRAC. The full list is:
  • AccountingSystem
  • BookkeepingSystem
  • ConfigurationSystem
  • DataManagementSystem
  • FrameworkSystem
  • LHCbSystem
  • LoggingSystem
  • MonitoringSystem
  • ProductionManagementSystem
  • RequestManagementSystem
  • StagerSystem
  • WorkloadManagementSystem

Inside each package, one can find standard directories: Agent, Client, Service and scripts. This overall structure is very similar to that of the LHCb Gaudi applications, and thus the existing tools can easily be adapted.

Also similarly to LHCb applications, there are mainly one or two developers in charge of each package. The present proposal also tries to address the problem of testing software changes in one package, possibly against changes in other packages, in a consistent way. This is understood not to be the current model, but it could be very useful for pre-release testing, for example for exposing new releases of the bookkeeping GUI prior to the DIRAC release, or for testing new versions of middleware.

Another guideline in this proposal is to avoid embedding any external package within the DIRAC distribution, and instead to depend on the external distribution of those packages, as well as on the standard distribution of the LHCb Core Software tools. In particular, the binary and library directories within DIRAC do not need to be distributed in the user tarball.

CMT projects and packages

In order to be able to use install_project and SetupProject, LHCbDIRAC and DIRAC must be packaged as CMT projects, which is the case as of v5r0. This consists of the following additions:

  • File <LHCbDiracBase>/cmt/project.cmt : contains the dependencies of LHCbDIRAC in terms of projects
  • Package LHCbDiracSys : this cmt package's requirements file defines the top dependencies of LHCbDIRAC
  • Package LHCbDiracConfig : this cmt package's requirements file defines the versions of external packages to be used, if needed (i.e. not using defaults)
  • Package LHCbDiracPolicy : defines a number of policies for setting the environment. Also contains some utilities used by CMT.
  • Each LHCbDIRAC package listed above contains a cmt/requirements file listing its dependencies (not used in a first instance), which is required for getpack. This requirements file could be used for building a single package at once (i.e. creating the necessary files/links in, for example, the scripts directory, which is currently done by dirac-make), in order to be able to (re)build a package in a developer's private area. LHCbDiracSys only contains dependencies on the LHCbDIRAC packages, such that the whole of LHCbDIRAC can be built from the top.
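As an illustration of the project.cmt file mentioned above, a minimal version might look like the fragment below (the dependency names and versions are hypothetical examples, not the actual release contents):

```
project LHCBDIRAC

use DIRAC DIRAC_v1r0
use LHCBGRID LHCBGRID_v1r0
```

The use lines declare which other CMT projects (and versions) LHCbDIRAC depends on; install_project follows exactly these declarations when downloading tarballs.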

External dependencies

LHCbDirac has external dependencies on the following sets of packages provided by the LCG-AA:

  • python
  • gLite client tools (gfal, lcg-utils, lfc, proxy handling)
  • pyQt (for the bookkeeping GUI)
  • cx-oracle (for the BK service)
  • MySQL, SQLite and more (for services)

Only the first three are needed for the client distribution, while the others (and many more) are needed for running services. The issue of installing services will be discussed separately, but most of it can also benefit from this distribution toolkit. Additional software must be installed separately, in addition to the OS.

All these packages are made available for the LCG-supported platforms in the LCG Applications Area repository (lcg/external). LCG provides CMT requirements files for properly setting the environment for these packages. Any request for new packages or for installing more recent versions can easily be made through the Architects' Forum (and even directly to the person responsible).

LHCbGrid project

The LHCbGrid project is meant to gather the information about which versions of the gLite middleware and DM middleware are to be used by LHCb, both for DIRAC and for the Applications (they also depend on gfal and lfc, and in addition on the Castor, dCache and possibly DPM client libraries). Coordination is needed for deciding which versions are to be used. These versions can be defined within the requirements of the LHCbGrid project and can even be superseded in the DiracConfig requirements file (e.g. for testing or for a quick move to a bugfix version).

As most requirements are coming from DIRAC, the changes in versions should be driven by DIRAC's needs. Requests for getting any new version made available in lcg/external should be made by the DIRAC developers to the Architects' Forum via the LHCb representatives: Marco Cattaneo, Philippe Charpentier (in CC).

Some modifications to the LCG externals have already been identified:

  • The DM tools use a directory lib/python for their python bindings. They should use lib/python<2digitVersion>. A link lib/python2.4 should be created pointing to the existing directory, and the interface packages should be modified accordingly (Stefan, Hubert). Done as of LCGCMT 55c.
  • It is noticed that pyQt has a dependency on the explicit 3-digit version of python for 2.4 but not for 2.5. We can probably survive like this unless we absolutely need 2.4.4 (Ricardo to check). In this case a 2-digit dependency should be used (can it already be foreseen? Stefan)

Environment setting

The command SetupProject LHCbDirac [<version>] should be used to set up the environment, using CMT. CMT sets up the environment variables based on LHCbDiracPolicy, which is a copy of GaudiPolicy. It uses the environment variable CMTCONFIG for the platform-dependent part. As most of this dependency is in the externals, most of the work is done by LCGCMT and not directly in LHCbDIRAC. The only identified platform-dependent part in LHCbDIRAC is pyGSI, which could eventually be sub-contracted to the LCG-AA (Adri to get in touch with Stefan).

CMT uses the InstallArea that may contain the following directories:

  • InstallArea/scripts (link to scripts) : to be set on PATH
  • InstallArea/python (link to the top directory) : to be set on PYTHONPATH
  • InstallArea/<CMTCONFIG>/python<2-digit-version> : to be set on PYTHONPATH
  • InstallArea/<CMTCONFIG>/lib : to be set on LD_LIBRARY_PATH (if needed)
  • InstallArea/<CMTCONFIG>/bin : to be set on PATH (if needed)
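A minimal sketch of what these settings amount to, in plain shell (the InstallArea location and CMTCONFIG value below are placeholders, not what SetupProject actually computes):

```shell
# Hedged sketch: prepend the InstallArea directories to the search paths,
# as SetupProject does via CMT. Paths and platform tag are placeholders.
INSTALL_AREA=/opt/lhcb/LHCBDIRAC/LHCBDIRAC_v5r0/InstallArea
CMTCONFIG=slc4_ia32_gcc34
PATH="$INSTALL_AREA/scripts:$INSTALL_AREA/$CMTCONFIG/bin:$PATH"
PYTHONPATH="$INSTALL_AREA/python:$INSTALL_AREA/$CMTCONFIG/python2.4:${PYTHONPATH:-}"
LD_LIBRARY_PATH="$INSTALL_AREA/$CMTCONFIG/lib:${LD_LIBRARY_PATH:-}"
export PATH PYTHONPATH LD_LIBRARY_PATH
# the scripts directory is now the first PATH entry
echo "$PATH" | cut -d: -f1
```

SetupProject produces the equivalent of these assignments from the CMT requirements files, so the user never writes them by hand.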

For middleware, the LCGCMT interface packages also set PATH, LD_LIBRARY_PATH and PYTHONPATH such that all commands, as well as standalone usage of e.g. the python bindings, can be used. Obviously, if this environment is broken accidentally or by setting up another environment, these tools might no longer work. However, it is assumed that users understand what an environment means, which doesn't prevent having as much environment-free software as possible.

The supported platforms are the same as for LCG, and the environment variable CMTCONFIG defines the platform. A one-to-one correspondence exists between these platform names and the DIRAC ones (note that the LCG platform names are going to change as of LCGCMT 56 and become closer to the DIRAC ones).

As the python executables currently deployed are dynamically linked, the lib directory of python must be included in LD_LIBRARY_PATH. Using an incompatible version of python without the library directory would not work, unless a statically linked version is used.
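One quick way to see this dependency on a Linux machine (assuming ldd is available; the fallback message is only there for robustness on other platforms):

```shell
# Check whether the python interpreter is dynamically linked against
# libpython; if it is, the python lib directory must be on LD_LIBRARY_PATH.
PY=$(command -v python3 || command -v python)
ldd "$PY" 2>/dev/null | grep -i libpython || echo "no dynamic libpython dependency found"
```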

By default CMT doesn't check for the existence of the directories included on the paths. LHCbDiracPolicy ensures that all directories on the paths do exist (copied from GaudiPolicy).
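Conceptually, this pruning looks like the sketch below (prune_path is a name invented for this illustration; it is not the actual GaudiPolicy code):

```shell
# Keep only the entries of a ':'-separated path that exist on disk.
prune_path() {
  echo "$1" | tr ':' '\n' | while read -r d; do
    [ -d "$d" ] && printf '%s\n' "$d"
  done | paste -s -d ':' -
}
mkdir -p /tmp/existing_dir
prune_path "/tmp/existing_dir:/tmp/missing_dir_xyz"   # prints: /tmp/existing_dir
```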

Distribution packaging

We use the standard mechanism that LHCb users are used to for installing locally the software. It is based on the install_project script that itself relies on a description of the dependencies from CMT. For LHCbDIRAC the dependencies are simple:

  • DIRAC project (framework)
  • LHCbGrid project (only interfaces and version definition)
  • gLite client middleware according to the version defined in LHCbGrid, including the file access client libraries (Castor, dCache, DPM). This tarball should contain a reduced list of libraries (limited to those that are used!)
  • Other external packages: python, pyQt and its dependencies

Using install_project ensures that the necessary LHCb scripts are also installed and that the configuration is properly defined (such as SetupProject, LbLogin etc.). If they have already been installed when users installed the LHCb applications, they will not need to be reinstalled. The use case is mostly for ganga users, who in any case need to install the applications; they should be advised to install LHCbDirac in the same area.

install_project uses a tarball that contains all platform-independent files (the shared tarball), and another one with the platform-dependent files. These tarballs can be created with a standard script. In this way one can install as many platforms as wanted on the same file system: the shared tarball is only installed once. None of these tarballs should contain the .pyc files, as they are non-relocatable. After installation it is worth compiling the python modules to create the .pyc files; this is a post-install action for the shared tarball, as the .pyc files are platform independent.
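The proposed post-install compilation step would boil down to a single compileall invocation. The sketch below uses a modern Python 3 for illustration (Python 2.4, used at the time, writes the .pyc files next to the sources rather than into __pycache__), and the /tmp path stands in for the shared installation area:

```shell
# Create a tiny fake installation and byte-compile it, as the proposed
# post-install action would do for the shared tarball.
mkdir -p /tmp/lhcbdirac_demo/DIRAC
echo 'x = 1' > /tmp/lhcbdirac_demo/DIRAC/mod.py
python3 -m compileall -q /tmp/lhcbdirac_demo
ls /tmp/lhcbdirac_demo/DIRAC/__pycache__
```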

By default, LHCbDirac should use the latest version of LHCbGrid, with a given major version (e.g. use LHCbGrid v1r*). In case one urgently needs to use specific versions of the middleware, one can fix them in LHCbDiracConfig if there is no LHCbGrid (yet) that contains these versions. Making a new release of LHCbGrid is the preferred solution for production versions. Versions of externals can also be superseded from options in the SetupProject command (see later).

User check out

getpack and the SVN repository of LHCbDirac have been adapted to allow checking out an LHCbDIRAC package in a user directory (e.g. ~/cmtuser). Some actions must however be taken before running getpack (the functionality of setenv<project>). They have been coded in the command setenvProject LHCbDirac <version>, which creates the appropriate structure in the cmtuser directory.

Before using the getpack command you should issue the following commands:

> SetupProject LbScripts
> cd ~/cmtuser/LHCbDirac_<version>

Using getpack -i allows one to select the package from a list of available packages. The utility packages are located at the top level, e.g. getpack LHCbDiracConfig, while the actual LHCbDIRAC packages have the hat DIRAC, e.g. getpack DIRAC/DataManagementSystem. Add the option head if you want to check out the head revision from SVN. One can check out the whole of LHCbDIRAC with the following command: getpack -r LHCbDiracSys

Here are the actions performed by setenvProject LHCbDirac for reference:

  • Create a directory LHCbDirac_<version> in the cmtuser directory (base, as all setenvProject)
  • Create a link to the ./etc directory of the release version of DIRAC.
  • Create a scripts directory and a link to it as ./InstallArea/scripts. This will automatically set the scripts on the PATH
  • Create a link to the base directory as ./InstallArea/python<version>. This will automatically set the PYTHONPATH as the base directory.

Building a local package

Once a package has been checked out using getpack, it can be built by running make in its directory. Eventually this utility will be made available through the cmt make command, as for regular applications.

The build consists of the following steps (Ricardo should review this!):

  • Set up the scripts directory in order to only have one LHCbDIRAC directory on the PATH environment variable.
  • Compile the python modules (is this necessary?)

Note that not all functionality might be available on all platforms (e.g. gLite on OSX and Windows).

User guide

This is a summary of commands to be used for taking advantage of the DIRAC installation using CMT.

Installing LHCbDirac locally

For installing LHCbDirac on your local machine, you should download and install LHCbDirac, specifying the version <version>. Then in your login script you should include:
  • source $MYSITEROOT/LbLogin.csh or . $MYSITEROOT/LbLogin.sh
and set up the LHCbDirac environment with SetupProject LHCbDirac (beware: if you use ganga this is not needed, as it is done internally by ganga)

If you see an error message like:

Warning : Cannot add voms attribute /lhcb/Role=user to proxy Accessing data in the grid storage from the user interface will not be possible. 
The grid jobs will not be affected.

It means that you need to install or set up the certification authority (CA) files. Check whether the variables $X509_CERT_DIR and $X509_VOMS_DIR exist in your environment.

  • If they are set, please verify that the content is up to date.
  • If they are not set, you can use the default location installed by LHCbDirac, or choose a location yourself where you want to install them.
  1. If you use the default location, you need to keep that copy up to date.
    • You can run the Dirac tool dirac-admin-get-CAs.
    • Make sure that the variables are set as X509_VOMS_DIR=$MYSITEROOT/LHCBDIRAC/LHCBDIRAC_<version>/etc/grid-security/vomsdir and X509_CERT_DIR=$MYSITEROOT/LHCBDIRAC/LHCBDIRAC_<version>/etc/grid-security/certificates
  2. If you choose another directory, you need:
    • to add the following lines to your DIRAC config file ($MYSITEROOT/LHCBDIRAC/LHCBDIRAC_<version>/etc/dirac.cfg):
    UseServerCertificate = no
    Grid-Security = <my_location>
    • to set the variables $X509_CERT_DIR and $X509_VOMS_DIR to point to this location (the format should be X509_VOMS_DIR=<my_location>/vomsdir and X509_CERT_DIR=<my_location>/certificates)
    • to run the Dirac tool dirac-admin-get-CAs
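For option 2, the variable settings would look like the following sketch (/opt/grid-security is a placeholder for <my_location>; dirac-admin-get-CAs itself is not run here):

```shell
# Point the X.509 variables at a custom grid-security location.
MY_LOCATION=/opt/grid-security   # placeholder for <my_location>
export X509_CERT_DIR="$MY_LOCATION/certificates"
export X509_VOMS_DIR="$MY_LOCATION/vomsdir"
echo "$X509_CERT_DIR"   # prints: /opt/grid-security/certificates
```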

Nota Bene: to keep the directory of certificates up to date, you can use a cron task (see the example below).

Create a script called CA_update.csh and put the following content inside:


#!/bin/csh -f
SetupProject LHCbDirac
cat $HOME/.lcgpasswd | lhcb-proxy-init -p

exit 0

Create a cron entry with this format to run the script CA_update.csh every day at 02:10 on machine_name (CERN acrontab format: the five standard cron time fields, then the machine name):

10 02 * * * machine_name /my/directory/CA_update.csh

Setup the LHCbDirac environment

  1. On a local installation, first execute source $MYSITEROOT/LbLogin.csh or . $MYSITEROOT/LbLogin.sh
  2. Use SetupProject to set the environment. Most useful forms:
    • SetupProject LHCbDirac: sets up the environment for the current CMTCONFIG and the latest version of LHCbDirac
    • SetupProject LHCbDirac --list: lists all available versions
    • SetupProject LHCbDirac <version>: sets up a named version of LHCbDirac
    • SetupProject LHCbDirac --dev: looks in $LHCBDEV rather than $LHCb_release_area (for tests)
    • SetupProject LHCbDirac [<version>] <external> -v <external_version>: overrides the default version of <external> with the specified version. <external> can be for example gfal, lcgutils, lfc, Python. More than one external can be specified.

Create a user directory

Use the command setenvProject LHCbDirac <version>. This will only work for a release installed in $LHCb_release_area, unless you specify the --dev option when LHCbDirac is installed in $LHCBDEV.

Check-out LHCbDIRAC packages

This should of course be done only after the user directory has been created (see above):
  1. source $LHCb_release_area/LBSCRIPTS/prod/InstallArea/scripts/LbLogin.csh (you can put this in your login script or, better, activate it by adding a file .newLHCBLoginscript in your home directory)
  2. cd ~/cmtuser/LHCbDirac_<version>
  3. getpack <package> [head], where <package> must be preceded by the hat DIRAC/ if it is an actual DIRAC package
  4. run ~/cmtuser/Dirac_<version>/DIRAC/dirac-make to create the scripts in the local area

Note that after creating a local user directory for checking out modules, you should re-run the SetupProject LHCbDirac command if previously done, in order to pick up the local installation.

You can modify your scripts/modules locally and then, after testing, commit them directly to SVN.

Instructions to create the DIRAC tarball

to tag the package:
  1. svn mkdir -m "tagging v4r9" svn+ssh://svn.cern.ch/reps/lhcb/LHCbGrid/tags/LHCBGRID/LHCBGRID_v4r9
  2. svn cp -m "tagging v4r9" svn+ssh://svn.cern.ch/reps/lhcb/LHCbGrid/trunk/cmt svn+ssh://svn.cern.ch/reps/lhcb/LHCbGrid/tags/LHCBGRID/LHCBGRID_v4r9

to create the distribution package for LHCbGrid:

  1. mkproject -p LHCbGrid -v vXrY -a ng
  2. copy the LCG_Interfaces if necessary inside $LHCBRELEASE/LHCBGRID/LHCBGRID_vXrY/LCG_Interfaces
  3. mkproject -p LHCbGrid -v vXrY -a cd
  4. mkproject -p LHCbGrid -v vXrY -a rsrhw

to create the distribution package for DIRAC:

  1. mkproject -p Dirac -v vArB -a n
  2. dirac2AFS -v vArB -p $CMTCONFIG
  3. mkproject -p Dirac -v vArB -a tsrhw
  4. mkLHCbtar -p LHCbGrid -v vXrY -b $CMTCONFIG
  5. mkLCGCMTtar -n LHCBDIRAC_vArB -b $CMTCONFIG --exclude="lib*.a"
  6. mkLHCbtar -p LHCbDirac -v vArB -b $CMTCONFIG

-- PhilippeCharpentier - 18 Nov 2008

Topic revision: r26 - 2014-11-13 - JoelClosier