CMT Primer

CMT is the magic that creates all your makefiles for you, so that your code compiles (the 'standalone' counterpart of CMT is RootCore). CMT can also auto-generate the code needed to make configurables (Algorithms, Tools, and Services) available for configuring in python job options. But before doing anything with athena, we should learn how to use CMT just to compile some code, and to use code from another CMT package (one we've checked out ourselves, or one available in the athena release you have set up).

cmt commands

The useful commands are:
cmt create : use this to create a new cmt package
cmt show : use this to find out useful information. Specific examples include:
  • cmt show macro_value cppcomp : shows the compilation command used for this package
  • cmt show macro_value <package>_shlibflags : shows the extra flags given to ld.exe when linking the library created by the package. Replace <package> with the name of your package
  • cmt show macro_value use_linkopts : shows the extra flags given to ld.exe when any other package 'uses' this package and links its own library

cmt co -r MyPackage-00-00-01 Path/To/MyPackage : checks out a specific version of a package (uses the $SVNROOT env. var, so set that to switch repositories)
cmt config : you generally only need to call this once, when you create a new package. If you check out a package using cmt co it gets done for you
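
Putting those commands together, a typical session might look like this (the package name and tag here are illustrative):

```shell
# check out a specific tag of a package (cmt config is run for you)
cmt co -r MyPackage-00-00-01 Path/To/MyPackage
cd Path/To/MyPackage/cmt

# inspect how this package compiles and links
cmt show tags
cmt show macro_value cppcomp
cmt show macro_value MyPackage_shlibflags
```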

Standalone executable with CMT

People associate cmt with athena, but cmt is not tied to athena: cmt is to athena as rootcore is to eventloop. You can use rootcore without eventloop, and likewise you can use cmt without athena. Here's how to write a quick application that compiles in cmt:

1. Create a new package:

cmt create TestPackage TestPackage-00-00-01
2. Create a program source in the src folder of the package, e.g. I put the following in a 'myApp.cxx' file:
#include "TLorentzVector.h"
#include "PathResolver/PathResolver.h"
#include <iostream>

using namespace std;

int main(int, char**) {
  cout << "Hello" << endl;
  TLorentzVector v;
  cout << v.Pt() << endl;
  std::string myFile = PathResolver::find_file("somefile.root","DATAPATH");
  cout << "myFile is " << myFile << endl;
  return 0;
}

This is a pretty useless program, but it shows how I can, e.g., make use of ROOT in my program (TLorentzVector) and use some other package from the athena release (PathResolver). If you save that file, then cd to the cmt folder and modify the requirements file to show the following:

package TestPackage

use AtlasPolicy AtlasPolicy-*
use PathResolver PathResolver-* Tools
use AtlasROOT  AtlasROOT-* External
apply_tag ROOTMathLibs

application myApp myApp.cxx
The important line is the last one, which says we would like to compile the myApp.cxx file into an application called myApp (myApp.exe from the command line). The four lines above it handle the dependencies.

Dependencies are specified with a 'use' statement. The syntax is use <PackageName> <PackageVersion> <PathToPackage>, where PackageVersion is by convention always left 'unspecified' by using the form PackageName-*. You figure out whether a PathToPackage is needed by looking for the package in the svn repository. Personally, I go to [] and type in the name of the package, then look for where the package sits relative to the 'atlas' top directory (the top of the svn repository). For example, searching for PathResolver shows the package lives inside Tools.

The first of these lines (use AtlasPolicy...) is strictly optional at this point, as we haven't actually made use of anything from AtlasPolicy in our code. But there are some goodies in there that may come in handy, and you are encouraged to always have this use statement in your requirements.

The second line (use PathResolver...) gives your code access to the PathResolver package. It comes in every athena release, so you do not need to check PathResolver out yourself. You can find PathResolver here: [] for example.

The third line (use AtlasROOT...) gives your code access to the basic ROOT libraries. The fourth line (apply_tag ROOTMathLibs) is an example of a 'switch' or 'option' in cmt: a toggle you can turn on to say "do something extra". In this case, it adds the ROOT Math libraries (the TLorentzVector class resides in the Math library; you can see this at the top of the TLorentzVector documentation page on the ROOT website). We call these switches in cmt tags. You can see which tags are enabled for your package by typing cmt show tags at the command line in the cmt folder of your package. Working out what a specific tag does is trickier; we'll leave that for now. All you need to know is that this particular tag gives you access to the ROOT math libraries.

Now just type make at the command line, and your application compiles. You can then execute myApp.exe from the command line. Note that your application (which was built in the TestPackage/x86..../bin folder) is symlinked in a folder inside $TestArea/InstallArea, which was created for you by the compilation. That location is in your PATH, so you can call myApp.exe from any location.

Let's suppose I wanted to use some configuration file in my application. I hinted at this by using the PathResolver package, which is ideal for locating files. I set it up to look in the locations given by the DATAPATH env var - echo $DATAPATH to see where it would look (note that PathResolver will also always look in the directory you run the application from, in addition to the paths given in DATAPATH).

So we need to get the configuration file into InstallArea/share (this folder doesn't exist yet; you would create it by hand at this stage, or if we use some cmt magic in a minute, we can get cmt to create it for us). We could just put the file there directly: if you create some dummy file called somefile.root and put it in the InstallArea/share folder, you'll see the application locates it (try it, and run myApp.exe to see!).

But the usual thing is to put your config file in the package itself, in this case in the 'share' folder of the package. Create a folder called 'share' in the TestPackage package, and put your dummy file in there. Then add the following line to your requirements file:

apply_pattern declare_runtime files="*.root"

This will 'install' every root file in the share folder of the package into the InstallArea (it creates a symlink in the InstallArea/share folder). You'll see this happen when you then make (call it from the cmt folder, as always):

#CMT---> installing file somefile.root into /var/clus/usera/will/testareas/CMTDemo/InstallArea/share

(note /var/clus/usera/will/testareas/CMTDemo is my $TestArea here)

then execute myApp.exe and you see something like:

myFile is /var/clus/usera/will/testareas/CMTDemo/InstallArea/share/somefile.root

We didn't see that before, so now we know how to access the configuration file.

Finally, let's send this to the grid, to show how easy it is with cmt to get stuff compiled and running on the grid. We will make use of the $AtlasVersion and $CMTCONFIG env vars that were set up for us when we asetup an athena release (echo them to see what they are in your case). The convention I follow is to create a 'run' directory in the testarea, where I do my submissions from. This is safe in terms of how the testarea is packaged up. So I do:

mkdir $TestArea/run; cd $TestArea/run
prun --athenaTag=$AtlasVersion --cmtConfig=$CMTCONFIG --exec="myApp.exe" --useAthenaPackages --outDS="user.will.myoutput.example/"

Libraries in CMT

To start making code that will work within the athena framework, we need to understand the three types of library that are used. They are defined by the 'pattern' that is applied in the requirements file. All these patterns are only available to CMT when the AtlasPolicy package has its use statement present.

1. Installed libraries

library MyPackage *.cxx
apply_pattern installed_library
The first line says we will create a library with the name of the package, and compile every cxx file in the src folder as part of this library. If you need to add additional locations (e.g. a 'Root' folder has become popular due to RootCore) then add them at the end of the line (with a space between each pattern), working relative to the src directory, e.g. library MyPackage *.cxx ../Root/*.cxx

The second line is the magic incantation needed to make an installed library in athena. Installed libraries are appropriate when you just have some code you want to share, and none of it is a configurable - i.e. none of the code can be set up via job options. This type of library creates a file inside the build folder of the package (the build folder has the same name as the $CMTCONFIG env var).

2. Component libraries

library MyPackage *.cxx components/*.cxx
apply_pattern component_library

Component libraries know how to create the python needed to make the configurables in this package usable inside job options (you'll see a genConf folder appear in the package when you compile, which contains the python). Note the suggestion in the first line that these packages have a components subfolder inside the src directory; this is a feature of component libraries that we'll get to shortly. Component libraries cannot be linked against - you cannot put code in them that will be used (e.g. through include statements) by code in other packages.

3. Dual Use Libraries

apply_pattern dual_use_library files="*.cxx"
This is a library that is a component library that can also be linked against. This is the catchall case, but has a bit of compile-time overhead. Like the other libraries, you can add extra source code directories inside the quotation marks... but note that you don't need to specify the components directory, it is implicitly assumed by dual_use_library.

4. Named installed libraries
This is included here for completeness. In some rare cases, it is desirable to create an installed library with a particular name. Then do, e.g:

library MyPackageLib "../Root/*.cxx"
apply_pattern named_installed_library library=MyPackageLib

You can then replicate the behaviour of a dual use library by adding:

library MyPackage *.cxx components/*.cxx
apply_pattern component_library
macro_append MyPackage_dependencies " MyPackageLib"
The necessity of all of this needs to be checked; it seems fine to just use dual_use_library instead.

An example of where using component libraries stops compilation

If you are used to working in the rootcore build system, then every library in that system is effectively a dual use library: there is no limitation on what you can link against between libraries. But in CMT, if you try to include a header file whose class has some of its implementation outside of the header folder (i.e. it has methods, other than pure virtual ones, that are implemented outside of the header file or an icc file in the header folder), then you will see errors when you compile like:

#CMT---> building shared library ../x86_64-slc6-gcc48-opt/
../x86_64-slc6-gcc48-opt/MyLibrary.o: In function `MyLibrary::m_toolsInit()':
mycode.cxx:250: undefined reference to `Something::Something(std::string const&)'
The sign to look for is these errors appearing after the "building shared library" line: i.e. it is the linking that is failing! This is a case where something has tried to include the header file for a class called Something from a package that is not an installed or dual_use library. Typically the only header files you can take from component libraries are the interface classes!

A full requirements file example

Below is a requirements file for a typical component library (a package with configurables in it).
## automatically generated CMT requirements file
package MyAthenaxAODAnalysis
author  will

use AtlasPolicy         AtlasPolicy-*
use GaudiInterface      GaudiInterface-*        External

private
use AthenaBaseComps   AthenaBaseComps-*     Control
end_private

library MyAthenaxAODAnalysis *.cxx components/*.cxx
apply_pattern component_library

#this pattern will allow the 'include' function of joboptions to find the *.py files in the share directory
apply_pattern declare_joboptions files="*.py" 
#this pattern will install any *.py files in the python directory.
apply_pattern declare_python_modules files="*.py"
#installs root and xml files from the share directory
apply_pattern declare_runtime files="*.root  *.xml" 

The 'use' statement inside the private ... end_private block is there because the includes from that package are only needed in the src directory, not the header directory. Hence nothing external to the package needs to know about this use statement, so it can go in the private block. It's relatively harmless to leave it out of a private block, but you might then find the checkreq script complains (probably just warnings, not errors). If you find checkreq is hindering rather than helping you, you can disable it in your package with:

action checkreq " echo 'disabling checkreq!!' "

Putting comments in requirements files

Note that you must NOT put comments at the ends of lines in requirements files. Comments must be on their own line, and they start with # symbols.
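
For example (a sketch; the trailing comment on the last line is exactly what NOT to do):

```
# good: a comment on its own line
use AtlasPolicy AtlasPolicy-*

use PathResolver PathResolver-* Tools  # BAD: trailing comments confuse cmt
```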

When to use private blocks

Never put the library and apply_pattern lines inside a private block for an installed or dual_use library. If you do, any package that tries to link against your package's library will fail: it won't be able to find the definitions and you'll see errors like:

../x86_64-slc6-gcc48-opt/MyAlg.o: In function `MyAlg::~MyAlg()':
/var/clus/usera/will/testareas/ValidateDPS/MyPackage/cmt/../src/MyAlg.cxx:13: undefined reference to `TheAlgInTheOtherPackage::~TheAlgInTheOtherPackage()'
../x86_64-slc6-gcc48-opt/MyAlg.o: In function `MyAlg::MyAlg(std::string const&, ISvcLocator*)':

It is ok for a component_library to have its library and apply_pattern lines inside a private block because nothing will link against the component_library.
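
To summarise with a sketch (the package name here is hypothetical):

```
package MyComponentPackage

use AtlasPolicy AtlasPolicy-*

private
## safe in a private block: nothing links against a component library
library MyComponentPackage *.cxx components/*.cxx
apply_pattern component_library
end_private
```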

rootcint dictionary

You can make a rootcint dictionary (necessary if your class inherits from a TObject) by doing (in an installed_library):

apply_pattern have_root_headers root_headers="list.h of.h headers.h with.h TObjects.h in.h them.h LinkDef.h" headers_lib="MyPackage"
where you have listed all the headers that contain TObjects (or that need cint dictionaries for any other reason). The LinkDef must come last. The paths are relative to the header directory.
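
For reference, a minimal LinkDef.h might look like the following; this is standard rootcint boilerplate rather than anything CMT-specific, and MyClass is a hypothetical TObject-derived class:

```cpp
// LinkDef.h - must appear last in the root_headers list
#ifdef __CINT__
#pragma link off all globals;
#pragma link off all classes;
#pragma link off all functions;
#pragma link C++ class MyClass+;
#endif
```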

genreflex dictionary

If you want to create a reflex dictionary you should do:

private
use AtlasReflex AtlasReflex-* External
apply_pattern lcgdict dict=MyPackage selectionfile=selection.xml headerfiles="../MyPackage/MyPackageDict.h"
end_private

This creates a library called MyPackageDict. The whole thing is in a private block to optimize build time (it would still work outside of a private block).
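
A minimal selection.xml alongside it might look like this (MyClass is a hypothetical class name):

```xml
<lcgdict>
  <class name="MyClass"/>
</lcgdict>
```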

rootcint and genreflex dictionary at the same time

If you want to have both a cint dictionary and a reflex dictionary, unfortunately the lcgdict pattern ends up interfering with the call to rootcint made by have_root_headers (it ends up adding an extra header to the list of headers in root_headers, and rootcint insists the LinkDef.h file comes last, so it fails). To overcome this, you can either create the reflex library with a different name:

macro_append SomeOtherName_dependencies " MyPackage "
apply_pattern lcgdict dict=SomeOtherName selectionfile=selection.xml headerfiles="../MyPackage/MyPackageDict.h"

which will create a library with that name. Or you can keep the convention of the dictionary being named <NameOfPackage>Dict by instead changing the name of the main library:

library MyPackageLib ../Root/*.cxx
apply_pattern named_installed_library library=MyPackageLib
apply_pattern have_root_headers root_headers="..." headers_lib="MyPackageLib"
use AtlasReflex AtlasReflex-* External
macro_append MyPackageDict_dependencies " MyPackageLib "
apply_pattern lcgdict dict=MyPackage selectionfile=selection.xml headerfiles="../MyPackage/MyPackageDict.h"

The append to MyPackageDict_dependencies is needed because by default the dependency will be on a library with the same name as the package (do cmt show macro_value MyPackageDict_dependencies). If you are using a named installed library, you have to manually add it to the dependencies of MyPackageDict, so that the named library (MyPackageLib) is built before the reflex library (MyPackageDict).

(You'll see some 'MyPackageLibDict' files appear in the build directory; this is material the have_root_headers pattern is making.)

Conditional compilation with cmt

This section is specifically to help people writing packages intended for use in both the cmt analysis releases and the cmt full athena releases.

1. You will need to hide parts of the source code from cmt in the analysis releases. Do this using #ifndef XAOD_ANALYSIS preprocessor directives. You will particularly need to add this in the components/..._entries.cxx file of your package (if it has one) to hide any parts that you don't want to be part of the cmt analysis releases. You can also use this technique to hide parts of the code in the src directory, if that is what you want to hide from the analysis releases, but you may prefer to do the next thing ...

2. If you don't want to compile anything from the src directory except for the stuff in the components dir, (i.e. make cmt pretend to be like rootcore), you can do:

macro what_to_compile "*.cxx components/*.cxx" AthAnalysisBase "components/*.cxx"
library MyPackage $(what_to_compile)
apply_pattern component_library
You can use the above technique in dual use and installed libraries too. You're effectively defining a macro that takes one value by default, but a different value in AthAnalysisBase.

3. To do a conditional use statement (to make the cmt warnings about missing packages go away), instead of:

use Package Package-* Path/To
please do:
use_ifndef pplist="XAOD_ANALYSIS" pkg="Path/To/Package"
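
The #ifndef technique from step 1 can be sketched for an old-style _entries.cxx file like this (the algorithm names, and their split between releases, are illustrative assumptions):

```cpp
// components/MyPackage_entries.cxx (sketch)
#include "GaudiKernel/DeclareFactoryEntries.h"
#include "../src/MyAnalysisAlg.h"
#ifndef XAOD_ANALYSIS
#include "../src/MyFullAthenaOnlyAlg.h" // hidden from the analysis releases
#endif

DECLARE_ALGORITHM_FACTORY( MyAnalysisAlg )
#ifndef XAOD_ANALYSIS
DECLARE_ALGORITHM_FACTORY( MyFullAthenaOnlyAlg )
#endif
```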

'Out of source' compilation with cmt

cmt will compile packages by default into the ../$CMTCONFIG directory inside the package. If you want to change this location, you need to set the bin macro to the location you want, e.g. in the requirements file you would add:

macro bin $(TestArea)/$(tag)

This will build into the $TestArea/$CMTCONFIG directory (the tag macro is the same value as the $CMTCONFIG).

You must compile the package with a cmt make command; using just make will not be enough.

Note also that this unfortunately doesn't work for component libraries because the in-package build directory structure is currently hardcoded in the listcomponents makefile fragment - but if this was updated, in theory all packages could be built out of source.

Notes on Multithreading

To read something (from storegate):
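
A sketch of the read pattern, mirroring the write pattern given next (the key and handle names here are illustrative):

```cpp
// member of your algorithm:
SG::ReadHandleKey<MyObj> m_readKey;

// in the constructor:
declareProperty("Key1", m_readKey="MyKey");

// in initialize:
CHECK( m_readKey.initialize() );

// in execute_r (leave off the ",ctx" in plain execute):
SG::ReadHandle<MyObj> h_read(m_readKey,ctx);
const MyObj* obj = h_read.get(); // read handles give const access
```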

To write something (to storegate):

// member of your algorithm:
SG::WriteHandleKey<MyObj> m_writeKey;

// in the constructor:
declareProperty("Key1", m_writeKey="MyKey");

// in initialize:
CHECK( m_writeKey.initialize() );

execute_r: (for execute, just leave off the ",ctx" parts in the handles)

std::unique_ptr<MyObj> obj = std::make_unique<MyObj>(); //create object (preferred way!)

CHECK( SG::WriteHandle<MyObj>(m_writeKey,ctx).record( std::move(obj) ) ); //if you no longer need to access obj

or, if you still need access to the object after recording:

SG::WriteHandle<MyObj> h_write(m_writeKey,ctx);
CHECK( h_write.record(std::move(obj)) );
h_write->method(); //can dereference the handle to keep accessing our non-const obj

-- WillButtinger - 26 May 2014

Topic revision: r15 - 2017-03-20 - WillButtinger