Shaun Roe's TWiki

Introduction

I am working at CERN on the ATLAS experiment, for the Inner Detector. My current areas of expertise include XML and web technologies, semiconductor sensor applications, and C++ and Python programming.

Recently I have worked on the NewSmallWheelsMetrology setup, a LabVIEW program and metrology table for determining the dimensions of boards for the ATLAS 'New Small Wheel' upgrade.

Definition of a JSON format for Database 3

Introduction

The following is the JSON definition of a possible representation of data currently stored in COOL, containing both metadata and the payload.

Definition

The entire metadata+payload is a single JSON Object
{
 <body>
}

The 'body' must consist of at least three name/value pairs, separated by commas:

"node_description": "<description>",
"folder_payloadspec": "<payload spec>",
"data_array": [<array of json objects>]

<description> is the currently used folder description, e.g.

"<timeStamp>run-lumi</timeStamp><addrHeader><address_header service_type=\"71\" clid=\"1238547719\" /></addrHeader<typeName>CondAttrListCollection</typeName>”

<payload spec> is a string of the COOL attribute specification names and types in the format:

"var_name0: var_type0, var_name1: var_type1, ..."

e.g.

 "stave: Int32, eta: Int32, mag: Float, base: Float, free: Float"

The data_array is an array of JSON objects, each with the channel number as the object name and, as its value, an array of values conforming to the payload specification, e.g. for the above specification:

[{ "100" : [ 0, 0, 0, 0, 0]},
{ "200" : [ 1, 0, 0, 0, 0]},
{ "300" : [ 2, 0, 0, 0, 0]},
{ "400" : [ 3, 0, 0, 0, 0]},
{ "500" : [ 4, 0, 0, 0, 0]},
{ "600" : [ 5, 0, 0, 0, 0]},
{ "700" : [ 6, 0, 0, 0, 0]},
{ "800" : [ 7, 0, 0, 0, 0]},
{ "900" : [ 8, 0, 0, 0, 0]},
{ "1000" : [ 9, 0, 0, 0, 0]},
{ "1100" : [ 10, 0, 0, 0, 0]},
{ "1200" : [ 11, 0, 0, 0, 0]},
{ "1300" : [ 12, 0, 0, 0, 0]},
{ "1400" : [ 13, 0, 0, 0, 0]}]

String values (this includes POOL references) are enclosed in quotes. Null values are represented by the lowercase unquoted word null. String values may be interspersed with other data, like

{ "1200" : [ 11, "myString", 0, 0, 0]},
{ "1300" : [ 12, "secondString", 0, 0, 0]},
{ "1400" : [ 13, "thirdString", 0, 0, 0]}]
In all cases, however, the order in the array must correspond to the order of the type definitions in the folder_payloadspec. Blobs should be base64-encoded as a string and enclosed in quotes. Clobs should be enclosed in quotes.

Where the payload is a CoolVector or CoraCool payload, the value is an array of arrays at each channel number.
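e.g. with the specification above, a two-row payload at channel 100 would look like:

{ "100" : [ [ 0, 0, 0, 0, 0], [ 1, 0, 0, 0, 0] ] }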

The JSON may additionally have the following name/value pairs, separated by commas within the body:

  • "nchans" : total number of channels
  • "tag_name": name of the tag, if other than 'HEAD'
  • "stime" : start time for this IOV (run-lumi or ns of unix epoch, as in COOL)
  • "modified_channels": (on write) array of modified channel numbers
  • "streamer_info": container for type/serialisation information (e.g. method, versions, list of types... it is free form and up to the user)

These optional parameters are provided on write to allow possible server-side optimisation during COOL migration.
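Putting the pieces together, a minimal Python sketch (the payload values reuse the examples above; the optional fields are arbitrary illustrations) which assembles such a document and serialises it with the json module:

import base64
import json

doc = {
    "node_description": '<timeStamp>run-lumi</timeStamp>'
                        '<addrHeader><address_header service_type="71" '
                        'clid="1238547719" /></addrHeader>'
                        '<typeName>CondAttrListCollection</typeName>',
    "folder_payloadspec": "stave: Int32, eta: Int32, mag: Float, base: Float, free: Float",
    "data_array": [
        {"100": [0, 0, 0.0, 0.0, 0.0]},
        {"200": [1, None, 0.0, 0.0, 0.0]},  # None serialises to the JSON null
    ],
    # optional members, arbitrary values for illustration
    "nchans": 2,
    "tag_name": "MyTag-00",
}

# A Blob attribute would occupy its slot as a base64-encoded string:
blob_as_string = base64.b64encode(b"\x00\x01\x02").decode("ascii")

print(json.dumps(doc))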

Modifying and testing IOVDbSvc

Introduction

The instructions below are for modifying and testing the IOVDbSvc. They form a basis for diagnosis and further development to extend its capabilities.

Working in Athena

If you are new to Athena, you will need to follow the instructions in the section below (Workflow for Atlas 'dev' development with Git on lxplus) for a sparse checkout of the single package IOVDbSvc (https://gitlab.cern.ch/atlas/athena/-/tree/master/Database/IOVDbSvc):

(assuming you have already forked to make your own repository)

mkdir athena build run
git-atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
cd ~/athena
git-atlas addpkg IOVDbSvc
git fetch upstream
git checkout -b 22.0-extend-IOVDbSvc upstream/master --no-track

IOVDbSvc, a first look

The 'workhorse' of the IOVDbSvc is the IOVDbFolder; many of the other classes and functions exist to support this one class. The IOVDbFolder reads the data, fills the COOL data structures and then puts those data structures into StoreGate with a string key. StoreGate is a huge grab bag of data which can be retrieved in the client classes with a StoreGate 'readHandle' and the key. The complexity of the class is due to:
  • the caching mechanism, which tries to pre-load the correct IOV's data into StoreGate
  • the many options which exist for modifying the caching
  • the different sorts of data structure (AttributeLists, COOL Vector, POOL reference...)
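For orientation, this is roughly how a client requests a folder in (legacy) job options; the folder, schema and class names below are illustrative only, not a recipe from this package:

# Sketch of a folder request in legacy Athena job options; names illustrative.
from IOVDbSvc.CondDB import conddb

# IOVDbSvc creates an IOVDbFolder for each requested folder and records the
# payload in StoreGate, keyed (by default) on the folder name.
conddb.addFolder("INDET", "/Indet/Align", className="AlignableTransformContainer")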

Tests

Unit tests

To get a feeling for the functioning of the IOVDbSvc, it is instructive to look in the 'tests' directory and try modifying some of them to see what happens. The tests are compiled (see the CMakeLists.txt file, atlas_add_test sections) as part of the normal make process, but they are not run by default (note: if you change the package and merge your changes, the tests will run in the C.I. on GitLab, and your merge will not pass review if the tests fail). Unit tests normally take a few seconds to run.
cd ~/build
cmake ../athena/Projects/WorkDir
make
cd ~/run
source ../build/x86_64-centos7-gcc11-opt/setup.sh

To run the tests with minimal output you can simply make the package as shown, then 'cd' to the build directory and type

ctest

This will run the tests and tell you which tests were run and whether they passed. To get slightly more information, you can run with the option

ctest -V
The tests themselves are built as executables and can be found in the ~/build/Database/IOVDbSvc/test-bin/ directory. The Boost framework tests offer a number of options controlling the detail of output which are worth investigating. One example is the following:

~/build/Database/IOVDbSvc/test-bin/FolderTypes_test.exe -l all

(-l all means "all levels of reporting", which will also give informational test messages, if there are any). In this test, a sample SQLite database and some folders are created; these are then interrogated to determine the folder type.

Integration tests

The 'integration tests' are in fact full jobs of various kinds (cosmics, 2018 data, Monte Carlo) which access the database and do a full reconstruction of 25 events. On a dedicated machine this takes ~5-10 minutes, but on generic lxplus it can take 30 minutes. My usual go-to test is a reconstruction of 2018 data, formerly referred to as the 'q431' test. This has recently changed to q442, and is run with:
Reco_tf.py --AMI q442

Workflow for Atlas 'dev' development with Git on lxplus

Motivation

The instructions here are derived from the ATLAS git tutorial, but bring together the bare essentials for a particular case:

  • Working on 'dev'
  • Working from lxplus
  • Using a sparse checkout

..and present them linearly as one complete workflow. As such, this page omits many asides and additional information which can be found in those pages.

Things you'll do once

Fork the repository

'Forking' duplicates the repository to another publicly accessible repository. Go to

https://gitlab.cern.ch/atlas/athena

and press 'Fork'. You will be given a new web page with a selection marked Click to fork the project to a user or group. Click on your own name. This will take you to your newly created forked repository. You need to do one additional thing...

You need to add one member as a developer to your fork; click on the 'settings' cog in the top right corner and look at 'members'. Write the name 'atlasbot' and change the role to 'developer', then click 'Add to project'. Your fork is now ready; the next operations need to be done on lxplus.

Configure git for future use

The first time you use git, you will need to configure it with some specific settings by running the following commands:
setupATLAS
lsetup git
git config --global user.name "Your Name"
git config --global user.email "your.name@cern.ch"
git config --global push.default simple
git config --global http.postBuffer 209715200
git config --global http.emptyAuth true
The first two commands give you a newer version of git; the other commands create a .gitconfig file to be used in future operations. Also, ensure you remove
testarea = <pwd> 
from your ~/.asetup file

Things you'll do each time you start a new workflow/project

Clone your repository

'Cloning' will copy your forked repository to your local area. This can copy the entire repository (so be sure you have enough disk space). I'm going to try the alternative here, assuming I only want to work on one package, and do a sparse checkout. First, you will need to ensure you have set up the ATLAS environment and obtained a newer version of git. This is something you will do each time you want to use git.
setupATLAS
lsetup git

The sparse checkout seems to assume the existence of a directory called athena, so we start by creating one, and then use the ATLAS-specific command git-atlas:

mkdir athena build run
git-atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
The 'git-atlas' command here also adds the original repository as an upstream repository, so you don't need to type an additional command to do this. This hasn't checked out your package yet; to do that, we use another git-atlas command in the 'athena' directory (here I'm working on SCT_Cabling as an example):
cd ~/athena
git-atlas addpkg SCT_Cabling
You will create a branch for your development (why can't I just develop in master of my clone?) as follows:
git fetch upstream
git checkout -b 22.0-branchname upstream/master --no-track
'git fetch upstream' should make your branch reflect any changes in the master repository (why is this necessary if you've just cloned and created the branch?)

If you do a sparse checkout, there is no need to define a 'package filter'.

Your branch is now ready for development.

Things you'll do every time you start working on your existing project

Setup

You need to setup an atlas release, using the familiar asetup command. Assuming you have just logged in and want to work on an existing repository in your athena area, you do:
setupATLAS
lsetup git
cd ~/build
asetup master,latest,Athena
Note that to set up an alternative nightly, you can use the date in the following format: r2018-04-07. Available builds can be (just about) found here: http://atlas-nightlies-browser.cern.ch/~platinum/nightlies/globalpage

Building

The make/build command is executed from your 'build' directory (sibling to the athena directory):
cd ~/build
cmake ../athena/Projects/WorkDir
make
If you see cmake errors after the cmake command check that you have removed the "testarea = " line from your .asetup file. If you need to re-run the command, delete the ~/athena/CMakeLists.txt file first.

Running

You need to add your local packages to your run environment:
cd ~/run
source ../build/x86_64-centos7-gcc11-opt/setup.sh
(for the CentOS environment, since Feb 2019) ...and run as usual:
athena ../athena/InnerDetector/InDetDetDescr/SCT_Cabling/share/TestSCT_Cabling.py

Testing

Certain tests are run as part of the 'continuous integration' (see below) when you ask for a merge request. You may also run these tests individually, for example
Reco_tf.py --AMI q221

More details on the tests can be found on the 'FrozenTier0' Twiki.

Committing code

Note that you now have three repositories: The original master (you can't touch this directly, so don't worry), your publicly accessible fork, and your local clone (in which you have created a branch).

Changes to a file are not staged automatically; if you want git to 'know' about a modified or newly added file, you must 'git add <file>'. The command "git status" is very useful in this context, and will list files you have modified but not yet added. So, I have modified InnerDetector/InDetDetDescr/SCT_Cabling/share/TestSCT_Cabling.py, and with git status I can see:

cd ~/athena
git status
On branch 22.0-branchname
Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

   modified:   InnerDetector/InDetDetDescr/SCT_Cabling/share/TestSCT_Cabling.py

Untracked files:
  (use "git add <file>..." to include in what will be committed)
.
.

Now I "git add" the file and git status again:
git add InnerDetector/InDetDetDescr/SCT_Cabling/share/TestSCT_Cabling.py
git status
On branch 22.0-branchname
Changes to be committed:
  (use "git reset HEAD <file>..." to unstage)

   modified:   InnerDetector/InDetDetDescr/SCT_Cabling/share/TestSCT_Cabling.py

Untracked files:
  (use "git add <file>..." to include in what will be committed)
...and finally I "git commit":
git commit -m "Update geometry tags for run 2"
(Note: normally you would just type 'git commit'; an editor will open where you can enter the kind of information which might previously have gone into the ChangeLog.) The file has now been updated in my local repository; I could still lose it by deleting my athena directory. To send it to my 'forked' publicly accessible repository, I must "git push" and ensure the branch in my local repository is created in my forked remote repository:
git push --set-upstream origin 22.0-branchname

'Merge request'

The code is now sitting in a branch of the forked repository, and you'd like it to be merged back to the original master repository which everyone is using. Go to the page of your forked repository and find your branch; you might have to scroll a few pages to find it, or you can use the 'filter' box to search on its name. Click on Merge request, and fill in the details; by default, the request is to merge with the master (which is 'dev'). Click on Submit Merge Request.

The subsequent page shows you a summary of the merge request with various labels assigned. Resist the temptation to click on the orange 'CLOSE' button of this page, as this will 'close' your request and no further action will be taken.

What happens next?

After you have made a merge request, and provided you remembered to add 'atlasbot' as a developer to your repository, some labels will be added automatically and your code will go into the continuous integration system. This will compile your code, run your unit tests, and run the standard 'q-tests' to ensure you didn't break other code. Then your code will go into review. The reviewers will often make comments which you should take note of and resolve. If your code is then marked as 'review approved', it will go to one of the ~5 people who are authorised to merge your code into the main repository. This whole process may be as quick as one day, or it may last a week if you are asked to make several corrections.

Picking up where you left off...

I have encountered the following scenario: I complete my code changes and submit the merge request, then I delete my local athena, run and build directories and start something new. The merge request reviewer then finds that an improvement can be made; how, now, to change that code? I will need to check out the code again from my branch in the forked repository, and I found I can do the following:
mkdir -p athena run build
setupATLAS
lsetup git
cd ~/build
asetup master,latest,Athena
cd ..
git-atlas init-workdir -b 22.0-MyBranchName https://:@gitlab.cern.ch:8443/sroe/athena.git 
cd ~/athena
git checkout 22.0-MyBranchName
git-atlas addpkg MyPkgName
(change commands for your own user name)

Utilities

The Changelog is no more; to see the commit comments entered with each commit, use
git log <package_directory>

A prettier display and limited to a recent time frame can be obtained with

git log --since='2 months ago' --color --graph --abbrev-commit --pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr)%C(bold blue)<%an>%Creset'

Updating your fork to master

cd ~
git clone https://:@gitlab.cern.ch:8443/sroe/athena.git
cd athena
git remote add upstream https://:@gitlab.cern.ch:8443/atlas/athena.git
git fetch upstream
git merge upstream/master
git push --set-upstream origin master

ART tests

Introduction

Despite the existence of a tutorial and a TWiki, I pretty much get confused every time I come back to this, so here are the 'flattened' instructions, specifically for InDetPhysValMonitoring.

Prerequisites

Having checked out IDPVM (and checked that it compiles OK), you can use (from the user directory)
lsetup art
voms-proxy-init -voms atlas
to set up art and ensure you have a grid certificate.

First attempts

cd ~/run
art.py run --type=grid ~/athena/InnerDetector/InDetValidation/InDetPhysValMonitoring    outdir
works OK; if you see an error like 'No tests found in directories ending in "test"', it may be because of:
  • missing -- symbol before 'type'
  • wrong path, or adding the '/test' to the path
  • test itself is badly named ( it must have the prefix 'test_')
The output (log files etc.) appears in the outdir/InDetPhysValMonitoring/test_ttbarsimreco directory, i.e. ~/run/outdir/InDetPhysValMonitoring/test_ttbarsimreco

Complications: Using a grid container input

Some tests specify an input which is directly a grid container, and trying to run the ART job locally as above results in an early exit because the input file will not be found. Running locally is typically something you want to do while debugging the tests (so you must remember to revert the changes when you have finished debugging). A good example is the new (2020) tests in InDetPhysValMonitoring, which specify in the test script:
# art-type: grid
# art-input: user.keli.mc16_13TeV.422032.ParticleGun_single_mu_Pt1.recon.RDO.e7967_e5984_s3126_r11774_tid20254908_00
# art-input-nfiles: 10
# art-cores: 4

Listing files in a grid input

First setup rucio:
lsetup rucio
Now you can see what files there are:
rucio list-files user.keli.mc16_13TeV.422032.ParticleGun_single_mu_Pt1.recon.RDO.e7967_e5984_s3126_r11774_tid20254908_00
which results in the following output:
+---------------------------------------------+--------------------------------------+-------------+------------+----------+
| SCOPE:NAME                                  | GUID                                 | ADLER32     | FILESIZE   |   EVENTS |
|---------------------------------------------+--------------------------------------+-------------+------------+----------|
| mc16_13TeV:RDO.20254908._000002.pool.root.1 | 5E98A824-1B8F-8D40-BDE2-982F91A24C6F | ad:744e48c6 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000006.pool.root.1 | 1ED69C25-2827-E94B-A7C7-574DF0EEA514 | ad:8a455460 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000007.pool.root.1 | CA3F494F-9927-3F42-939D-EA33DEA4A08A | ad:c30091de | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000008.pool.root.1 | 36951CBE-513E-664A-9581-B441D0803F42 | ad:f9c6074c | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000009.pool.root.1 | 139B1982-C483-6246-BF10-77CEF8959A71 | ad:cfa0c187 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000010.pool.root.1 | 7117FC61-CB50-484B-A4FF-2C1FF06D41F6 | ad:57891230 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000012.pool.root.1 | A7B2EA25-3C89-F34B-B55E-18DFA57E43CF | ad:40915520 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000013.pool.root.1 | 6255C9B3-73FF-BE41-8203-DEAFBCB2246B | ad:7ce6fa59 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000019.pool.root.1 | 7663F058-3CDE-FD47-A8E1-763191691D4A | ad:e5cedf67 | 2.236 GB   |     2000 |
| mc16_13TeV:RDO.20254908._000020.pool.root.1 | 56524AE1-E2CB-294F-AC2C-90C15E515534 | ad:e609dbda | 2.236 GB   |     2000 |
+---------------------------------------------+--------------------------------------+-------------+------------+----------+
Total files : 10
Total size : 22.360 GB
Total events : 20000

Downloading the grid data

Now that's a lot of data, and maybe for local running you don't need it all; let's download just one file to the 'scratch' area:
cd /tmp/sroe
rucio download  mc16_13TeV:RDO.20254908._000002.pool.root.1 
This (after some output indicating successful download) results in a new directory, mc16_13TeV, being created. In that directory, you should find the file you wanted.

Using the downloaded data

You now have to edit the test script to indicate the new input file, e.g.:
 x="/tmp/sroe/mc16_13TeV/RDO.20254908._000002.pool.root.1"
.
.
.
 Reco_tf.py \
      --inputRDOFile $x \
.
.
.
The important thing is that --inputRDOFile should point to the path of the new input file. If you just want to run one test, you can (temporarily) remove execute permission on the files in the 'test' directory with
chmod a-x *.sh
and only enable execution on the test you want:
chmod a+x test_myTest.sh
Then running the ART script will run only the one executable test and skip the others (is there another way? I should check). Now you can run the art job locally as above.

Working on indicomb

Introduction

Indicomb is a web scraper for Indico meetings matching specific categories. The instructions here should be valid for any similar repository.

Do once

Go to https://gitlab.cern.ch/indicomb/indicomb and click on 'fork'

In a terminal on lxplus...

Do the following :
setupATLAS
lsetup git
git clone https://:@gitlab.cern.ch:8443/sroe/indicomb.git
This has created an 'indicomb' directory with the git repository. Now add the main repository as 'upstream':
cd indicomb
git remote add upstream https://:@gitlab.cern.ch:8443/indicomb/indicomb.git

Developing code

Fetch the upstream version:
git fetch upstream
...and create a branch for your development version
git checkout -b xhtml_compliance upstream/master --no-track

Now develop, commit and push as for athena.

ITkLayout work

Introduction

I will be working locally on my Mac as far as possible, but may need to revert to using lxplus where that breaks down, and will certainly need to use lxplus when running Athena. Why? Because I expect to be working largely with the XML, and I have the luxury of Oxygen (an XML editor) installed locally.

Getting started

Accessing git

I'm using https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts/-/tree/main/ITKLayouts.

Things you will do once

In order to access the repository from my Mac, I had to copy the .ssh keys from lxplus to my Mac's .ssh directory. I also followed the instructions (slides 17/18) for my forked repository to add the CERN_USER and SERVICE_PASS variables. I then used
git clone -b master-Pixel-to-GeoModelXML https://gitlab.cern.ch/sroe/ITKLayouts/

to get a local copy of the repository. I'll see later whether this is sufficient when I come to pushing and merge requests. Having done the above, I can open ITkPixel.gmx and the DTD, geomodel.dtd, and run a validation successfully in my editor. I also confirm that the type-ahead functionality works in my editor. Note that ITkPixel.gmx only exists in the master-Pixel-to-GeoModelXML branch, not in master.

What you need to do every time

GeoModel, gmex installation and running

Introduction

There are three components to getting a visualisation of the detector:
  • The visualisation package itself, gmex, which gives the user experience
  • The GeoModelXML plug-in which enables XML description files to be read and converted to gmex internal representation
  • The source xml description files

This is a personal experience of following the instructions at https://geomodel.web.cern.ch/geomodel/home/start/install/. I am installing on a MacBook Pro (2015 model) running macOS 11.2.3. There are various pre-existing installations on this machine, including the 'brew' package installation software. I'll be installing in a directory called GeoModel.

Day 1 : Installing/running the visualisation toolset

gmex: Starting with homebrew

brew tap atlas/geomodel https://gitlab.cern.ch/GeoModelDev/packaging/homebrew-geomodel.git 
My version of homebrew needed updating, so this takes a minute. However, on taking the next step:
brew update
I get an error, and a message that I need to upgrade. So here we go,
brew upgrade
...and again an error:
Error: Not a directory @ dir_s_rmdir - /usr/local/Cellar/qt5
Somewhat optimistically, I retry the brew update and brew upgrade commands and note that the output differs somewhat, so subsequent commands seem to make progress. Various errors occur, and one suggestion to 'untap' a science package (which I follow); I keep on with brew update and brew upgrade, somewhat blindly. Again, it seems to make some progress, upgrading all my packages.

Installing packages

brew install geomodel-visualization
brew install geomodel-fullsimlight
brew install geomodel-tools

These all run OK, although I had to change ownership on the fullsimlight files, following the suggestion from homebrew.

A first visualisation

Download the geometry files
curl https://atlas-vp1.web.cern.ch/atlas-vp1/geometryfiles/geometry_atlas.db -o geometry_atlas.db
This gives a 42.6 MB db file. Now the command to see whether it all works:
gmex geometry_atlas.db

Which gives me the usual black rectangle, but I can go to the geo tab and turn on some things and then I see the shapes! Hurray!

GeoModelXml Installation

The instructions at Nigel Hessey's git repository are fairly straightforward. But on Big Sur there are problems:
Imported target "ZLIB::ZLIB" includes non-existent path

    "/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.1.sdk/usr/include"
The answer to why is complex, but the solution amounts to a few more commands:
export LDFLAGS="-L/usr/local/opt/zlib/lib;-L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib"
export PKG_CONFIG_PATH="/usr/local/opt/zlib/lib/pkgconfig"
cd ../build_gmx
rm -rf *
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=../install ../geomodelxml
make install

A visualisation from GeomodelXml

The gmex package needs to be told to load the GeoModelXML plug-in, but first the system locale language must be US English:
export LANG=en_US.UTF-8

If you simply follow the instructions and pass the plug-in path to gmex (in my case gmex /Users/sroe/install/lib/libGMXPlugin.dylib, or from the brew install location):

gmex /usr/local/lib/libGMXPlugin.dylib

then gmex will open, give an error message about a missing plug-in, and then close. What is missing? The plug-in itself dynamically loads a library and must be told where to find it:
export LD_LIBRARY_PATH=/Users/sroe/install/lib/:$LD_LIBRARY_PATH

With this in place, it is possible to open and visualise a default geometry in XML format which should be named 'gmx.xml' in your local directory.

Visualising the ITk

From the git repository, I did an xmllint --noent to build the full XML into one file (probably not necessary):
cd /Users/sroe/ITKLayouts/ITKLayouts/data/PixelGeoModelXml
xmllint --noent ITkPixel.gmx >fullPixel.xml
cd ~
ln -s  /Users/sroe/ITKLayouts/ITKLayouts/data/PixelGeoModelXml/fullPixel.xml gmx.xml
and then soft-linked this to gmx.xml in my home directory, from where I started gmex. By default gmex will open this file using the plug-in, as indicated above. The first view is a rather uninspiring grey cylinder; this is the envelope. In order to see the components, select the red arrow (upper right of the gmex screen) and command-click the cylinder. Now the component volumes (modules etc.) become visible. Colours: you can download an example colour file and link it as a hidden file in the home directory.
cd ~
curl https://gitlab.cern.ch/dellacqu/gmx_tutorial/-/raw/master/config/gmexMatVisAttributes.json?inline=false -o .gmexMatVisAttributes.json

Day 2 : Running simulation

I am running in master, and set up the latest Athena version (30 August). The command I've been given to try is:
cd ~/run
Sim_tf.py --CA --inputEVNTFile "/cvmfs/atlas-nightlies.cern.ch/repo/data/data-art/SimCoreTests/valid1.410000.PowhegPythiaEvtGen_P2012_ttbar_hdamp172p5_nonallhad.evgen.EVNT.e4993.EVNT.08166201._000012.pool.root.1" \
--outputHITSFile "test.NEW.HITS.pool.root" --maxEvents 10 --detectors ITkStrip ITkPixel Bpipe --geometryVersion 'default:ATLAS-P2-ITK-24-00-00' \
--conditionsTag 'default:OFLCOND-MC16-SDR-15' --DataRunNumber '284500' --physicsList 'FTFP_BERT_ATL' --truthStrategy 'MC15aPlus' \
--simulator 'FullG4MT' --preExec 'ConfigFlags.ITk.useLocalGeometry = True' --preInclude 'SimuJobTransforms.BeamPipeKill,SimuJobTransforms.FrozenShowersFCalOnly,SimuJobTransforms.TightMuonStepping' \
--imf False

This exits with :

EVNTtoHITS 11:11:02 RuntimeError: No such flag: ITk.useLocalGeometry  The name is likely incomplete.
In lxr I find usage of GeoModel.useLocalGeometry, so I try that. This gets a bit further, but exits with the error 'Cannot Parse the XML description'; likely I haven't told it where to find the XML geometry.

Backtracking: looking at the TWiki info (https://twiki.cern.ch/twiki/bin/viewauth/Atlas/ItkSimulationAndPerformance#Running_simulation_and_digitizat), there I find:

Sim_tf.py \
--CA \
--inputEVNTFile "/cvmfs/atlas-nightlies.cern.ch/repo/data/data-art/InDetSLHC_Example/inputs/EVNT.09244578._000001.pool.root.1" \
--outputHITSFile "test.NEW.HITS.pool.root" \
--maxEvents 2 \
--geometryVersion 'default:ATLAS-P2-ITK-24-00-00' \
--detectors ITkStrip ITkPixel Bpipe \
--conditionsTag 'default:OFLCOND-MC16-SDR-15' \
--DataRunNumber '284500' \
--physicsList 'FTFP_BERT_ATL' \
--truthStrategy 'MC15aPlus' \
--simulator 'FullG4MT' \
--preInclude 'SimuJobTransforms.BeamPipeKill,SimuJobTransforms.FrozenShowersFCalOnly,SimuJobTransforms.TightMuonStepping' \
--preExec 'ConfigFlags.GeoModel.useLocalGeometry = True' \
--imf False
Same problem; do I need to tell it where to find the XML?

In lxr, I find, for example, that

# take geometry XML files from local instance rather than Detector Database, for development
itkcf.addFlag("ITk.pixelGeometryFilename", "ITKLayouts/Pixel/ITkPixel.gmx")
itkcf.addFlag("ITk.stripGeometryFilename", "ITKLayouts/Strip/ITkStrip.gmx")
itkcf.addFlag("ITk.bcmPrimeGeometryFilename", "ITKLayouts/Pixel/BCMPrime.gmx")
..but what is that path relative to?

The ITKLayouts repository suggests doing

cd ~/athena
git clone -b main https://:@gitlab.cern.ch:8443/Atlas-Inner-Tracking/ITKLayouts.git
so I do this and then
cd ../build
cmake ../athena/Projects/WorkDir
make -j
source ../build/x86_64-centos7-gcc11-opt/setup.sh
..and I try the sim command again, to see whether these files are now available.

EVERYTHING WORKED!

How to check where the sim hits are? I'd like to look at the raw coordinates, so I can use the fact that pixel and strip hit coordinates are stored:

from ROOT import TFile

# Open the HITS file written by the Sim_tf.py job above
x = TFile('test.NEW.HITS.pool.root')
myTree = x.Get("CollectionTree")
for event in myTree:
  print("NEW EVENT")
  print(event)
  # The persistified SiHit collections store the hit start points in
  # parallel arrays; zip them into (x, y, z) tuples.
  obj = event.SiHitCollection_p3_ITkStripHits
  strips = list(zip(obj.m_hit1_x0, obj.m_hit1_y0, obj.m_hit1_z0))
  obj2 = event.SiHitCollection_p3_ITkPixelHits
  pixels = list(zip(obj2.m_hit1_x0, obj2.m_hit1_y0, obj2.m_hit1_z0))
  print("STRIPS")
  print(strips)
  print("PIXELS")
  print(pixels)
x.Close()

Now I can see the raw hit coords and see whether anything changes when I change the material, without going to the reco step (yet).

Day 3 : Establishing Goals

I will be working on the Outer Pixel Services:

OuterPixelBarrel.jpg

Let's start with the outer barrel (layers 2, 3, 4). What does this consist of? Reference: https://indico.cern.ch/event/881276/attachments/2006375/3351061/ITk_Pixel_Outer_Barrel__Service_Description_19032020.pdf

  • 'Pigtails': flex circuits connecting the module to the PP0. Pigtails come in different varieties according to the orientation of the modules.
  • On-detector services: the 'distributed PP0', including MOPS DCS chips on the end and a Type-1 connector for LV/HV/DCS, which connects a cable to PP1
  • 'Type-1 services' : linking PP0 and PP1

Let's look for one item, the outer pixel barrel PP0, and see what data we can find; the goal is to describe this in GeoModelXML and introduce it into the simulation. EDMS reference: https://edms.cern.ch/ui/#!master/navigator/project?P:100557333:100766686:subDocs This is probably most relevant: https://edms.cern.ch/ui/#!master/navigator/document?P:1557173888:100770622:subDocs, giving the PP0 stack-up and total masses per longeron (two each end for the flat section) or per quarter-ring (inclined section).

OuterBarrelFlatPP0Rough.jpg

OuterBarrelInclinedPP0Schematic.jpg

Let's start with the flat one. Each longeron supports 36 modules, 18 per end (A or C), and the PP0 is divided into two parts per end: a 'short' PP0 which serves the central 6 on each side and a 'long' PP0 which serves the remaining 12 on each side. The PP0 services may be identified by the identifiers SP-{A or C}-{01 or 02}; 01 = central 6, 02 = outer 12.

So I'll start with boxes for the short PP0 flexes.

A lot of faffing ensues while I discover I still don't understand the overall dimensions.

Start poking in the XML files.

Look at the gdml output from Noemi:

export GDML_FILE_NAME=/Users/sroe/Pixel.gdml 
gmex  /usr/local/lib/libGDMLtoGM.dylib

...Noemi has the pixel gdml output indicating the kind of granularity I need for the PP0 (and it can be visualised in gmex). I think I can use this (with those dimensions) and progress further.

Day 4

Spent faffing about with colours and trying to find services in the gdml.

Looking again, it looks like the PP0s are actually just implemented as cylinders, and the module services are implemented as running along the longeron, but separate from it; I think these are the pigtails joining the modules to the PP0.

Day 5

Looking again... filtering the gdml to give me the PP0:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xd="http://www.oxygenxml.com/ns/doc/xsl" exclude-result-prefixes="xd" version="1.0">
    <xd:doc scope="stylesheet">
        <xd:desc>
            <xd:p><xd:b>Created on:</xd:b> Oct 14, 2021</xd:p>
            <xd:p><xd:b>Author:</xd:b> sroe</xd:p>
            <xd:p/>
        </xd:desc>
    </xd:doc>
    <xsl:output indent="yes" method="xml"/>
    <!-- default to no output -->
    <xsl:template match="@* | node()"/>
    <xsl:template match="gdml">
        <gdml xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:noNamespaceSchemaLocation="http://service-spi.web.cern.ch/service-spi/app/releases/GDML/schema/gdml.xsd">
            <define/>
            <xsl:apply-templates/>
        </gdml>
    </xsl:template>
    <xsl:template match="materials">
        <materials>
            <xsl:copy-of select="isotope"/>
            <xsl:copy-of select="element"/>
            <!-- we need air to be defined -->
            <xsl:copy-of select="material[contains(@name, 'Air')]"/>
            <xsl:copy-of select="material[contains(@name, 'SvcBrlPP0')]"/>
        </materials>
    </xsl:template>
    <xsl:template match="structure">
        <structure>
            <xsl:copy-of select="volume[contains(@name, 'SvcBrlPP0')]"/>
            <xsl:apply-templates select="volume[contains(@name,'Pixel__Pixel0x')]"/>
            <xsl:copy-of select="volume[contains(@name, 'IDET__IDET')]"/>
        </structure>
    </xsl:template>
    <xsl:template match="setup">
        <xsl:copy-of select="."/>
    </xsl:template>
    <xsl:template match="volume">
        <volume name="{@name}">
            <xsl:copy-of select="materialref"/>
            <xsl:copy-of select="solidref"/>
            <!-- now need to select which elements you want -->
            <xsl:copy-of select="physvol[contains(@name, 'SvcBrlPP0')]"/>
        </volume>
    </xsl:template>
    <xsl:template match="solids">
        <solids>
            <xsl:copy-of select="*[contains(@name, 'SvcBrlPP0')]"/>
            <!-- we need the pixel container shape to be defined -->
            <xsl:copy-of select="*[contains(@name, 'Pixel0x')]"/>
            <xsl:copy-of select="*[contains(@name, 'IDET0x')]"/>
        </solids>
    </xsl:template>
</xsl:stylesheet>
(updated 24 November from original)

This results in a ~520-line file containing all the isotopes and elements, but only the solids and other materials required by the PP0. Note that the order of the elements matters; the referenced elements must appear in the file before the element which refers to them. Presumably the gmex gdml parser is using the event-driven SAX model and needs to find the referenced element already in memory.

PP0 elements only:
image_2021_10_15T12_52_27_404Z.png

Day 7

I then started to translate the gdml elements to GeoModelXML format, using the dtd as a guide and the extracted gdml as a source. I started to do this automatically, but for this small number of elements it hardly seemed worth it, so I did them by hand. The created gmx file is complete and valid, and contains ~150 lines for the six elements (PP0 A & C for the three layers) and their materials.

PP0 recreated in GeoModelXml:
PP0GeoModelXml.png

One peculiarity is that the GeoModelXML version has a visible gap between the innermost shells (A and C), whereas the gdml version does not (I think it should have, looking at the dimensions).

Day 8

Pasting back the extracted gdml-to-gmx conversion, I see a small problem:

PP0_gmx_pasteback.png

The new outer cylinder representing PP0 clashes with the existing longerons, and the other PP0 layers are nowhere visible.

Day 9

After correcting some omissions:

PP0_gmx_pasteback2.png

How to reconcile this with the pure-gdml picture?

gdml_only.png

Apparently this is a known feature: the services described as PP0 are implemented as an artificial construct of material smeared over a cylinder and do not really exist as placed objects, so clashes might be expected. However, to resolve this, Noemi proposed (in December 2020) simply moving the PP0 layers out:

https://indico.cern.ch/event/976601/contributions/4125778/attachments/2154185/3633033/AUW_SW_20201202.pdf

(around page 11)

Recoloured and adjusted:

PP0_gmx_recoloured_ajusted.png

Of course, by doing this (and keeping the same material density) I have increased the overall mass very slightly. The non-clashing internal radii (in mm) are:

  • 214.75 (same)
  • 282.5 (was 274.75)
  • 346.5 (was 334.75)

maintaining the thickness of 0.5 mm.

Day 10

Moving on to the Cooling. Noemi tells me the cooling in 21.9 was routed inside the longerons, which I do not see, so I decide to regenerate the gdml.
Sim_tf.py --inputEVNTFile "/afs/cern.ch/user/n/noemi/public/mc15_14TeV_single_mu_Pt100/*.pool.root.1" --maxEvents 1 --conditionsTag 'all:OFLCOND-MC15c-SDR-14-05' --randomSeed 'all:873254' --truthStrategy 'MC15aPlus' --imf 'all:False' --postInclude 'all:InDetSLHC_Example/postInclude.SLHC_Setup_ITK.py' 'EVNTtoHITS:BeamEffects/postInclude.CrabKissingVertexPositioner_Nominal.py' --preInclude 'all:InDetSLHC_Example/preInclude.NoTRT_NoBCM_NoDBM.py,InDetSLHC_Example/preInclude.SLHC.py,InDetSLHC_Example/preInclude.SLHC_Setup.py,InDetSLHC_Example/preInclude.SLHC_Setup_Strip_GMX.py,SimulationJobOptions/preInclude.CalHits.py,SimulationJobOptions/preInclude.ParticleID.py,InDetSLHC_Example/preInclude.VolumeDebugger.py' --DataRunNumber '242000' --postExec 'EVNTtoHITS:ServiceMgr.DetDescrCnvSvc.DoInitNeighbours=False;from AthenaCommon import CfgGetter; CfgGetter.getService("ISF_MC15aPlusTruthService").BeamPipeTruthStrategies+=["ISF_MCTruthStrategyGroupIDHadInt_MC15"];ServiceMgr.PixelLorentzAngleSvc.ITkL03D = True;' --geometryVersion 'all:ATLAS-P2-ITK-23-00-03_VALIDATION' --HGTDOn 'True'

...which generates a 200 MB file!! So I write a transform to extract the pixels:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xd="http://www.oxygenxml.com/ns/doc/xsl" exclude-result-prefixes="xd" version="1.0">
    <xd:doc scope="stylesheet">
        <xd:desc>
            <xd:p><xd:b>Created on:</xd:b> Nov 16, 2021</xd:p>
            <xd:p><xd:b>Author:</xd:b> sroe</xd:p>
            <xd:p/>
        </xd:desc>
    </xd:doc>
    <xsl:output indent="yes" method="xml"/>
    <!-- default to no output -->
    <xsl:template match="@* | node()"/>
    <xsl:template match="gdml">
        <gdml xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:noNamespaceSchemaLocation="http://service-spi.web.cern.ch/service-spi/app/releases/GDML/schema/gdml.xsd">
            <define/>
            <xsl:apply-templates/>
        </gdml>
    </xsl:template>
    <xsl:template match="materials">
        <xsl:copy-of select="."/>
    </xsl:template>
    <xsl:template match="solids">
    <solids>
    <xsl:copy-of select="*[not (contains(@name,'SCT__') or contains(@name,'HGTD__'))]"/>
    </solids>
    </xsl:template>
    <xsl:template match="structure">
    <structure>
    <xsl:copy-of select="*[not (contains(@name,'SCT__') or contains(@name,'HGTD__') or contains(@name,'IDET__'))]"/>
     <xsl:apply-templates select="volume[contains(@name, 'IDET__')]"/>
    </structure>
    </xsl:template>
    <xsl:template match="volume">
        <volume name="{@name}">
          <xsl:copy-of select="*[not (contains(@name,'SCT__') or contains(@name,'HGTD__'))]"/>
         </volume>
    </xsl:template>
    <xsl:template match="setup">
    <xsl:copy-of select="."/>
    </xsl:template>
</xsl:stylesheet>

...which reduces this to about 20 MB.

Day 11

I still don't see the cooling services inside the longeron, and want to compare the longeron positions. I can do this if I translate the longerons into GeoModelXML from the gdml. The gdml elements which make up the longerons are some kind of planar extrusion, with element type xtru:
 <xtru lunit="mm" name="brlWall10x389bcf00">
      <twoDimVertex x="0" y="14.8"/>
      <twoDimVertex x="-19.0175417312032" y="14.8"/>
      <twoDimVertex x="-19.0175417312032" y="15.1"/>
      <twoDimVertex x="0" y="15.1"/>
      <section scalingFactor="1" xOffset="0" yOffset="0" zOrder="0" zPosition="-377"/>
      <section scalingFactor="1" xOffset="0" yOffset="0" zOrder="1" zPosition="377"/>
    </xtru>

What's the equivalent in GeoModelXML? Noemi pointed me to this repository, which is clearly a more recent version of Nigel Hessey's original and contains more elements in the dtd for the XML:

  <!ELEMENT shapes ((box|cons|generictrap|para|pcon|pgon|trap|tube|tubs|trd|intersection|subtraction|union|simplepolygonbrep)+)>
    <!-- All shapes allowed in GeoModel manual. Same name, parameters, parameter order, but always lower case -->
...and then she also pointed me at this:

https://gitlab.cern.ch/GeoModelDev/GeoModel/-/blob/master/GeoModelTools/GDMLtoGM/src/xtruHandler.cxx

...which deals exactly with the xtru element and creates (in GeoModel) a SimplePolygonBrep. So now I know what to do, and can extract and transform the gdml to gmx using this algorithm in XSLT directly (I hope); a sketch of the conversion is below.
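Before committing to the XSLT, a small Python sketch of the conversion (illustrative only; the simplepolygonbrep attribute names xpoints, ypoints and zhalflength are my assumption from the GeoModel shape parameters and should be checked against the dtd):

import xml.etree.ElementTree as ET

def xtru_to_simplepolygonbrep(xtru):
    # Handles only the simple case above: two sections, symmetric in z,
    # no offsets or scaling. Output attribute names are assumptions.
    xs = [v.get("x") for v in xtru.findall("twoDimVertex")]
    ys = [v.get("y") for v in xtru.findall("twoDimVertex")]
    zs = sorted(float(s.get("zPosition")) for s in xtru.findall("section"))
    assert len(zs) == 2 and zs[0] == -zs[1], "only symmetric two-section extrusions handled"
    return ET.Element("simplepolygonbrep", {
        "name": xtru.get("name"),
        "xpoints": ";".join(xs),
        "ypoints": ";".join(ys),
        "zhalflength": str(zs[1]),
    })

xtru = ET.fromstring(
    '<xtru lunit="mm" name="brlWall10x389bcf00">'
    '<twoDimVertex x="0" y="14.8"/><twoDimVertex x="-19.0175417312032" y="14.8"/>'
    '<twoDimVertex x="-19.0175417312032" y="15.1"/><twoDimVertex x="0" y="15.1"/>'
    '<section scalingFactor="1" xOffset="0" yOffset="0" zOrder="0" zPosition="-377"/>'
    '<section scalingFactor="1" xOffset="0" yOffset="0" zOrder="1" zPosition="377"/>'
    '</xtru>')
print(ET.tostring(xtru_to_simplepolygonbrep(xtru), encoding="unicode"))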

Day 12

I'm seeing strange things in the extracted Pixel detector from the generated file, so I decide to re-generate it, using a hint from Noemi to generate only the Pixel detector. With reference to the debugger code:

https://acode-browser1.usatlas.bnl.gov/lxr/source/athena/InnerDetector/InDetExample/InDetSLHC_Example/share/preInclude.VolumeDebugger.py?v=21.9

export TARGETVOLUME=Pixel::Pixel
Sim_tf.py --inputEVNTFile "/afs/cern.ch/user/n/noemi/public/mc15_14TeV_single_mu_Pt100/*.pool.root.1" --maxEvents 1 --conditionsTag 'all:OFLCOND-MC15c-SDR-14-05' --randomSeed 'all:873254' --truthStrategy 'MC15aPlus' --imf 'all:False' --postInclude 'all:InDetSLHC_Example/postInclude.SLHC_Setup_ITK.py' 'EVNTtoHITS:BeamEffects/postInclude.CrabKissingVertexPositioner_Nominal.py' --preInclude 'all:InDetSLHC_Example/preInclude.NoTRT_NoBCM_NoDBM.py,InDetSLHC_Example/preInclude.SLHC.py,InDetSLHC_Example/preInclude.SLHC_Setup.py,InDetSLHC_Example/preInclude.SLHC_Setup_Strip_GMX.py,SimulationJobOptions/preInclude.CalHits.py,SimulationJobOptions/preInclude.ParticleID.py,InDetSLHC_Example/preInclude.VolumeDebugger.py' --DataRunNumber '242000' --postExec 'EVNTtoHITS:ServiceMgr.DetDescrCnvSvc.DoInitNeighbours=False;from AthenaCommon import CfgGetter; CfgGetter.getService("ISF_MC15aPlusTruthService").BeamPipeTruthStrategies+=["ISF_MCTruthStrategyGroupIDHadInt_MC15"];ServiceMgr.PixelLorentzAngleSvc.ITkL03D = True;' --geometryVersion 'all:ATLAS-P2-ITK-23-00-03_VALIDATION' --HGTDOn 'True'
Even in this file, there is definitely something strange: a view down the beam pipe seems to show a line of modules.

ScrewyBeamPipeView.png

Day 13: Making a Merge Request

So I set up Athena (on lxplus) and did a clone of the repository.
git clone -b master-Pixel-to-GeoModelXML https://gitlab.cern.ch/sroe/ITKLayouts/

After making changes I can commit stuff (good) and push to my fork. But trying to make an MR reveals many conflicts, apparently due to my fork being old. How to update it? Trying this:

git clone https://gitlab.cern.ch/sroe/ITKLayouts/
cd ITKLayouts/
git remote add upstream https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts
git fetch upstream
git merge upstream/main

did not work, and also ended up with many merge conflicts. Can I do

git pull upstream main
? This also gives conflicts, e.g.
CONFLICT (content): Merge conflict in ITKLayouts/data/Strip/Barrel.gmx
How to simply set my fork to a copy of the master?

... so after several incantations involving fetch, pull, overwrite and merge, and an abortive MR which ended up at the main repository, I decided to delete my ITKLayouts fork and start again. First set up Athena as usual, then go to the web page

https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts

And click on the 'Fork' button. I'm presented with some options and I choose my own user space (Shaun Roe)

Now I have my own new repository at https://gitlab.cern.ch/sroe/ITKLayouts, but I guess I need to add something here in the settings, according to https://indico.cern.ch/event/966572/contributions/4067746/attachments/2124250/3576085/Continuous_Integration_Validation_for_ITKLayouts-1.pdf

So I go to 'settings' menu, CI/CD submenu and add the Variables CERN_USER and SERVICE_PASS.

Showing fork dialogue choices

Now start again, cloning the repository locally. This doesn't work:

git clone -b master-Pixel-to-GeoModelXML https://gitlab.cern.ch/sroe/ITKLayouts/

... but this does:

git clone https://gitlab.cern.ch/sroe/ITKLayouts/
(with warnings about not being able to open a display, and prompting for my username and password) ...probably I forgot something related to setting 'upstream'?

git remote add upstream https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts
git fetch upstream

...is ok. 'git status' shows

On branch main
Your branch is up to date with 'origin/main'.

nothing to commit, working tree clean

now try

git checkout -b main_PP0_trial upstream/main --no-track

..which seems to work. Now I make changes to the materials.gmx file and commit those changes. Now push:

git push --set-upstream origin main_PP0_trial
(again asking for my username and password every time), and I use the returned http address to generate an MR, which seems to go successfully.

AH! The returned link is to make an MR to my own repository, not the main one.

To do that, I go back to my repository, find the branch I created and make a 'merge request' from there, selecting the target to be Atlas-Inner-Tracking/ITKLayouts. Not sure why this is different between Athena and the ITk repository. I also get a scary warning that 'this is a merge request from a private repository to an internal project'; not sure why that is. Finally my MR made it in: https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts/-/merge_requests/207

Day 14:Clash detection

Unfortunately my MR introduced some clashes, which were subsequently detected by Nick using 'gmclash'. This runs on a Mac, but there's no binary for our Linux system, so I'm going to work locally and see how that goes, starting from a checkout of the ITKLayouts.

export LANG=en_US.UTF-8
export LD_LIBRARY_PATH=/Users/sroe/install/lib/:$LD_LIBRARY_PATH
cd  /Users/sroe/Documents/Physics/HEP/Atlas/InnerDetector/ITK/ITKLayouts/ITKLayouts/data/Pixel
ln -s /Users/sroe/Documents/Physics/HEP/Atlas/InnerDetector/ITK/ITKLayouts/ITKLayouts/data/Pixel/ITkPixel.gmx gmx.xml
gmex /Users/sroe/install/lib/libGMXPlugin.dylib
Note that (for some reason), I had to re-install (via brew) geomodel-thirdparty-geant4; the G4expat library was previously missing.

Running gmclash

gmclash -g/usr/local/lib/libGMXPlugin.dylib
I can see the clashes. Moving the outer PP0 cylinder in by 0.5mm appears to solve this; submitted MR https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts/-/merge_requests/209

Going back to the gdml I was previously looking at, there is still clearly a bug worth isolating and reporting. The extra elements I see in the centre of the detector are from Pixel__brlWall10x3853ec70, which is one side wall of a longeron. I'll try deleting some of the other elements to see whether I can provide a minimal example.

Day 15: Buggy GDML?

I cut down the original GDML to just a few elements, and I can see there is a longeron wall inserted right down the middle of the detector. Noemi doesn't see this (using the same file) in her version of gmex, so it appears to be a bug in the brew installation on the Mac. Filed a Jira ticket: https://its.cern.ch/jira/browse/ATLASDD-44.

Day 16: Recompiling from source

So I tried following the instructions at https://gitlab.cern.ch/GeoModelDev/GeoModel/-/tree/master/documentation/docs/dev, and it seems that I succeeded. However, when I view that gdml, I still see the same problem of modules right down the beam pipe. Perhaps I'm still using an old version of the libraries, and need to uninstall them from brew?

Day 17: Returning to task

I would like to work locally on my mac, so from my ITK directory I do:
git clone https://gitlab.cern.ch/sroe/ITKLayouts/
cd ITKLayouts
git remote add upstream https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts
git pull upstream main
git merge upstream/main
git push --set-upstream origin main
..which seems to have updated my fork, and I have the local repo in my ITk directory.

Days 18, 19: Inclined section PP0

Extraction of just the inclined-section gdml was done fairly easily. It needs to be done piece by piece; extract the materials first.

Days 20-22

Managed to push the MR for the inclined-section PP0 and implement the suggestions (also screwed up my repo in the process, but seem to have recovered).

Day 23: Barrel Module Services

Managed to extract the Barrel Module services from the gdml into a standalone gdml.

BrlModuleSvcExtractionImage_21_9.png

Day 24: Rationalising material

The standalone gdml extracted from the Pixel 21.9 gdml file contains a lot of redundancy: materials with different names are ostensibly the same material, but are referenced by their various names in the materialref elements. I used Muenchian grouping in XSLT to produce just one material for each density (considering this to be a unique identifier for each type of material) and to ensure the materialref references are consistent with this (for intuition, see the Python sketch after the stylesheet):
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xd="http://www.oxygenxml.com/ns/doc/xsl" exclude-result-prefixes="xd" version="1.0">
    <xd:doc scope="stylesheet">
        <xd:desc>
            <xd:p><xd:b>Created on:</xd:b> Mar 9, 2022</xd:p>
            <xd:p><xd:b>Author:</xd:b> sroe</xd:p>
            <xd:p>extract unique materials, selected by density</xd:p>
        </xd:desc>
    </xd:doc>
    <xsl:output indent="yes" method="xml"/>
    <!-- default to no output -->
    <xsl:template match="@* | node()"/>
    <xsl:key name="materials_by_density" match="material" use="D/@value"/>
    <xsl:template match="gdml">
        <gdml xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:noNamespaceSchemaLocation="http://service-spi.web.cern.ch/service-spi/app/releases/GDML/schema/gdml.xsd">
            <define/>
            <xsl:apply-templates/>
            <setup name="Default" version="1.0">
                <world ref="Pixel__Pixel0x379617b0"/>
            </setup>
        </gdml>
    </xsl:template>
    <xsl:template match="materials">
        <materials>
            <xsl:copy-of select="isotope"/>
            <xsl:copy-of select="element"/>
            <!-- <xsl:copy-of select="material[contains(@name, 'Air')]"/> -->
            <xsl:for-each
                select="material[generate-id() = generate-id(key('materials_by_density', D/@value)[1])]">
                <material name="{key('materials_by_density', D/@value)[1]/@name}" state="{@state}">
                    <xsl:for-each select="*">
                        <xsl:copy-of select="."/>
                    </xsl:for-each>
                </material>
            </xsl:for-each>
        </materials>
    </xsl:template>
    <xsl:template match="solids">
        <solids>
            <xsl:for-each select="*">
                <xsl:copy-of select="."/>
            </xsl:for-each>
        </solids>
    </xsl:template>
    <xsl:template match="structure">
        <structure>
            <xsl:for-each select="volume">
                <volume name="{@name}">
                    <xsl:variable name="thisRef" select="./materialref/@ref"/>
                    <materialref
                        ref="{key('materials_by_density', //material[@name = $thisRef]/D/@value)[1]/@name}"/>
                    <solidref ref="{solidref/@ref}"/>
                    <xsl:copy-of select="physvol"/>
                </volume>
            </xsl:for-each>
        </structure>
    </xsl:template>

</xsl:stylesheet>
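For intuition, the same grouping idea as a Python sketch (not part of the workflow; the input file name is hypothetical): keep the first material seen at each density as the representative, and remap every materialref onto it.

import xml.etree.ElementTree as ET

tree = ET.parse("pixel_extract.gdml")  # hypothetical input file name
root = tree.getroot()
materials = root.find("materials").findall("material")

# First material seen at each density becomes the representative
rep_by_density = {}
for mat in materials:
    rep_by_density.setdefault(mat.find("D").get("value"), mat.get("name"))

# Map every material name to its representative's name
rename = {mat.get("name"): rep_by_density[mat.find("D").get("value")]
          for mat in materials}

# Remap the materialref in each volume (unknown refs are left untouched)
for ref in root.iter("materialref"):
    ref.set("ref", rename.get(ref.get("ref"), ref.get("ref")))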

Rationalising shapes

The boxes have the same problem: differently referenced boxes having the same x, y, z parameters. Rationalising these, on top of the previous rationalisation, is a little more cumbersome but still possible:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xd="http://www.oxygenxml.com/ns/doc/xsl" exclude-result-prefixes="xd" version="1.0">
    <xd:doc scope="stylesheet">
        <xd:desc>
            <xd:p><xd:b>Created on:</xd:b> Mar 9, 2022</xd:p>
            <xd:p><xd:b>Author:</xd:b> sroe</xd:p>
            <xd:p>extract unique boxes, selected by x y z</xd:p>
        </xd:desc>
    </xd:doc>
    <xsl:output indent="yes" method="xml"/>
    <!-- default to no output -->
    <xsl:template match="@* | node()"/>
    <xsl:key name="boxes-by-position" match="box" use="concat(@x,@y,@z)"/>
    <xsl:template match="gdml">
        <gdml xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:noNamespaceSchemaLocation="http://service-spi.web.cern.ch/service-spi/app/releases/GDML/schema/gdml.xsd">
            <define/>
            <xsl:apply-templates select="materials"/>
             <xsl:apply-templates select="solids"/>
            <xsl:apply-templates select="structure"/>
            <setup name="Default" version="1.0">
                <world ref="Pixel__Pixel0x379617b0"/>
            </setup>
        </gdml>
    </xsl:template>
    <xsl:template match="materials">
        <materials>
            <xsl:copy-of select="isotope"/>
            <xsl:copy-of select="element"/>
            <xsl:copy-of select="material"/>
        </materials>
    </xsl:template>
    <xsl:template match="solids">
        <solids>
            <xsl:for-each select="tube">
                <xsl:copy-of select="."/>
            </xsl:for-each>
            <xsl:for-each select="box[generate-id() = generate-id(key('boxes-by-position', concat(@x,@y,@z))[1])]">
            <xsl:copy-of select="."/>
            </xsl:for-each>
        </solids>
    </xsl:template>
    <xsl:template match="structure">
        <structure>
            <xsl:for-each select="volume[not(contains(materialref/@ref ,'Air'))]">
                <volume name="{@name}">
                    <xsl:variable name="thisRef" select="./solidref/@ref"/>
                     <materialref ref="{materialref/@ref}"/>
                    <solidref
                        ref="{key('boxes-by-position', concat(//box[@name = $thisRef]/@x, //box[@name = $thisRef]/@y,//box[@name = $thisRef]/@z))[1]/@name}"/>
                   
                    <xsl:copy-of select="physvol"/>
                </volume>
            </xsl:for-each>
             <xsl:for-each select="volume[contains(materialref/@ref ,'Air')]">
              <xsl:copy-of select="."/>
            </xsl:for-each>
        </structure>
    </xsl:template>

</xsl:stylesheet>

This results in a 49,453-line XML file, whereas the original (before both rationalisations) was 69,724 lines: a ~20k-line decrease. Checking the display shows the same structure.

Day 25: Killing orphans

The resulting XML had a lot of orphaned shapes and volumes, i.e. they had no corresponding solidref or volumeref, so I remove these:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xd="http://www.oxygenxml.com/ns/doc/xsl" exclude-result-prefixes="xd" version="1.0">
    <xd:doc scope="stylesheet">
        <xd:desc>
            <xd:p><xd:b>Created on:</xd:b> Mar 9, 2022</xd:p>
            <xd:p><xd:b>Author:</xd:b> sroe</xd:p>
            <xd:p>remove orphaned volumes</xd:p>
        </xd:desc>
    </xd:doc>
    <xsl:output indent="yes" method="xml"/>
    <!-- default to no output -->
    <xsl:template match="@* | node()"/>
    <xsl:key name="boxes-by-position" match="box" use="concat(@x, @y, @z)"/>
    <xsl:template match="gdml">
        <gdml xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:noNamespaceSchemaLocation="http://service-spi.web.cern.ch/service-spi/app/releases/GDML/schema/gdml.xsd">
            <define/>
            <xsl:copy-of select="materials"/>
            <xsl:apply-templates select="solids"/>
            <xsl:apply-templates select="structure"/>
            <setup name="Default" version="1.0">
                <world ref="Pixel__Pixel0x379617b0"/>
            </setup>
        </gdml>
    </xsl:template>
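    <!-- NB: this 'materials' template is never applied; the materials element is copied verbatim above -->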
    <xsl:template match="materials">
        <materials>
            <xsl:copy-of select="isotope"/>
            <xsl:copy-of select="element"/>
            <xsl:copy-of select="material"/>
        </materials>
    </xsl:template>

    <xsl:template match="structure">
        <structure>
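            <!-- keep only volumes that are actually referenced by a volumeref somewhere -->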
            <xsl:for-each select="volume[//volumeref/@ref = @name]">
                <xsl:copy-of select="."/>
            </xsl:for-each>
            <volume name="Pixel__Pixel0x379617b0">
                <materialref ref="Air0x34fbb180"/>
                <solidref ref="Pixel0x37926d20"/>
                <physvol name="av_420_impr_1_Pixel__Layer2_pv_2_Layer20x38f37270">
                    <volumeref ref="Pixel__Layer20x3705eb40"/>
                </physvol>
                <physvol name="av_420_impr_1_Pixel__Layer3_pv_3_Layer30x38f372c0">
                    <volumeref ref="Pixel__Layer30x37fe3b00"/>
                </physvol>
                <physvol name="av_420_impr_1_Pixel__Layer4_pv_4_Layer40x38f37310">
                    <volumeref ref="Pixel__Layer40x37e57710"/>
                </physvol>
            </volume>
        </structure>
    </xsl:template>
    <xsl:template match="solids">
        <solids>
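            <!-- keep only solids that are actually referenced by a solidref somewhere -->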
            <xsl:for-each select="*[//solidref/@ref = @name]">
                <xsl:copy-of select="."/>
            </xsl:for-each>
        </solids>
    </xsl:template>

</xsl:stylesheet>
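The invocation is the same as before, chaining from the previous output (file names again illustrative):

xsltproc removeOrphans.xsl uniqueBoxes.gdml > noOrphans.gdml
grep -c "<box" noOrphans.gdml   # rough check of how many box solids survive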

Day 30: Completing the Svc Layers

I managed to rationalise the material definitions and finally described the services in GeoModelXml for Layers 2, 3, 4 of the Barrel. What are the next targets? Looking at the grouped histograms from 21.9:
############materialsTable#############
Material                                           eta=0.0  | eta=1.0  | eta=2.0  | eta=3.0  | eta=4.0   (x0 added)
SvcBrlT1                                           0.000000 | 0.007207 | 0.142807 | 0.000000 | 0.000000
SvcEc                                              0.000000 | 0.000277 | 0.086769 | 0.000000 | 0.000000
SvcEcT0TwdBS                                       0.000000 | 0.000000 | 0.003522 | 0.047532 | 0.000000
SvcEcT1                                            0.000000 | 0.000000 | 0.001314 | 0.132485 | 0.000000
SvcEccooling                                       0.000000 | 0.000294 | 0.017464 | 0.000000 | 0.000000
pix::PP1_T2_R180_R347_InnerCooling_Fixed_Weight    0.000000 | 0.000000 | 0.000000 | 0.013564 | 0.000000
Brl__M                                             0.014887 | 0.028474 | 0.000000 | 0.000000 | 0.000000
pix::Chip_FourChip_PlusTPGCooling                  0.030004 | 0.045043 | 0.033722 | 0.000000 | 0.000000
PP0                                                0.051067 | 0.081320 | 0.041713 | 0.000000 | 0.000000
########################################
Total                                              0.095958 | 0.162615 | 0.327311 | 0.193580 | 0.000000

I would say that the T1 services (transition regions?) deserve attention. I'll start with the barrel, in the same way as before: from the gdml, extract the Barrel T1 services.
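As a quick first look, something like this should pull out the candidate volumes (a sketch: it assumes the 21.9 gdml is in Pixel.gdml and that the barrel T1 volume names contain 'SvcBrlT1', as in the table above):

xmllint --xpath "//volume[contains(@name,'SvcBrlT1')]" Pixel.gdml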

Coming back: November 23

Two services remain to be implemented: InnerCooling_Fixed_Weight and FourChip_PlusTPGCooling. In the meantime, the ITk workflow has changed somewhat (the authentication process), so I'll deal with that first.

My naïve first attempt,

git clone -b main https://:@gitlab.cern.ch:8443/Atlas-Inner-Tracking/ITKLayouts.git 
into a local area (on my machine) immediately fails with
HTTP Basic: Access denied.
My various attempts at updating my fork from my local machine failed, so let's try from lxplus. Again, a simple
 git clone https://gitlab.cern.ch/sroe/ITKLayouts/
failed because access was denied. I suspected this was due to the new authentication method, but didn't yet know how to change my fork or local area to fix it. I tried deleting the CI/CD username and password variables in my fork, using https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts/-/merge_requests/270/diffs as a guide. I then discovered I had been entering the password instead of the username in the GUI when cloning; with the correct credentials, the clone works. I also deleted those username variables in the CI/CD settings of my fork. Once again I get merge conflicts when I attempt a merge after a fetch, so it might be easier to delete my fork and start again.
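For the record, the standard way to bring a fork up to date, rather than deleting it, would be something like this (a sketch; it assumes the branch is 'main' and uses the upstream URL from above):

# one-off: register the original repository as 'upstream'
git remote add upstream https://gitlab.cern.ch/Atlas-Inner-Tracking/ITKLayouts.git
git fetch upstream
# replay local work on top of upstream instead of merging
git checkout main
git rebase upstream/main
git push --force-with-lease origin main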

How did I delete my fork previously? In the Settings menu there is General -> Advanced -> Delete. After the various safety confirmations, I successfully deleted the forked project.

Dec 6

After some time off, I am trying to filter the 21.9 gdml down to the InnerCooling_Fixed_Weight parts only. At first it seems I get only the mother volumes, with no components. After some manipulation I get some segments, which I then try to identify in the 21.9 gdml, here:
InnerCoolingFixedWeightBit.png

May 25 2023: gmex on Mac M1

Starting up gmex using
export GDML_FILE_NAME=/Users/sroe/Pixel.gdml
gmex  /usr/local/lib/libGDMLtoGM.dylib
results in a segmentation fault. The missing magic ingredient is this:
export DYLD_LIBRARY_PATH=/usr/local/lib:/opt/homebrew/lib
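Putting the pieces together, the working startup sequence on the M1 Mac is:

export DYLD_LIBRARY_PATH=/usr/local/lib:/opt/homebrew/lib
export GDML_FILE_NAME=/Users/sroe/Pixel.gdml
gmex /usr/local/lib/libGDMLtoGM.dylib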

Using VSCode for Athena development

Introduction

I typically use a Mac and SSH into lxplus or a more performant machine at CERN. I usually use a sparse checkout of Athena and have set up a script which does single-button compilation in the Terminal application. As an editor I use BBEdit (I paid for it), which has a few language-server capabilities (enabling type-ahead etc.), but I can't use many of them easily with Athena. I would like to use an IDE and, following the presentation in the software weekly, https://indico.cern.ch/event/1280493/, decided to try VS Code.

Setting up

So I downloaded VS Code for Mac, from here, and then installed the Remote Development extension, which opens VS Code and installs itself from the link.

The instructions on the software page say "For this reason they always require .vscode to be present in the sparse checkout". What does this mean in practice?
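My guess at what it means in practice, assuming the checkout uses git's built-in sparse-checkout machinery (git >= 2.26), is that the .vscode directory has to be added explicitly to the checked-out paths:

# from the root of the athena clone
git sparse-checkout add .vscode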

Summer Students Project Coordination

I am the ATLAS summer students projects coordinator. This is a pretty annoying task due to the way the information flow is designed. The job consists of the following:
  • Sending out the call for summer students projects
  • Collecting the information about the projects and prioritising them
  • Submitting the priorities to the summer student team
  • For Non-Member States students, requesting that Atlas/CERN fund some positions (usually four positions)
  • Selecting or requesting the remaining unoccupied projects to be allocated for the NMS students

There is one problem: once the projects are submitted, I have no access to the information; it is provided in a spreadsheet sent by mail the day after the deadline. After that I have no access to information about which projects were filled, or how. I am sometimes asked whether a particular student has been given a project, or whether a particular project has been filled; sometimes I am asked, quite reasonably, about the overall statistics of the ATLAS success rate for projects. Again, I have no access to this information. This also makes the assignment of NMS students difficult.

How do I rate projects? I'll admit that this is a very rough assessment; I rank each project in five categories:

  1. Is it specific?
  2. Is it of broad educational value to the student (inside and outside HEP)?
  3. Is it achievable in the time given?
  4. Is it useful to ATLAS?
  5. How good is the submission?

Categories 1-4 are given points out of 5; the last, out of 3, is my "pique" measure of whether the submitter obeyed the instructions given in the email call for projects.

Renewing a Grid certificate

Create a new certificate from: https://ca.cern.ch/ca/user/Request.aspx?template=EE2User

and download it in Safari; it will be added to the keychain. From Keychain Access, export the certificate as a .p12 file. Then, on Linux, create the key and certificate from the .p12 file using:

openssl pkcs12 -in myCertificate.p12 -clcerts -nokeys -out ~/.globus/usercert.pem
openssl pkcs12 -in myCertificate.p12 -nocerts -out ~/.globus/userkey.pem
Then delete the .p12 file.
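The grid tools are strict about the permissions on these files; as far as I recall the usual requirement is:

chmod 444 ~/.globus/usercert.pem
chmod 400 ~/.globus/userkey.pem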

Extracting one event from a bytestream file

AtlCopyBSEvent

AtlCopyBSEvent -e 345722018 -o event345722018.data data18_13TeV.00357750.physics_Main.daq.RAW._lb0086._SFO-4._0006.data

Usage:

usage: AtlCopyBSEvent [-d --deflate] -e [--event] <eventNumbers> [-r, --run <runnumber>] [-l, --listevents] [-t --checkevents] -o, --out outputfile inputfiles....
or using TAG collections: 
usage: AtlCopyBSEvent [-d --deflate] -s, --src <input collection name> <input collection type> [-x --srcconnect <input database connection string>] [-q --query <predicate string>] [-c --catalog <file catalogs>] -o, --out outputfile
eventNumbers is a comma-separated list of events
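So, for example, extracting two events from the same run in one pass looks like this (the second event number is only illustrative):

AtlCopyBSEvent -e 345722018,345722019 -r 357750 -o twoEvents.data data18_13TeV.00357750.physics_Main.daq.RAW._lb0086._SFO-4._0006.data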

AtlListBSEvents

usage: AtlListBSEvents [-s, --showsize] [-c, --checkevents] [-l, --listevents] [-m, --maxevents] files ...
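e.g. to list the events in a file together with their sizes:

AtlListBSEvents -l -s data18_13TeV.00357750.physics_Main.daq.RAW._lb0086._SFO-4._0006.data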

AtlFindBSEvent

usage: AtlFindBSEvent -e [--event] [-r, --run ] [-c, --checkevents] [-l, --listevents] files ...
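e.g. to locate the same event as above within a file:

AtlFindBSEvent -e 345722018 -r 357750 data18_13TeV.00357750.physics_Main.daq.RAW._lb0086._SFO-4._0006.data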

xAODDigest.py

xAODDigest.py myAOD.pool.root

Topic attachments
Attachment                            Size     Date        Who       Comment
BrlModuleSvcExtractionImage_21_9.png  135.2 K  2022-03-04  ShaunRoe
Fork.png                              107.0 K  2021-11-30  ShaunRoe
InnerCoolingFixedWeightBit.png         80.6 K  2022-12-05  ShaunRoe
OuterBarrelFlatPP0Rough.jpg           111.0 K  2021-09-01  ShaunRoe  OuterBarrel Flat pp0
OuterBarrelInclinedPP0Schematic.jpg    89.7 K  2021-09-01  ShaunRoe  Outer Barrel Inclined PP0
OuterPixelBarrel.jpg                   33.8 K  2021-09-01  ShaunRoe  Outer Pixel Barrel
PP0GeoModelXml.png                    265.9 K  2021-10-21  ShaunRoe  PP0 recreated in GeoModelXml
PP0Only.jpg                           156.2 K  2021-09-03  ShaunRoe  PhysVols with 'brl' in the element name
PP0_gmx_pasteback.png                 121.8 K  2021-11-09  ShaunRoe  Pasting back the PP0 from gdml to gmx
PP0_gmx_pasteback2.png                 85.5 K  2021-11-10  ShaunRoe  Pasting PP0 back from gdml to gmx
PP0_gmx_recoloured_ajusted.png         81.9 K  2021-11-10  ShaunRoe  Recoloured and radius-adjusted PP0 in gmx
ScrewyBeamPipeView.png                191.0 K  2021-11-25  ShaunRoe  Something wrong in beampipe view
gdml_only.png                         108.0 K  2021-11-10  ShaunRoe  gdml in 21.9, showing PP0
google902c7b7aa230503e.html             0.1 K  2010-09-04  ShaunRoe
image_2021_10_15T12_52_27_404Z.png    191.4 K  2021-10-15  ShaunRoe  PP0 elements only