Online Pixel Readout and Monitoring

Below is information and instructions for running the GNAM-based pixel code.

SR1 (tdaq5)

Here are instructions for setting up and compiling a working area on the SR1 pixel computers using tdaq5.

Checking out the code:

Everything should be installed and run from the ATLAS Pixel computers in SR1. You must have an online account, so contact Karolos if you need to get an account.

Log in to the online computer:
ssh -Y atlpix01.cern.ch (this will actually take you to analysis2)

Get a kerberos ticket:
kinit

Get the code (if you don't have permission to get the code, you will need to ask Karolos to give it to you):

git clone https://${USER}@git.cern.ch/reps/atlaspixeldaq

Compilation expects everything to be in ~/daq/:
ln -s `pwd`/atlaspixeldaq ~/daq
cd ~/daq

Checkout the tdaq5 branch (includes the Gnam IBL updates):
git checkout feature/migrationToTdaq5

Check that you are on the right branch:
git branch
The output should look something like this:

* feature/migrationToTdaq5
  master

For more information on using git, see the ATLAS pixel git tutorial.

Compiling:

You must be on one of the scl6 computers with tdaq5. I have tried everything on analysis9, but Karolos tells me that analysis6, analysis7, and analysis8 should also work.

ssh -Y analysis9

cd ~/daq/

make -j 1

Note: A single thread (-j 1) is specified because the default multi-threaded compilation fails most of the time (though not all of the time) due to a race condition. Unfortunately, compiling this way is slow...
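If the slow single-threaded build becomes painful, one possible workaround (an assumption on my part, not something the pixel build system is documented to support) is to attempt a parallel build and fall back to -j 1 only when the race condition strikes, i.e. `make -j4 || make -j 1`. The idiom is sketched below with stand-in functions, since the real make targets only exist in the SR1 environment:

```shell
# Self-contained sketch of the fall-back idiom `make -j4 || make -j 1`.
# The two functions are stand-ins: in real use they would be the make calls.
parallel_build() { return 1; }       # plays the racy `make -j4` that fails
serial_build()   { echo "built"; }   # plays the reliable `make -j 1`

parallel_build || serial_build       # prints "built"
```

This only re-runs the slow serial build when the fast one has already failed, so the common case stays fast.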

Troubleshooting:

If you get an error like:

install: cannot create regular file `/mnt/pixel_mnt/home/username/atlaspixeldaq/Applications/PixRCD/PixRCD-00-01-00/installed/share/data/genconfig/java/PixRCDConfiguration/__AnyObject__.java': File exists
Then try manually removing that file and compiling again.

SR1 (before 2014 July 1)

Below are instructions for setting up a working area on the SR1 pixel computers and then "running" on a ROD. Note, however, that you will probably not be able to run, because that also requires the system in SR1 to be configured to allow it (a ROD must be configured to send simulated data, etc.). Running has not worked since Dec. 2013. The instructions for setting up and compiling the working area, on the other hand, should still work.

Checking out the code:

Everything should be installed and run from the ATLAS Pixel computers in SR1. You must have an online account, so contact Karolos if you need to get an account.

Log in to the online computer:
ssh -Y atlpix01.cern.ch (this will actually take you to analysis2)

Get a kerberos ticket:
kinit

Get the code (if you don't have permission to get the code, you will need to ask Karolos to give it to you):

git clone https://${USER}@git.cern.ch/reps/atlaspixeldaq

Compilation expects everything to be in ~/daq/:
ln -s `pwd`/atlaspixeldaq ~/daq
cd ~/daq

Checkout the branch with the Gnam IBL updates:
git checkout feature/pixRCDMonitoring

Check that you are on the right branch:
git branch
The output should look something like this:

* feature/pixRCDMonitoring
  master

For more information on using git, see the ATLAS pixel git tutorial.

Old instructions for checking out the code (DEPRECATED, don't do it this way)

mkdir daq_120913 (this name can be whatever you want, but it's good to differentiate in case you want more than one version of code checked out)

ln -s daq_120913 daq (now link this to "daq" to have the default directory structure)

cd daq

svn ls $SVNROOT (SVNROOT = svn+ssh://svn.cern.ch/reps/atlaspixeldaq)

svn co $SVNROOT/trunk/VmeInterface

svn co $SVNROOT/trunk/RodDaq

svn co $SVNROOT/trunk/Applications

svn co $SVNROOT/trunk/QTaddons

Compiling:

cd ~/daq/

make pixlib

Old instructions for compiling (DEPRECATED, don't do it this way):

Make sure you are on analysis4 and then:

cd ~/daq/Applications/Pixel/Scripts

./Compile.sh

cd ~/daq/Applications/Pixel/PixRCD/PixRCD-00-01-00/PixRCDConfiguration/SR1

svn update

cd ~/daq/Applications/Pixel/Scripts

./FixXml.sh

Troubleshooting:

  • If compilation fails because include files cannot be found (it looks in Applications/Pixel/PixLib for headers that are actually in Applications/PixLib), the cause is a directory ~/daq/Applications/Pixel/ that you should not have if you got the code with git. A login script checks for ~/daq/Applications/Pixel: if it is not there, the include path is correctly set to ~/daq/Applications/, but if it is there, the path is incorrectly set to ~/daq/Applications/Pixel/. The solution is to remove that directory (~/daq/Applications/Pixel/) and then source the setup script /daq/slc5/zzzz_daq.sh again, or log out and log in again.

  • If you get an error that it cannot find -lCmdPattern, then do the following and try make pixlib again:

    make -C ~/daq/RodDaq/IblUtils/HostCommandPattern/

    cd ~/daq/Applications/PixLib

    ln -s ../../RodDaq/IblUtils/HostCommandPattern/lib/libCmdPattern.so

    This happens because libCmdPattern was in transition in this release of the code, and the pixel TDAQ experts have not yet finalized the new release in which it goes away.
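The login-script check described in the first troubleshooting item above can be sketched as follows. This is an illustration, not the actual contents of /daq/slc5/zzzz_daq.sh; the variable name PIX_INC is made up, and only the directory test matches the described behavior:

```shell
# Illustration of the include-path logic: if a stale Applications/Pixel/
# directory exists, the include path is set one level too deep.
base=$(mktemp -d)
mkdir -p "$base/Applications"            # a clean git checkout: no Pixel/ dir

if [ -d "$base/Applications/Pixel" ]; then
    PIX_INC="$base/Applications/Pixel/"  # stale dir present: wrong path
else
    PIX_INC="$base/Applications/"        # correct include path
fi

echo "$PIX_INC"                          # ends in .../Applications/
rm -rf "$base"
```

With the stale directory removed, the else branch is taken and the headers in Applications/PixLib are found.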

Running on a ROD:

This requires a ROD in SR1 to be configured to send internally simulated data as well as other configurations to be set properly. This hasn't actually worked since December 2013, but the instructions may still be useful and maybe it will work again at some point.

Starting the GUIs:

  1. Check which ipc partition infrastructures already exist:
    ipc_ls -P
    You should get something like:
    initial
            PixelDD_moretti
            PixelInfr
            PixelInfr_karolos
            PixelInfr_moretti 
  2. Make sure you see PixelInfr. If it is not there, contact an expert to start it.
  3. Check if the base partition infrastructure for your user is running (PixelInfr_jhaley, where jhaley is replaced by your username)
    • If it is not running, the following command will create your PixelInfr partition and open its GUI: start_infr
    • If PixelInfr_jhaley is already running, then do the following to only open its GUI: Igui_start -p PixelInfr_jhaley
  4. In your PixelInfr GUI: BOOT (this takes some time), INITIALIZE, CONFIG, START
  5. Start a DD (Data Driver) partition infrastructure and GUI:
    start_dd
  6. In the DD GUI: BOOT, INITIALIZE, CONFIG (this takes some time), START

Stopping the GUIs:

  1. In your DD GUI: STOP, UNCONFIG, TERMINATE, SHUTDOWN. Then select: File > Exit.
  2. In your PixelInfr GUI: STOP, UNCONFIG, TERMINATE, SHUTDOWN. Then select: File > Exit.
When exiting, you will be asked whether you also want to shut down that partition infrastructure. If you stopped the GUI without shutting down the partition and later want to kill it, you can do so from the command line with the pmg_kill_partition command: pmg_kill_partition -p PixelInfr_jhaley

Offline (lxplus5)

Below are instructions for setting up and compiling an offline working area. The current (pre-2014) pixel code was written for 32-bit SLC5 machines, so you must use an SLC5 lxplus node (lxplus5).

Getting the code:

Everything should be installed and run SLC5 nodes, accessible via lxplus5.

Log on to an SLC5 node:
ssh -Y lxplus5.cern.ch

Get a kerberos ticket:
kinit

Get the code (if you don't have permission to get the code, you will need to ask Karolos to give it to you):

git clone https://${USER}@git.cern.ch/reps/atlaspixeldaq

Compilation expects everything to be in ~/daq/:
ln -s `pwd`/atlaspixeldaq ~/daq
cd ~/daq

Checkout the branch with the Gnam IBL updates:
git checkout feature/pixRCDMonitoring

Check that you are on the right branch:
git branch
The output should look something like this:

* feature/pixRCDMonitoring
  master

For more information on using git, see the ATLAS pixel git tutorial.

Setting up the environment on lxplus5

Before compiling, you must set up the environment so that the correct headers and libraries are found. To accomplish this, I have made a modified "zzzz_daq.sh" script, based on the one used to set up the environment on the SR1 machines but adapted for lxplus5 machines. Get a copy of this script from my public area:
cd ~/daq/
cp ~jhaley/public/pixel/zzzz_daq-x86_64-slc5-m32_2014June02.sh .
ln -s zzzz_daq-x86_64-slc5-m32_2014June02.sh setup_env.sh
This was made for 64-bit SLC5 nodes, but it sets up the build in 32-bit mode (required by the pixel code).

Now source that file to setup the environment:
source setup_env.sh
The output should look like:

/afs/cern.ch/user/j/jhaley/daq
source /afs/cern.ch/atlas/project/tdaq/inst/CMT/v1r22/mgr/setup.sh
source /afs/cern.ch/atlas/project/tdaq/inst/tdaq/tdaq-04-00-01/installed/setup.sh
Setting up TDAQ Common SW release "tdaq-common-01-18-04"
Setting up DQM Common SW release "dqm-common-00-18-03"
Setting up DAQ SW release "tdaq-04-00-01"
source /afs/cern.ch/user/j/jhaley/daq/Applications/Scripts/SetPartNames.sh
Error: RTEMS is not installed
rems gcc not found

The errors appear because some things left in the script from SR1 are not available on lxplus5 nodes. This should perhaps be cleaned up, but I have left it for now.

Compiling

Now you can try compiling:
make pixlib

This will probably fail with an error that it cannot find -lCmdPattern; if so, do the following and then try make pixlib again:
make -C ~/daq/RodDaq/IblUtils/HostCommandPattern/ (This ends with an error while compiling some "test," but the library should have been made.)
cd ~/daq/Applications/PixLib
ln -s ../../RodDaq/IblUtils/HostCommandPattern/lib/libCmdPattern.so
cd ~/daq
make pixlib
This happens because libCmdPattern was in transition in this release of the code, and the pixel TDAQ experts have not yet finalized the new release in which it goes away.

Generally useful stuff:

Facts and figures comparing 3-layer Pixel and IBL

3-Layer Pixel System (PIXROD):

Layer/Disk    Staves/Sectors  DAQ Modules  Optoboards  ROD/BOC  ROBin  ROS
Layer 0             22             286          44         44      -     -
Layer 1             38             494          76         38      -     -
Layer 2             52             676         104         26      -     -
Disk                48             288          96         24      -     -
Total              160            1744         272        132     44+   12

IBL System (IBLROD):

IBL                 14             224          28         14     19      5

Grand TOTAL        174            1968         300        132     63+    17

Most useful commands:

If the code is already installed (see above), these are the most useful commands for running:

ssh -Y atlpix01.cern.ch
kinit
ssh -Y analysis4

ipc_ls -P

start_infr

Igui_start -p PixelInfr_jhaley

start_dd

Some notes on commands:

  • The tag manager is a GUI that sets some parameters for the DAQ (TODO: understand this). Most fields have only one option, so it is clear what they should be, but one near the end needed to be "C0_S20" because the ROD was configured to produce simulated data for that module (?). (This was a temporary setup that will likely change depending on the situation, so more input from the experts is probably needed.)
TagManager

Commands for ipc partitions:

  • List all current partitions:
ipc_ls -P
  • List info for a given partition:
ipc_ls -p PixelInfr_jhaley -l
  • Kill a given partition:
pmg_kill_partition -p PixelInfr_jhaley

The code

The online pixel code consists of plugins for the GNAM framework. You can find some general information on the GNAM framework here:

The GNAM plugins for the pixel system are located in Applications/Pixel/PixRCD/PixRCD-00-01-00/PixRCDMonitoring/. There are three main components:
  1. PixelDecode.cxx: Decodes pixel and IBL ROD fragments and stores the information in a PixelEvent object.
  2. PixelEvent.cxx/.h: Holds the decoded information for a given event.
  3. PixelHisto.cxx: Creates and fills histograms from the information in a PixelEvent object, which are published to the Online Histogram Service (OHS).

Data format

-- Main.JosephHaley - 01 Jun 2014

Topic revision: r8 - 2015-03-19 - JosephHaley
 