DESY 2019/20/21 test beams

DESY machine status

Useful for remote monitoring. Relevant for us: DESY II and LINAC II.


In February and March 2019 (TB21) we used the

  • this one has a broken channel 3
  • firmware 0x1e000014

In June and July 2019 (TB21) we used

  • with this one we saw time jumps in APX but not in TPX3 (also after swapping their HDMI cables/ports) in June
  • the jumps have not been observed in July
  • firmware 0x1e000014

In September 2019 (TB21, telescope DATURA) we used the

  • firmware 0x1e000024

In December 2019 (TB24, telescope AZALEA) we used the

  • AIDA TLU which is "integrated in TB24"
  • firmware 0x1e000024

In February 2020 (TB22) we used the

  • AIDA TLU "DURANTA" again
  • firmware 0x1e000024

In June 2020 (TB24) Simon and Lennart used the

  • AIDA TLU from TB21 (no label)
  • firmware 0x1e000024 and 0x1e000025 (with David's bug fixes)

In July 2020 (TB24) we used the

  • AIDA TLU from TB21 (no label)
  • firmware 0x1e000025 (with David's bug fixes)

In August 2020 (TB21) we used the

  • firmware 0x1e000025 (with David's bug fixes)

Network setup (February to September 2019):

local network DATURA: 192.168.21

For ssh, use 192.168.21.X where X is:

  • .2 for the NI crate Windows computer for Mimosa telescope and DUT motion stage control: can be accessed through Remmina via the PC in the hut.

  • .3 for the PC in the Hut: used for remote control.

  • .113 for Caribou Zynq FPGA board: Caribou producer. After preparing a fresh SD card, this IP address needs to be set in /etc/network/interfaces.

  • .115 for teleuser@fhleladdaq: SPIDR PC for the Timepix3 plane. The IP of this PC cannot be configured by the user (not even with root access); contact Ingo Martens <>. Alternatively, boot Ubuntu from a live USB, mount the hard drive and change it there, then reboot (see ELOG).

  • .204 for Keithley 2450 HV sourcemeter
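For example, to log into the SPIDR PC from within the DATURA network (user name and address from the list above):

```shell
# SSH to the Timepix3 SPIDR PC (fhleladdaq) in the 192.168.21 subnet
ssh teleuser@192.168.21.115
```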

The TLU is in a different subnet:

Network setup (December 2019):

The ELAD PC has two network connections. Port "I" has to be connected to the internal DESY network (use the switch below the runcontrol PC) and port "D" to the external network for booting.

local network AZALEA: 192.168.24. Everything else is as in previous beam periods (see above). The IP addresses had to be updated in all the config files etc. [replace 21 by 22 or 24, depending on the TB zone]. For changing the IP of teleuser@fhleladdaq, contact Ingo Martens <>.

On the Caribou SD card, the IP address needs to be updated in /etc/network/interfaces:

auto eth0
iface eth0 inet static
address
#up route add default gw (optional if a gateway is set; is it "up" or "ip"??)

The lower block starting with
# Ethernet/RNDIS gadget (g_ether)
# ... or on host side, usbnet and random hwaddr
iface usb0 inet static
can be commented out completely.
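A complete static-IP block for the Caribou board could look as follows; the .113 host address and the 192.168.24 (AZALEA) subnet are taken from the lists above, while the netmask and gateway address are assumptions:

```
# /etc/network/interfaces on the Caribou SD card (sketch)
auto eth0
iface eth0 inet static
    address 192.168.24.113   # .113 = Caribou board, 192.168.24 = AZALEA subnet
    netmask 255.255.255.0    # assumed /24 netmask
#    up route add default gw 192.168.24.1   # optional, only if a gateway is needed (assumed address)
```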

Starting the runcontrol

  • on fhlrcdatura (runcontrol PC) go to ~/CLICdp and "source"
  • then go to "startup_scripts" and run the desired script, e.g. "./06_aida_tlu_telescope_spidr_caribou_TriggerIDSync"
  • then start all the producers, see below
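On the runcontrol PC, the three steps above might look like this in a terminal; the name of the file to source is not recorded on this page, so setup.sh below is a placeholder:

```shell
# on fhlrcdatura (runcontrol PC)
cd ~/CLICdp
source setup.sh   # placeholder: the environment script sourced in ~/CLICdp
cd startup_scripts
./06_aida_tlu_telescope_spidr_caribou_TriggerIDSync
# then start all the producers (see below)
```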

Starting producers

Analysis scripts

  • Corryvreckan setup on RC PC: ~/CLICdp/corryvreckan. Analysis scripts: ~/CLICdp/testbeam-analysis/macros/DESY_2019-02

Remote login to terminal PC in control room

  • Make sure the x11vnc server is running on the PC (script). (The PC name is fhltb21 / fhltb22 for the other areas.)
  • Tunnel setup on your local machine: ssh -L (this routes port 5900 on the TB machine to local port 5901).
  • Start a VNC viewer on your local machine, connecting it to local port 5901, e.g. on Mac: open Finder - Connect to Server - vnc://localhost:5901. The password is the same as the standard teleuser password during the December 2019 test beam (Kanzler Kohl...).
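A concrete tunnel invocation might look like the following; the host name (fhltb21, from the note above) and the port numbers are assumptions based on the description, so verify them for your area:

```shell
# forward VNC port 5900 on the test-beam PC to local port 5901
ssh -L 5901:localhost:5900 teleuser@fhltb21
# then connect a VNC viewer to vnc://localhost:5901
```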

Mechanical setup

  • Rotation stage is fixed with intermediate distance plate at moving stage arm (need 6xM4, 12 mm). DUT holder is fixed with 3xM4, 10 mm to rotation stage.
  • DESY contact for telescope mechanics (stage and distance plate): Jan Dreyling-Eschweiler.
  • Base plate for DUT moving stage: 6 x 2 holes with 150 mm distance.

Alignment recipe

  • For CLICpix2 and moving stage setup as in June 2019 testbeam:
  • Take short test run (typically 2 ms shutter, 5.4 GeV)
  • Align DUT to have 1d residuals at 0,0
  • Move x-stage 511 according to the offset shown in the 2d x-correlation plot: a shift to the left side of the DUT axis corresponds to too far downstream right --> move to downstream left (lower values in PI Mikro Move for axis M-511.DD).
  • Move y-stage 521 according to the offset shown in the 2d y-correlation plot: a shift to the left side of the DUT axis corresponds to too far down --> move up (lower values in PI Mikro Move for axis M-521.DD).

Backup/Storage of Data

Data are temporarily stored on user eos during the test beam, using the command "rsync -rvhpt clicdp_2019XX"

  • Data from February 2019: /eos/user/t/tvanat/clicdp_201902
  • Data from March 2019: /eos/user/s/simonspa/clicdp_201903
  • Data from June 2019: /eos/user/j/jekroege/clicdp_201906
  • Data from July 2019: /eos/user/j/jekroege/clicdp_201907
  • Data from December 2019: /eos/user/d/dannheim/clicdp_201912
  • Data from February 2020: /eos/user/d/dannheim/clicdp_202002
  • Data from June 2020: /eos/user/j/jekroege/clicdp_202006_tlu
  • Data from July 2020: /eos/user/j/jekroege/clicdp_202007
  • Data from August 2020: /eos/user/j/jekroege/clicdp_202008
  • Data from May/June 2021: /eos/project/e/ep-rdet/WG1-Silicon-detectors/WG1.4-Characterization-Simulation/test-beam-data/clicdp_202105

Afterwards we copy the data to the grid storage element (see below):

  • /ilc/prod/clic/clic_silicon/testbeam/DESY_TB_Month_20XX (directly available from lxplus in /eos/experiment/clicdp/grid/ilc/prod/clic/clic_silicon/testbeam/DESY_TB_Month_20XX).
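A full rsync invocation matching the command fragment above might be the following; the local source directory and the EOS destination are examples only, so adjust the user and beam period:

```shell
# copy the July 2019 data from the runcontrol PC to the user EOS space
rsync -rvhpt clicdp_201907/ lxplus.cern.ch:/eos/user/j/jekroege/clicdp_201907/
```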

How to copy files to a grid storage element like CERN-DST-EOS:

A valid grid certificate needs to be available. This can be obtained by following the instructions here:

After having obtained a grid certificate and registered in the VOMS server and the "clic_silicon" group in iLCDirac, data can be copied to a grid storage element as follows. Note: for now this only works when the data are stored on somebody's /eos/user/y/yourname space; later we should get it to run with Docker containers on the runcontrol PC.

  • Log into lxplus and do:
    • source /cvmfs/ (Note: make sure you're not sourcing something like ~/software/corryvreckan/etc/ in your ~/.bashrc or ~/.bash_profile on login, this will cause a conflict)
    • dirac-proxy-init -g clic_silicon (enter the passwd you used when setting this up!)
  • Then use the following commands to synchronize the local test beam directory to the grid space:
    • dirac-dms-filecatalog-cli
    • create new directory for corresponding beam period: mkdir /ilc/prod/clic/clic_silicon/testbeam/DESY_TB_Month_20XX
    • type exit
    • execute dirac-dms-directory-sync LOCAL_PATH_TO_TB_DATA /ilc/prod/clic/clic_silicon/testbeam/DESY_TB_Month_20XX CERN-DST-EOS -j16
  • dirac-dms-directory-sync can be parallelized by using the flag -j same as in make.
  • Files cannot be overwritten. If a file changed since the last sync, it has to be removed manually, otherwise the synchronization will not complete. This happens mostly to the run logfile, e.g.:
    • dirac-dms-remove-files /ilc/prod/clic/clic_silicon/testbeam/CERN_TB_Month_20XX/data/global_log.txt
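Putting the sync steps above together, a complete lxplus session might look like this; the /cvmfs setup script path is truncated on this page and is left as a placeholder, and the interactive catalog commands are shown as comments:

```shell
# set up iLCDirac (placeholder path, see the truncated line above)
source /cvmfs/...
dirac-proxy-init -g clic_silicon      # enter your grid-certificate password

# create the target directory in the file catalog
dirac-dms-filecatalog-cli
#   mkdir /ilc/prod/clic/clic_silicon/testbeam/DESY_TB_Month_20XX
#   exit

# synchronize the local data to the grid storage element, 16 transfers in parallel
dirac-dms-directory-sync LOCAL_PATH_TO_TB_DATA \
    /ilc/prod/clic/clic_silicon/testbeam/DESY_TB_Month_20XX CERN-DST-EOS -j16
```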

  • Files (LFNs) on CERN-DST-EOS correspond to files under /eos/experiment/clicdp/grid + LFN, so that they can be conveniently read
  • Another feature that is sometimes handy is dirac-dms-filecatalog-cli. This opens an interactive shell in which you can use
    • cd /ilc/prod/clic/clic_silicon/
    • ls, mkdir, rmdir, ...

  • For removing files, create a file which lists the files you want to delete. To do so run:
    • dirac-dms-find-lfns Path=/ilc/path/to/the/folder > fileContainingLFNs (/ilc/path/to/the/folder contains all files which will be deleted)
  • Then run:
    • dirac-dms-remove-files fileContainingLFNs
    • or dirac-dms-create-removal-request All fileContainingLFNs
  • The latter will create requests that the RequestManagementSystem treats asynchronously for you, so that the removal of many files does not block your system for a long time.

How to set up Corryvreckan on your local machine:

Before setting up Corryvreckan itself, three additional software frameworks have to be installed. The following section describes how to install the necessary components and which cmake flags to set.

  • Clone peary ( and follow the installation instructions given in the manual. Set the cmake flags
  • Additionally, turn on all cmake flags for the devices you want to use. For instance, if a CLICTD analysis is planned set BUILD_CLICTD=ON

  • Clone Simon's EUDAQ fork ( and switch to the caribou branch. Install EUDAQ with the following cmake flags:
    • EUDAQ_BUILD_EXECUTABLE=OFF (this should be ON if EUDAQ is also used for test-beam)
    • EUDAQ_BUILD_GUI=OFF (this should be ON if EUDAQ is also used for test-beam)
  • Moreover, the PATH to SPIDR and peary has to be exported:
    • export SPIDR_INCLUDE_DIR=path/to/your/spidr/installation/software/SpidrTpx3Lib
    • export SPIDR_LIB=path/to/your/spidr/installation/software/Release/
    • export Peary_DIR=path/to/your/peary/installation/share/cmake/Modules
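Combining the exports and flags above, a build sketch could look like this; the fork's URL is not given on this page, and the paths are the placeholders from the list above:

```shell
# inside the cloned EUDAQ fork, on the caribou branch
export SPIDR_INCLUDE_DIR=path/to/your/spidr/installation/software/SpidrTpx3Lib
export SPIDR_LIB=path/to/your/spidr/installation/software/Release/
export Peary_DIR=path/to/your/peary/installation/share/cmake/Modules

mkdir -p build && cd build
cmake -DEUDAQ_BUILD_EXECUTABLE=OFF -DEUDAQ_BUILD_GUI=OFF ..
make -j4 install
```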

For setting up the software on the runcontrol PC additional dependencies have to be installed: doxygen, libreadline-dev and eigen. To install them it's necessary to switch to teleadm and run:

  • sudo apt-get install -y doxygen
  • sudo apt-get install -y libreadline-dev
  • sudo apt-get install -y libeigen3-dev

How to setup Corryvreckan on LXPLUS:

For step-by-step instructions see: Please report any further problems to Jens.
Topic revision: r45 - 2021-08-12 - DominikDannheim