Analysing Outer Tracker Threshold Scans


The Outer Tracker (OT) threshold layer scan is used to measure gain variations in the OT. Threshold scans are performed during LHC operation, using the LHC beam. Gain variations are monitored by studying the hit efficiency as a function of the OT electronics amplifier threshold. This TWiki page contains instructions on how to analyse the data from such a threshold scan.

The idea of the procedure is as follows: For every layer of the OT (12 in total, 4 layers per OT station) the amplifier threshold of the OT read-out electronics is changed in 10 steps. The steps are defined as

[800 mV, 1000 mV, 1200 mV, 1250 mV, 1300 mV, 1350 mV, 1400 mV, 1450 mV, 1600 mV, 1800 mV]

Note that threshold scans recorded before June 2011 did not contain the last two thresholds.

While the threshold in the layer under consideration is changed, all other layers are operated at the nominal threshold value of 800 mV to reconstruct charged particle tracks. The hit efficiency is defined as the number of found hits divided by the total number of predicted hits, for tracks passing within 1.25 mm of the wire. For a given threshold and the corresponding layer under study, the hit efficiency is measured in 85 mm wide bins of the horizontal coordinate x and 56 mm high bins of the vertical coordinate y. The bin size in x corresponds to one quarter of the width of an OT module.
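The binned efficiency measurement described above can be sketched as follows. This is a minimal illustration, not the actual OTHitEfficiencyMonitor code; the hit-position lists and the `efficiency_map` helper are hypothetical:

```python
# Minimal sketch of the binned hit-efficiency measurement.
# Bin sizes follow the text: 85 mm in x (a quarter module width), 56 mm in y.
X_BIN = 85.0  # mm
Y_BIN = 56.0  # mm

def efficiency_map(predicted, found):
    """predicted/found: lists of (x, y) hit positions in mm.
    Returns {(ix, iy): efficiency} per (x, y) bin."""
    n_pred, n_found = {}, {}
    for x, y in predicted:
        key = (int(x // X_BIN), int(y // Y_BIN))
        n_pred[key] = n_pred.get(key, 0) + 1
    for x, y in found:
        key = (int(x // X_BIN), int(y // Y_BIN))
        n_found[key] = n_found.get(key, 0) + 1
    # Efficiency is only defined in bins with at least one predicted hit.
    return {k: n_found.get(k, 0) / float(n) for k, n in n_pred.items()}
```

For example, two predicted hits at x < 85 mm with one found hit give an efficiency of 0.5 in bin (0, 0).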

Process the raw data

General info

The data taken during a threshold scan is written to disk in the .RAW format. Typically we select 150 000 events per threshold. In total we have 10 thresholds times 12 layers, which comes down to 120 × 150 000 events ≈ 18 M events. More information about the operational procedure of a threshold layer scan can be found here:

A Bender python script based on the OTHitEfficiencyMonitor in the Brunel Monitoring is used to calculate the hit efficiency. The script uses Brunel for track reconstruction and needs an additional package containing a .h and an .xml file (attached to this page) to be able to use some C++ classes (for instance the OTLiteTime class) from the python script. A dictionary relating the threshold settings to the corresponding layers is defined in the scripts. The step number which specifies the threshold setting for a particular layer is a variable in the ODIN bank called calibrationStep().
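The step-to-settings lookup can be sketched as below. The threshold list and the layer numbering follow the definitions on this page; the `calibration_step` argument stands in for the ODIN `calibrationStep()` value, and `decode_step` is a hypothetical helper, not part of the actual Bender script:

```python
# Sketch: decode a calibration step (1..120) into (layer, threshold).
# Thresholds as listed above; scans before June 2011 had only the first 8.
THRESHOLDS = [800, 1000, 1200, 1250, 1300, 1350, 1400, 1450, 1600, 1800]  # mV

def decode_step(calibration_step, n_thr=10):
    """Map an ODIN calibrationStep (1-based) to (layer, threshold_mV)."""
    layer = (calibration_step - 1) // n_thr            # layers 0..11
    threshold = THRESHOLDS[(calibration_step - 1) % n_thr]
    return layer, threshold
```

For instance, step 1 corresponds to layer 0 at 800 mV, and step 120 to layer 11 at 1800 mV.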

Setting up environment

SetupProject Bender --use-grid --build-env
cd ~/cmtuser/Bender_vXrYpZ
getpack Rec/Brunel
cd ~/cmtuser/Bender_vXrYpZ/Rec/Brunel/cmt
cmt make

Additionally, add the package with the .h and .xml file and compile it.

The dictionary in the Bender script containing the steps and corresponding layers

The dictionary which maps every threshold to its corresponding layer is defined as follows in the python script:

mydict = {}
beginstep = 1
endstep = 120
nthresholdperlayer = 10
firstthreshold = 1
layer = -1  # Increases every time the step number modulo nthresholdperlayer equals 1
for i in range(beginstep, endstep + 1):
    if i % nthresholdperlayer == firstthreshold:
        layer += 1
    mydict[i] = layer
Note that from June 2011 onwards ten thresholds were defined instead of eight; for earlier scans, set nthresholdperlayer = 8 and endstep = 96 accordingly.

Database Tags

Typically, the .raw data needs to be processed before a snapshot of the conditions and detector databases is available. Therefore the data type and database tags should be set by hand in the python script. The latest database tags can be found here:

For example for the March 2011 run this means you should set the following Bender properties

DataType   = '2011',
DDDBtag = 'head-20110303' ,
CondDBtag = 'head-20110308',
OutputType = 'None'

In addition one should set


Special t0 Database

The OT readout gate was changed in May 2011. However, when performing a threshold scan, the recipe still uses the old configuration (the same readout gate for all OT stations instead of gates interspaced by 2 ns). This means that when analysing scans taken after 15 May 2011, a dedicated database with module t0's must be used when running the reconstruction:
from Configurables import ( CondDBAccessSvc, CondDB )
AlignmentCondition = CondDBAccessSvc("AlignmentCondition")
AlignmentCondition.ConnectionString = "sqlite_file:2011-05.v2-scan.db/LHCBCOND"

Veto Hlt Errors

The Hlt error filter checks whether there are Hlt2 decisions. Since there are none here, and we are not interested in them, the Hlt error check should be disabled by setting the Brunel property
VetoHltErrorEvents = False


It is always a good idea to test the Bender script on a few events to see whether it actually works. Reasons for this are, for example, new software releases and different database tags. Testing can be done by running the python script locally, or by submitting a single subjob using the Interactive backend of Ganga.

Local testing

For this, the 'job steering' part of the Bender script needs a PFN of one raw file in the threshold scan run (in this case running only 100 events):
files = [
        "DATAFILE='castor:/castor/' SVC='LHCb::MDFSelector'"
        ]
For local testing of the script, do
SetupProject Bender --use-grid
python -i [Bender script]

Interactive backend of Ganga

Once the script runs OK locally, it's a useful test to run it on the Interactive backend of Ganga, to see if the LFN's are read correctly.

There are some important lines in the submission script, since we want to send along our own installation of Bender. Note that when using the Interactive backend of Ganga, it is important not to forget the --use-grid option (since the Interactive backend simply starts a subshell and sets up the projects as specified in the ganga script):

b = Bender(version = 'vXrYpZ')
b.setupProjectOptions = '--use-grid'
b.user_release_area = '/afs/'

Submit Jobs to the Grid

With everything set up properly, and after assuring yourself that the python script works, it is time to submit all the jobs to the Grid using the Dirac backend of Ganga. Keep in mind that Dirac can only handle 100 input files per job. So split all the RAW files into option files of 100 each, and submit the jobs (with 100 subjobs each) to the Grid. The subjobs typically take about 1.5 days to finish.
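The splitting of the RAW file list into groups of 100 per job can be sketched as follows (a minimal illustration; the `chunk` helper is hypothetical and the actual submission script is attached to this page):

```python
def chunk(files, size=100):
    """Split a list of input files into sublists of at most `size` entries,
    one sublist per Grid job (Dirac limit of 100 input files per job)."""
    return [files[i:i + size] for i in range(0, len(files), size)]
```

For example, 250 input files would be split into three jobs with 100, 100, and 50 files.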
SetupProject Ganga
An example of a script to submit the jobs to the Grid is attached to this Twiki.

Merging the ROOT Files

Single Scan Analysis

Comparing Scans

-- DaanVanEijk - 15-Mar-2011

Topic attachments
OTDicts.h (header file, 0.4 K, 2012-03-19, DaanVanEijkExCern)
OTDicts.xml (XML file, 0.2 K, 2012-03-19, DaanVanEijkExCern)
(text file, 15.4 K, 2012-03-20, DaanVanEijkExCern) Bender python script used to calculate the hit efficiency and fill the histos
(text file, 1.2 K, 2012-03-20, DaanVanEijkExCern) Script used to submit the jobs to the Grid
TS_Summary.pdf (109.1 K, 2012-03-19, DaanVanEijkExCern) Summary tables for all TS taken so far (as of March 2012)
TS_Summary.tex (4.1 K, 2012-03-19, DaanVanEijkExCern) TEX file for TS summary tables
nim.pdf (619.5 K, 2012-03-19, DaanVanEijkExCern) Paper entitled "Radiation Tolerance of the Outer Tracker in 2011", submitted to NIM
Topic revision: r15 - 2012-03-20 - DaanVanEijkExCern