SVN

Some information on how to use SVN is available at SvnUsageInfo

Getting started with the package

* To get the package:

 svn co svn+ssh://<username>@svn.cern.ch/reps/atlas-mvanadia/TBReco/trunk ./TBReco

First-time usage:

After downloading, open TBReco/TBReco.h and adjust the settings to your working area. In particular, set the following paths in InitTestBeamSetup():


> INFOLDER = path to place where you have raw ntuples
> OUTFOLDER = path to place where you want to put output root file, "" to avoid storage
> OUTPLOTSFOLDER = path to folder where you want to put the output plots, "" to avoid storage

Other relevant parameters are commented in the class. An example of how to execute is in execReconstruction.sh.

How to get the input ntuples

  • Runbook https://apex.cern.ch/pls/htmldb_atlas/f?p=197:1
  • Runs are stored in pcmmdata.cern.ch (access from lxplus with user:dateuser) in folder /DATA/mmdaq.cern.ch/2013/DESY/APV25
  • Some raw ntuples are stored in /afs/cern.ch/work/m/mvanadia/public/TestBeamMMDesy/raw_ntuples

How to use TBReco

TBReco runs on APV25 input files and produces several output files. The parameters used for the reconstruction must be set in TBReco/TBReco.h. You can run on a specific run with:

root -l 'TBReco.C+("10129")'

or run on several runs together with a command like

root -l 'TBReco.C+("10129,10130,10131")'

or even run on all runs within given intervals using a command like

root -l 'TBReco.C+("10129,10130,10131,10135-10138,10160-10164")'

which will run on runs 10129, 10130, 10131, 10135, 10136, 10137, 10138, 10160, 10161, 10162, 10163 and 10164. If some files in the given intervals are missing, they are simply ignored. BE CAREFUL: if one of the runs, e.g. 10136, has a corresponding file run10136.root in your input folder which is a pedestal file or simply empty, this approach won't work.
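The run-list syntax above can be summarised in a minimal sketch (this is not TBReco code; `expandRunList` is a hypothetical helper shown only to illustrate how a comma/dash list expands into individual run numbers):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Hypothetical helper: expand a TBReco-style run list such as
// "10129,10135-10138" into the individual run numbers.
std::vector<int> expandRunList(const std::string& runs) {
    std::vector<int> out;
    std::stringstream ss(runs);
    std::string token;
    while (std::getline(ss, token, ',')) {          // split on commas
        std::size_t dash = token.find('-');
        if (dash == std::string::npos) {
            out.push_back(std::stoi(token));        // single run
        } else {                                    // inclusive range "a-b"
            int lo = std::stoi(token.substr(0, dash));
            int hi = std::stoi(token.substr(dash + 1));
            for (int r = lo; r <= hi; ++r) out.push_back(r);
        }
    }
    return out;
}
```

For example, "10135-10138" expands to the four runs 10135, 10136, 10137 and 10138.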

TBReco.C is essentially the "main" of the analysis. In the function defined in the file, the input file is read and the information is used to create hits and clusters. In addition, several user-defined analyses, which have access to the fully reconstructed information, can be run in parallel; this is discussed below. The parameters of the reconstruction can be changed in TBReco/TBReco.h.

Reference system

The software uses the following conventions for the reference system

Local RS

In each chamber a local reference system is defined: x is the precision coordinate; y is the second coordinate, for the chambers where the readout of that coordinate is available; z is the axis orthogonal to the strip plane, directed from the strip plane toward the mesh.

Global RS

     x
     ^
     |
     |
     -----> z (direction of the beam)
    /
   /
  y
 

  • the z axis is horizontal and coincides with the beam axis, directed along the beam (if the software is used for cosmic runs this axis is actually vertical, but it doesn't matter). Chambers are considered positively oriented if their readout plane is nearer to the beam origin and their mesh farther from it.
  • the x axis is vertical, for a chamber perpendicular to the beam it corresponds to the direction of the precision coordinate
  • the y axis corresponds to the direction of the second coordinate

The chambers have an angle which (for now) is defined only in the xz plane. The angle is 0 for chambers perpendicular to the beam, and positive for chambers with the top part farther from the beam origin and the bottom part nearer to it.

x    /chamber
^   /
|   /
|  /
| /
|/      for a chamber like this one, the angle is positive
 
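The angle convention above can be sketched in code (this is not TBReco code; the function name and the choice of rotating around the chamber centre are assumptions made for illustration):

```cpp
#include <cmath>
#include <utility>

// Illustrative sketch: map a local precision coordinate x_loc on an
// inclined chamber to global (x, z), for a chamber rotated by angleDeg
// in the xz plane around its centre (x0, z0).  With a positive angle,
// the top of the chamber (x_loc > 0) moves downstream, i.e. to larger z.
std::pair<double,double> localToGlobalXZ(double x_loc, double angleDeg,
                                         double x0, double z0) {
    const double a = angleDeg * std::acos(-1.0) / 180.0;  // degrees -> radians
    return { x0 + x_loc * std::cos(a),    // global x
             z0 + x_loc * std::sin(a) };  // global z
}
```

For a chamber at angle 0 the precision coordinate maps directly onto global x, as stated above.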

Reconstruction parameters in TBReco/TBReco.h

In TBReco/TBReco.h the most relevant parameters used for the reconstruction can be set. The most important settings, which need to be set, are at the very beginning of the file.

General settings

  • TString INFOLDER = "./raw_ntuples/"; -> this is the folder which contains the input files
  • TString OUTFOLDER = "./reco_ntuples/"; -> in this folder a .root file containing most of the output produced by the software will be saved. The file for the run number XYZ will be placed in a subfolder of that directory named XYZ, and will be called runXYZ_tbreco.root
  • TString OUTPLOTSFOLDER = "./plots/"; -> in this folder several control plots are saved. The plots are the same stored in the .root file described above
  • TString SETUPNAME = "DESY"; -> this is a crucial parameter. In the framework there is a Runbook, described below, which sets the geometry of the detector system. The setup of most DESY runs, i.e. 2 Tmm chambers, 4 T chambers and then 2 Tmm chambers, with the correct positions and so on, can be used by setting this to "DESY". Other setups are already implemented; they are described below.

The other parameters can be set below, inside the InitTestBeamSetup() function.

Reconstruction parameters

  • NMAXEVENTS=-1; -> n. events to run on, negative to run on whole input tree
  • RECO2NDCOORD=false; -> perform reconstruction also for 2nd coordinate of the chambers WHERE AVAILABLE (the second coord was not read for most runs @ DESY for example)
  • FITSTRIPTIME=true; -> perform a Fermi-Dirac fit for each strip. This is needed to have a time measurement for the strips, and therefore for the uTPC reconstruction. Turning this on slows down the reconstruction by an order of magnitude, so do it only if needed
  • DO_UTPC_RECO=false; -> in some cases you may want the time measurement for the strips but not the uTPC track reconstruction
  • MIN_ANGLE_FOR_UTPC_RECO=-5.; -> uTPC reconstruction usually makes sense only for chambers which are inclined wrt the beam. You can exclude uTPC reconstruction in chambers with fabs(angle)<MIN_ANGLE_FOR_UTPC_RECO. You can set a negative value to perform the reconstruction in all chambers
  • CREATEHOUGHCLUSTERS=false; -> clusters are usually reconstructed with the neighbouring-strips algorithm described below. A second clustering algorithm, based on the Hough transform, can be run IN PARALLEL to the first one. The neighbouring-strips clustering is the default in any case.
  • MINCHARGEMAX = 0; -> minimum strip charge for strips used for cluster reconstruction, in ADC counts. 0 to use all strips. Values up to ~100 may be used depending on what you want to do
  • MAXCHARGEMAX = 20000; -> maximum strip charge for strips used for cluster reconstruction, in ADC counts. Strips with charge around ~1500 are in saturation; you may want to remove them in some cases
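The charge window implied by MINCHARGEMAX/MAXCHARGEMAX can be pictured as a simple filter (this is an illustrative sketch, not TBReco code; `RawStrip` and `selectStrips` are hypothetical names):

```cpp
#include <vector>

// Illustrative sketch of the strip-charge window: keep only strips whose
// maximum charge falls inside [minQ, maxQ] before clustering.
struct RawStrip { int number; double charge; };

std::vector<RawStrip> selectStrips(const std::vector<RawStrip>& strips,
                                   double minQ, double maxQ) {
    std::vector<RawStrip> out;
    for (const auto& s : strips)
        if (s.charge >= minQ && s.charge <= maxQ)
            out.push_back(s);  // strip passes the charge window
    return out;
}
```

With minQ=100 and maxQ=1500, for example, both low-charge noise strips and saturated strips are dropped.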

Vdrift and time spectrum parameters

  • VDRIFT= 0.047 ; -> Vdrift = 47 um/ns, this should be the correct value for Ar/CO2 93/7
  • TSPECTRUMMIN=0 and TSPECTRUMMAX=400 are the extremes of the time spectrum; these values should be fine for DESY but may need to be adjusted for other setups
  • VDRIFT_X, VDRIFT_Y and VDRIFT_Z are actually not used for now. They may need to be fixed for runs with magnetic field on.

Parameters for cluster reconstruction (with the neighbouring strips algorithm)

  • MINMMClusterSIZE=2 and MAXMMClusterSIZE = 100 -> these are the minimum and maximum number of strips allowed in "good" clusters (i.e. the clusters which will be used for tracking).
  • MAXHOLES=1000 -> this is the maximum number of holes (i.e. strips which did not fire) allowed in "good" clusters (i.e. the clusters which will be used for tracking)
  • NMAXHOLESBETWEENCLUTSERS=1 -> this is the maximum number of CONSECUTIVE holes (i.e. strips which did not fire) allowed when building clusters
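The neighbouring-strips idea with a tolerance on consecutive holes can be sketched like this (illustrative, not the actual TBReco implementation; `clusterStrips` is a hypothetical name):

```cpp
#include <vector>

// Group fired strip numbers, already sorted, into clusters, tolerating
// up to maxHoles consecutive missing strips between neighbours.
std::vector<std::vector<int>> clusterStrips(const std::vector<int>& strips,
                                            int maxHoles) {
    std::vector<std::vector<int>> clusters;
    for (int s : strips) {
        if (!clusters.empty() && s - clusters.back().back() <= maxHoles + 1)
            clusters.back().push_back(s);   // close enough: same cluster
        else
            clusters.push_back({s});        // gap too large: new cluster
    }
    return clusters;
}
```

For example, with one allowed hole, strips {1,2,4,10,11} give two clusters: {1,2,4} (strip 3 is a tolerated hole) and {10,11}.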

Parameters for cluster reconstruction (with the hough transform algorithm)

  • USE_POLAR_HOUGH=false; -> this selects the linear or polar Hough transform for clustering. It must be left false; the polar Hough transform is experimental
  • MINSIZEHOUGHCLUSTER=3 -> minimum number of strips needed to build a cluster with the Hough transform. This shouldn't be less than 3

Just below that, the lattice used for Hough clustering is defined as H_HOUGH_LATTICE, with a binning which is currently under test. Change it only if you know what you're doing!
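To illustrate what a Hough-transform clustering lattice does, here is a toy linear Hough transform (the real H_HOUGH_LATTICE binning in TBReco.h is different; the names and binning below are assumptions made only to show the idea):

```cpp
#include <array>
#include <cmath>
#include <vector>

// Toy linear Hough transform over (z, x) points: each point votes in a
// small slope/intercept lattice and the most-voted bin is returned.
struct HoughPeak { double slope; double intercept; int votes; };

HoughPeak houghPeak(const std::vector<std::array<double,2>>& pts) {
    const int NS = 41, NI = 81;  // slopes in [-2,2], intercepts in [-20,20]
    std::vector<int> acc(NS * NI, 0);
    for (const auto& p : pts)
        for (int is = 0; is < NS; ++is) {
            double m = -2.0 + 4.0 * is / (NS - 1);
            double b = p[1] - m * p[0];   // intercept implied by slope m
            int ib = (int)std::lround((b + 20.0) / 40.0 * (NI - 1));
            if (ib >= 0 && ib < NI) ++acc[is * NI + ib];
        }
    int best = 0;
    for (int i = 1; i < (int)acc.size(); ++i)
        if (acc[i] > acc[best]) best = i;
    return { -2.0 + 4.0 * (best / NI) / double(NS - 1),
             -20.0 + 40.0 * (best % NI) / double(NI - 1),
             acc[best] };
}
```

Strips lying on a common line accumulate votes in the same lattice bin, which is how collinear strips are grouped into a cluster.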

Preliminary alignment

TBReco is able to produce a preliminary alignment, by fitting a Gaussian to the centroid distribution of each chamber and computing the difference of all chambers with respect to the first one in the setup. This makes sense for test-beam studies, where the beam is very focused, and clearly makes no sense for cosmic rays.

This alignment correction is stored in the calibration file

  • CALIBFILE="calib.root";

and also in the normal output file of the software, in a TGraphErrors called alignment_gr, with an entry for each chamber giving its position in the xz plane. This is done if WRITEALIGNMENT is true.

If READALIGNMENT is an empty string (""), no alignment correction is applied when running TBReco. If READALIGNMENT=="this", the software will search the file CALIBFILE for the corrections already calculated for this run and apply them: clearly this can be done only if you're running on the run for the second time and wrote the alignment corrections to the file the first time. You can also set READALIGNMENT="10102", for example, to use the alignment corrections calculated for that particular run.

T0 calibration

If the signal is fitted for each strip and a time measurement is performed, the software can perform a Fermi-Dirac fit to the total distribution of the times measured in each chamber to calculate the chamber-by-chamber T0. If WRITET0CALIB==true, these T0 corrections are stored in the file named by T0CALIBFILE="calib_t0.root". READT0CALIB can be "", "this" or a specific run number to read a correction already calculated, with the same conventions described above for the alignment.

As an alternative, if READT0CALIB="" and you're therefore not reading any T0 correction, you can use GLOBALMINTIME=XXX to apply a common T0 correction to all the chambers.

Debug and event display options

These can be used to produce nice event displays and to understand what's going on.

  • DEBUG_HOUGH_CLUSTERS=false; -> if this is true, an event display is produced for each cluster created with the Hough transform. Double-click on the canvas to proceed
  • DEBUG_TIME_FIT=false; -> if this is true, the Fermi-Dirac time fit is shown for each strip. Double-click on the canvas to proceed
  • DEBUG_CLUSTERS=false; -> if this is true, an event display is produced for each cluster created with the neighbouring strips clustering. Double-click on the canvas to proceed
  • STORE_FIRST_CLUSTERS=-100; -> save the first N event displays for cluster reconstruction as images. If negative, this is not done
  • STORE_FIRST_HOUGH=-100; -> save the first N event displays for Hough cluster reconstruction as images. If negative, this is not done
  • DEBUG_PLOTS_FOR_CHAMBER=""; -> if this is "", plots for all chambers are shown/saved when debug options are turned on, otherwise only for given chamber, e.g. "P3_T1"

Working principles of TBReco

In the input tree, the APV signals are stored. For each strip there is a 27-bin charge-vs-time distribution. The charge of the strip is the content of the highest bin of the distribution, while the time of arrival of the signal can be measured with a Fermi-Dirac distribution fit

TF1("fermiplus","([0]/(1+exp(-(x-[1])/[2])))+[3]",...)


Figure: fit.png
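Written as a plain function, the "fermiplus" shape from the TF1 above is (the parameter interpretation follows the fit described in this section):

```cpp
#include <cmath>

// The "fermiplus" shape: p0 is the plateau height, p1 the half-rise time
// (used as the strip time measurement), p2 the rise width and p3 a
// baseline offset.
double fermiplus(double x, double p0, double p1, double p2, double p3) {
    return p0 / (1.0 + std::exp(-(x - p1) / p2)) + p3;
}
```

At x = p1 the function is exactly halfway up the plateau (p0/2 + p3), which is why p1 serves as the time estimate for the strip.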

TBReco.C reads these distributions and for each strip creates an MMHit object. You can look at TBReco/MMHit.h to see which methods are available.

MMHit inherits from class Position, defined in TBReco/Utils.h, i.e. it has methods such as hit->x(), hit->y(), hit->ex() etc. to get the position and the error in local coordinates. If the corresponding option is turned on, the Fermi-Dirac fit is performed for each strip.

Then, the hits are added to the corresponding chamber. MMChamber is a class defined in TBReco/MMChamber.h (again, look at the header to see which methods are defined). In each event, the strips firing in one chamber are added to the corresponding MMChamber. The vector of all chambers used in a particular setup is V_CHAMBERS, so V_CHAMBERS[0] is for example the pointer to the first MMChamber of the setup.

After adding the strips to the corresponding chambers, clusters are created in each chamber. This is done with the neighbouring-strips clustering algorithm described below and, if the corresponding option is turned on, also with the Hough transform clustering. At this point, in each chamber you have the vector of all strips and the vector of all clusters; they can be accessed with the methods described in the header of the class. Also, from each cluster you can get the vector of the strips included in the cluster. After that, if the corresponding option is turned on, the uTPC track is fitted for each cluster and the TBReco reconstruction is complete.

Using user-defined analyses

User-customised analyses can easily be added to the framework. There is a class called DummyAna which can be used as a template: simply copy the header and the src file, change the name, and fill in the methods which are already there. Then you need to add your analysis to the vector of analyses which are executed. They are very flexible: for example, there is a dummy analysis which fills the output ttree (TreeFillerana), an analysis which produces the event displays (ClusterDisplayAna), the class which produces the alignment and t0 calibration (TimeCalibrationAna) and so on; look at them to get an idea of how to implement your own analysis.

All these analyses have the same structure, which must be used:

  • an Init() method, where for example the histograms you want to fill are defined, and maybe also an output file where you want to save them
  • an Execute_event() method, where for example the histograms are filled. THIS METHOD IS CALLED EVERY EVENT BY TBRECO, just after the reconstruction. FROM INSIDE THIS METHOD YOU HAVE ACCESS TO THE VECTOR _chambers WHICH CONTAINS ALL THE INFORMATION YOU MAY NEED. For example, from each chamber you can get the vector of reconstructed clusters, from each cluster the vector of strips, and from each strip the charge, and fill a histogram with the strip charges.
  • a Finalize() method, which is run at the end of the event loop. Here for example you can fit your histograms and save them in your own output file.
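The structure above can be sketched as follows (this is an illustrative stand-alone sketch: MyStripChargeAna and the Strip/Cluster/Chamber structs stand in for the framework classes DummyAna, MMHit, MMCluster and MMChamber, and are not the real TBReco API):

```cpp
#include <vector>

// Minimal stand-ins for the framework data model.
struct Strip   { double charge; };
struct Cluster { std::vector<Strip> strips; };
struct Chamber { std::vector<Cluster> clusters; };

class MyStripChargeAna {
public:
    void Init() { sum_ = 0; n_ = 0; }            // book histograms/output here
    void Execute_event(const std::vector<Chamber>& chambers) {
        // Called once per event, after reconstruction: walk chambers ->
        // clusters -> strips and accumulate the strip charges.
        for (const auto& ch : chambers)
            for (const auto& cl : ch.clusters)
                for (const auto& st : cl.strips) { sum_ += st.charge; ++n_; }
    }
    void Finalize() {}                           // fit/save histograms here
    double meanCharge() const { return n_ ? sum_ / n_ : 0.0; }
private:
    double sum_ = 0;
    long   n_   = 0;
};

// Small usage example: one chamber with one two-strip cluster.
double demoMeanCharge() {
    MyStripChargeAna ana;
    ana.Init();
    Chamber ch{ { Cluster{ { Strip{100.0}, Strip{300.0} } } } };
    ana.Execute_event({ch});
    ana.Finalize();
    return ana.meanCharge();
}
```

The point is only the calling pattern: Init() once, Execute_event() per event with full access to the reconstructed chambers, Finalize() once at the end.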

The Runbook

How to use RecoAna

In folder analysis/ of the TBReco package there is software to process the reconstructed ntuples. To use it:

  • the first time, you should run setup.sh in the analysis folder. Also, set the paths of the folders you want to use with the software by changing the first lines of RecoAna.C
  • the software can be run with root -l 'RecoAna.C+("10129")'

New analysis classes can be created from the example class DummyRecoAna. They must then be added in RecoAna.C to the list of analyses which are executed.

In RecoAna.h two very simple classes are defined, Hit and Track; look at them to see the methods already implemented. Hit inherits from Position (defined in TBReco/Utils.h), so it has methods like hit.x(), hit.ex() and so on. RecoAna.C automatically fills two vectors of Hit*, CENTROID_HITS and UTPC_HITS. They are global, so all DummyRecoAnalyses can use them.

Track reconstruction is performed by the TrackingRecoAna class. In RecoAna.C, where the TrackingRecoAna object is defined, one can decide how to perform the tracking.

The constructor is

 TrackingRecoAna * TrackAna = new TrackingRecoAna(_rt, TrackingRecoAna::ROOTLINEAR, runname);

The second parameter can be:

  • ROOTLINEAR to use the standard ROOT linear fitter
  • MINUITLINEAR to use minuit linear fitter
  • MINUITLINEARTJCORRECTION to use minuit linear fitter applying time-jitter correction (this makes sense only if uTPC is used for some chambers)

Then the other parameters of the reconstruction can be set with:

  • TrackAna->setChamberToUseForTracking(use_chamber_for_tracking) where use_chamber_for_tracking is a vector of bool, one for each chamber, true to use the chamber in tracking, false not to use it
  • TrackAna->setUseUTPCInChamber(use_utpc_in_chamber) where use_utpc_in_chamber is a vector of bool, one for each chamber, true to use utpc in the corresponding chamber, false to use centroid
  • TrackAna->setMinClusterRequiredInchamber(min_cluster_required_in_chamber) where min_cluster_required_in_chamber is a vector of int, one for each chamber, giving the minimum number of hits required in each chamber
  • TrackAna->setMaxClusterRequiredInchamber(max_cluster_required_in_chamber) where max_cluster_required_in_chamber is a vector of int, one for each chamber, giving the maximum number of hits allowed in each chamber

The fitted tracks are stored in the global vector<Track*> TRACKS, which is cleared at the beginning of each event. For now only a maximum of one track per event is reconstructed, so either TRACKS is empty, if there is no reconstructed track in the event, or you can just use TRACKS[0]. The list of hits used for the track can be retrieved with the methods of class Track, so if you need to evaluate residuals or perform other operations on the track you can easily do it.

-- MarcoVanadia - 12 Feb 2014

Topic attachments:
  • fit.png (PNG, 16.3 K, 2014-07-04, MarcoVanadia)
  • run1.xlsx (XLSX, 55.7 K, 2014-03-28, MarcoSessa): Runbook - Desy Test Beam
Topic revision: r25 - 2018-05-08 - MarcoVanadia
 