Data-driven Alignment Validation Tool



This page is being updated. Please ask Tapio.Lampen@cern.ch for details about this tool.

A tutorial on this validation tool was presented in July 2016.

Offline Track Validation tool

The Offline Track Validation tool uses particle tracks (either real or simulated) to measure the quality of the alignment and related effects. This is carried out by examining the residual distributions of individual modules and larger structures. Global track variables (angular distributions, $ \chi^2 $ values, $ p_T $, curvature, etc.) can also be studied. The tool is used within the all-in-one meta validation tool.

Quick recipe for usage via the all-in-one tool

The all-in-one validation tool is the easiest way to produce various plots with the offline track validation tool. This chapter describes the default plots as well as those that can be produced with the different parameters of the all-in-one tool. The next chapter (to be written) describes the different parts of the offline track validation tool in detail, explains how the details of the plots can be modified, and shows how some other quantities can be studied.

Please refer to the all-in-one TWiki page to see how the tool is run.

The minimal set of settings you have to specify in the 'offline' section of the all-in-one tool is (other settings are optional):

  • maxevents number of events to be used ('-1' runs over all events in the dataset)
  • dataset
  • trackcollection (if set wrong, no tracks are found)
  • jobmode
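As an illustration, a minimal 'offline' section of the all-in-one configuration might look like the following sketch. Only the four parameter names come from the list above; the section header style and all values are placeholders to adapt to your setup.

```ini
[offline:myMinimalValidation]
; all values below are placeholders - adapt to your setup
maxevents       = -1
dataset         = /SomePrimaryDataset/SomeProcessing/ALCARECO
trackcollection = ALCARECOTkAlMinBias
jobmode         = interactive
```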

This minimal set of settings produces these default plots:

  • DMR plots (distribution of median of residuals)
  • DRnR plots (distribution of RMS of normalized residuals)

These are produced for all subdetectors (BPIX, FPIX, TIB, TOB, TID, TEC) in the precise coordinate X, and for BPIX and FPIX also in the other coordinate Y.
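To make the two default quantities concrete, here is a toy sketch in plain Python (not CMSSW code; the numbers are made up) of how a single module contributes one entry to each plot:

```python
import statistics

# Toy illustration of the two default quantities (not CMSSW code).
# DMR:  each module contributes the MEDIAN of its hit residuals.
# DRnR: each module contributes the RMS of its NORMALIZED residuals
#       (residual divided by its estimated uncertainty).

def dmr_entry(residuals):
    """One DMR histogram entry: the median of a module's residuals."""
    return statistics.median(residuals)

def drnr_entry(residuals, errors):
    """One DRnR histogram entry: the RMS of a module's normalized residuals."""
    normalized = [r / e for r, e in zip(residuals, errors)]
    return (sum(x * x for x in normalized) / len(normalized)) ** 0.5

# A well-aligned module with correctly estimated errors peaks at
# DMR ~ 0 um and DRnR ~ 1.
residuals = [-1.0, 0.5, 0.0, -0.5, 1.0]   # made-up residuals in um
errors    = [1.0] * len(residuals)        # made-up hit uncertainties in um
print(dmr_entry(residuals))               # -> 0.0
print(drnr_entry(residuals, errors))
```

In the real tool these entries are of course filled per module from the residual histograms; the sketch only fixes the definitions.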

With the following settings you can produce some other, often needed plots:

  • SurfaceShapes=coarse (surface shape plots, or bow plots; note that offlineModuleLevelProfiles has to be set to true)
  • DMROptions=plain,split (in addition to the DMR/DRnR distribution, split distributions are drawn in the same plot)
  • DMROptions=layers (shows distributions for each layer or disc in same plot)
  • DMROptions=layer=N (distribution of layer/disc N only)

The tool can also be run in parallel mode. The name of the validation has to be offlineParallel, and the setting parallelJobs has to be specified, as well as the exact number of events. A large number of parallel jobs makes merging the results slow, so it is usually advisable to use a small number of parallel jobs (of the order of 10).
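A sketch of such a parallel setup (everything except the names offlineParallel, parallelJobs and maxevents mentioned above is a placeholder):

```ini
[offlineParallel:myParallelValidation]
; the exact number of events is required in parallel mode
maxevents       = 1000000
; of the order of 10, to keep the merging step fast
parallelJobs    = 10
dataset         = /SomePrimaryDataset/SomeProcessing/ALCARECO
trackcollection = ALCARECOTkAlMinBias
jobmode         = interactive
```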

Advanced use

DMR and DRnR plots

Full control of the DMR and DRnR plotting can be achieved by manually modifying the TkAlExtendedOfflineValidation.C file, which is in the same directory as TkAlMerge.sh. This way one can, for instance, produce the plain DMR plots, the split plots, and separate plots for each layer as individual plots (if BPIX is of interest, then 3 layers are sufficient):

//  p.plotDMR("median,rmsNorm",30,"plain,split");
  p.plotDMR("median",30,"plain");
  p.plotDMR("median",30,"split");
  p.plotDMR("median",30,"split,layer=1");
  p.plotDMR("median",30,"split,layer=2");
  p.plotDMR("median",30,"split,layer=3");

Note that rerunning ./TkAlMerge.sh does not repeat the merging of the parallel jobs; that is only done once. Rerunning is therefore much quicker. If surface shape plots are produced, they then probably take most of the time (and they can be turned off).

Normalized $ \chi^2 $ and other track variable distributions

The normalized $ \chi^2 $ distribution, as well as quite a few other distributions, is saved in a ROOT file, from which it can be obtained and plotted. There is no ready-made script for this purpose at the moment, so some manual work is required.

  • find out where the file OfflineValidationParallel_result.root or OfflineValidation_result.root lies on EOS: have a look at line 19 of TkAlMerge.sh
  • copy this file to a working directory like this (include also the dot at the end of the line):
cmsStage /store/caf/user/$USER/AlignmentValidation/SomeNameForTheValidation/OfflineValidationParallel_result.root .
  • browse the file with ROOT and you will find the h_normchi2 distribution in the directory GlobalTrackVariables
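One way to do the last step interactively (assuming ROOT's usual _file0 handle for a file opened on the command line; after cd() the histogram should be reachable by name):

```
root -l OfflineValidationParallel_result.root
root [1] _file0->cd("GlobalTrackVariables");
root [2] h_normchi2->Draw();
```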

X-axis range in DMR plots

The X-axis range of DMR plots is defined in this file:

Alignment/OfflineValidation/macros/PlotAlignmentValidation.C

You can modify the settings of the plotinfo variable. Then, after recompiling, your plots will have the new x-range.

The default is 100 bins in the range ±10 µm, i.e. a bin width of 0.2 µm. If you modify PlotAlignmentValidation.C (especially if you increase the resolution), please also check that the DMR definitions in TrackerOfflineValidationSummary_cfi.py and TrackerOfflineValidation_Standalone_cff.py are compatible. Otherwise you might get empty bins in your histograms.

Warning: the text below is obsolete, read with care

Scope

This tool is conceived to display various distributions after running over a dataset with a certain alignment setting applied.

The distributions can be grouped into two main categories:

Global Track Quantities

These are quantities which are not related to any specific detector part but instead are common to whole tracks.

In the output file they will be saved in the directory "GlobalTrackVariables".

  • $ \chi^2, \chi^2(\phi), \chi^2(\eta) $ (same also available for $\chi^2/ndof$ )
  • $d_0, d_0(\phi), d_0(\eta) $ (transverse impact parameter)
  • $d_z, d_z(\phi), d_z(\eta) $ (longitudinal impact parameter)
  • $\kappa_\pm, \kappa(\phi) $ (curvature of positive and negative tracks)
  • $ \kappa_+ - \kappa_- $ (difference of the two curvature distributions)
  • $ p_T $ (transverse momentum)
  • $ \delta_{p_T}/p_T $, $ \delta_{p_T}/p_T(\eta) $, $ \delta_{p_T}/p_T(\phi) $ (transverse momentum resolution)

Hierarchy-based Quantities

These quantities are displayed for a certain part of the detector, following the tracker hierarchy of the respective release (not compatible with versions below 1_8).

On each hierarchy level (above module level) you have two kinds of distributions:

  • histograms showing the summed residuals of all structures below this hierarchy level combined
  • summary histograms with one bin for each sub-structure. The bin center is the mean value of the residual of this substructure; the error bar gives the width of the respective distribution (not the error of the mean!). Starting from the latest tag, V02-03-00, you can choose whether you want the RMS or the FWHM of the distribution, steerable via the cfg (parameter useFwhm)
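In the cfg of that release series this switch might be set as in the following sketch (old-style .cfg syntax; the module label TrackerOfflineValidation is an assumption, only the parameter name useFwhm comes from the text above):

```
# sketch, not a verified cfg fragment; the module label is assumed
replace TrackerOfflineValidation.useFwhm = true
```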

A redundancy which was still present until V02-02-02 has been removed in the latest tag.

The following quantities are available in tag V02-03-00 by default:

  • native X residuals for all subdetectors and all hierarchy levels
  • native Y residuals for Pixel detector
  • Normalized native X residuals for all subdetectors and all hierarchy levels
  • Normalized native Y residuals for Pixel detector

The following distributions can be included in the output via cfg steering:

  • local X coordinates for all subdetectors and hierarchy levels
  • Normalized local X coordinates (as above) (both steerable via: localCoorHistosOn)
  • Y and normalized Y residuals for Strip detectors (parameter: stripYresiduals )

The histograms on module level can be made transient, i.e. they will be booked during running and the summary histograms are created from them, but they are not saved to the output file (parameter: moduleLevelHistsTransient).
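A sketch of how these three switches might look together in the cfg (same caveat as elsewhere on this page: the module label TrackerOfflineValidation is assumed; only the parameter names come from the text above):

```
# sketch, not a verified cfg fragment; the module label is assumed
replace TrackerOfflineValidation.localCoorHistosOn = true          # add local-coordinate histograms
replace TrackerOfflineValidation.stripYresiduals = true            # add Y residuals for the strip detectors
replace TrackerOfflineValidation.moduleLevelHistsTransient = true  # do not save module-level histograms
```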

Tags at or below V02-02-02

In the main directory of the module you find several histograms for the various subcomponents of the Strip and Pixel tracker:

  • h_Residuals_* gives the residual in normal local coordinates of all modules in the respective subsystem (PXB, PXE, TIB, TID, TOB, TEC)
  • h_normResiduals_* gives the residuals divided by the square root of the variance
  • h_XprimeResiduals_* gives the residuals in 'native local' coordinates (x for barrel, rphi for endcaps, rotated for stereo modules)
  • h_summary* gives the mean and RMS of the residual distributions of all subdetectors for Strip and Pixel
  • h_Strip_0, h_Pixel_1 give the total residual for the whole strip tracker and the whole pixel detector in one histogram

The histograms marked in bold can be found on each hierarchy level.

Running

The latest tag available for the tool is V02-00-07 (cvs co -r V02-00-07 Alignment/OfflineValidation), as of 22 Apr 2008; this version should work with CMSSW_2_0_X. An example cfg file is in the test directory (offlinevalidator.cfg).

For a "back-port" for 1_8_X see the respective section of this page

Combining Results

A script to conveniently combine results from jobs with different alignment DB objects can be found in the scripts directory (compareAlignments.cc).

To combine results:

  • go to the 'scripts' directory,
  • open root in batch mode (root -b) → normal mode is strongly discouraged, as your output will be saved as canvases which will all be displayed at least for a fraction of a second
  • compile the script ( ".L compareAlignments.cc+" )
  • run the script; the parameter is a comma-separated list with the syntax: compareAlignments(filename.root=legend entry,filename2.root=legend entry2)
  • leave root batch mode and browse the result file with normal root (the name is "result.root" so far, but feel free to change this to your liking)
  • in the script you can specify up to which level the histograms should be combined; so far the lowest level above module level is chosen. N.B. if you change this to combine also on module level, the output file will probably become very large (as you are combining >15000 histograms)
If you experience any problems with the package, the script or the documentation, feel free to contact erik.butz@cern.ch

If you have requests for other plots which are not yet in the package but you think should be included, you can also contact erik.butz@cern.ch

CMSSW_1_8_4 Backport

To run this tool in CMSSW_1_8_4, follow these steps:

1. Download this archive, which contains all necessary files to run the tool in CMSSW_1_8_4.

2. Create a release area for CMSSW_1_8_4, move the tar.gz file to the src directory, and extract it:

tar xvfz trackerofflinevalidation184.tar.gz

This will create the directory structure for Alignment/OfflineValidation.

The other offline validation tools are not part of this archive. If you have already checked out the Alignment/OfflineValidation package you may run into trouble, since the structure of the package changed considerably from 1_8_4 to 2_0_0. The directory and BuildFile structure was therefore not adapted; instead, only those files which are needed to run the package in 1_8_4 are provided, in the relevant directories.

Note: In the cfg file the following points should be noted:

  • The AlignmentTrackSelector is used to pre-filter the tracks.
  • A second refitter is included in order to split the MatchedRecHits (replace TrackRefitter.useHitsSplitting = true)

CSA08

For the latest tag to be used within CSA08, please look at the respective section on the

Validation Tool Main Page

Recipe to run in CMSSW 2_0_6

Running the validation tool

N.B. If you have run the CSA08 tutorial for 2_0_6 of the GeometryComparisonTool, you should be able to skip the first two steps.

1. Make a clean release in CMSSW_2_0_6

scramv1 p CMSSW CMSSW_2_0_6
cd CMSSW_2_0_6/src
eval `scramv1 runtime -csh`
project CMSSW

2. Check out code for CSA08 from here and compile

cd Alignment/OfflineValidation
scramv1 b

3. Go to the test directory and open the example configuration file

cd test
emacs offlinevalidator.cfg

4. Replace the dataset in the PoolSource with the dataset of your choice

This would be some 2_0_6 RelVal MinBias:

'/store/relval/2008/5/4/RelVal-RelValMinBias-1209251027-STARTUP_V2-3rd/0000/64C0FFC3-141A-DD11-AE00-000423D99AAA.root',
'/store/relval/2008/5/4/RelVal-RelValMinBias-1209251027-STARTUP_V2-3rd/0000/86ADE1B9-1E1A-DD11-9B05-000423D99AA2.root',
'/store/relval/2008/5/4/RelVal-RelValMinBias-1209251027-STARTUP_V2-3rd/0000/CCA0E2CB-181A-DD11-8E90-000423D9890C.root'

5. Choose a DB object you want to apply to the data and write the respective section into the cfg file:

  es_source = PoolDBESSource {
        using CondDBSetup

        string connect  = "sqlite_file:/afs/cern.ch/put/yourFile/here.db"
        string timetype = "runnumber"

        VPSet toGet =
        {
            { string record = "TrackerAlignmentRcd"      string tag = "valueTag" },
            { string record = "TrackerAlignmentErrorRcd" string tag = "errorTag" }
        }
    }

Include the following second DB access to get the GlobalPositionRcd:

 es_source GlobalPositionSource = PoolDBESSource {
        using CondDBSetup
        string connect="frontier://cms_conditions_data/CMS_COND_20X_ALIGNMENT"
        # untracked uint32 authenticationMethod = 1
        VPSet toGet = {
            { string record = "GlobalPositionRcd" string tag = "IdealGeometry" }
        }
    }

If you want an APE of zero, one way to achieve this is to take the AlignmentErrorRcd from the ideal geometry.

For this remove

{ string record = "TrackerAlignmentErrorRcd" string tag = "errorTag" }
from the first PoolDBESSource (and remove the now-superfluous ',' after the first record line) and add the following line to the second PoolDBESSource:
{ string record = "TrackerAlignmentErrorRcd" string tag = "TrackerIdealGeometryErrors200_v2" }

If you need to access the beam spot you can add this line

include "RecoVertex/BeamSpotProducer/data/BeamSpotEarlyCollision_IntDB.cff" 

which includes a BeamSpotObjectsRcd via Frontier.

To run with the ideal geometry, set

replace TrackerDigiGeometryESModule.applyAlignment = false

6. Adjust the name of the output file according to your needs

     service  = TFileService {
       string fileName = "yourfilename.root"
     }

7. Run

cmsRun offlinevalidator.cfg
Output files will be around 40 MB in size.

8. Repeat steps 5. - 7. with the configuration you want to compare with

Comparing results

1. go to the 'scripts' directory and open root in batch mode:

 
root -b
Normal mode is strongly discouraged, as your output will be saved as canvases which will all be displayed at least for a fraction of a second.

2. compile and run the script

root [0] .L compareAlignments.cc+
Run the script; the parameter is a comma-separated list with the syntax:
root [1] compareAlignments(filename.root=legend entry,filename2.root=legend entry2)

3. Leave root batch mode, then open the result file and browse the distributions at will:

root -l results.root
root [0] TBrowser b

Additional plots using the TTree

Idea

To produce additional plots using the TrackerOfflineValidation tool, check out Alignment/OfflineValidation with tag V02-06-03. Running the tool writes out an additional TTree to provide flexible usage of different variables concerning the tracker geometry and the mean and RMS values of the residual distributions on module level. In contrast to the summary histograms provided by the tool so far, the tree allows a module-wise view of the residuals. Mean and RMS are written out for each module (absolute and normalized), such that systematic effects or single outlier modules can be identified easily just by 'scanning' the TTree. Furthermore, outlier modules can be identified and located, and the corresponding histogram on module level can be investigated to get detailed information.
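As a hedged sketch of such a scan (the file name, the in-file path of the tree, the moduleId branch name and the cut value are assumptions; the branch names meanX and chi2PerDofX appear on this page):

```
root -l Validation_output.root
// locate the tree with a TBrowser first; its in-file path is not spelled out here
root [1] TTree *t = 0; _file0->GetObject("TrackerOfflineValidation/TkOffVal", t);
root [2] t->Scan("moduleId:meanX:chi2PerDofX", "chi2PerDofX > 8");
```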

Recipe to run in CMSSW 2_2_X

cvs co -r V02-06-03 Alignment/OfflineValidation
scram b
Now one has to configure the python config in Alignment/OfflineValidation/test/offlinevalidator_cfg.py according to the data sample, track collection, etc. For release 2_0_X and above there is the option to switch on and off the histograms on module level, as well as the histograms using the local coordinates, in order to save disk space.
process.TrackerOfflineValidation.moduleLevelHistsTransient = True  #will not save modules on module level
process.TrackerOfflineValidation.localCoorHistosOn = False  #histogram for local coordinates will not be created
After configuration the tool can be run via
cmsRun Alignment/OfflineValidation/test/offlinevalidator_cfg.py

If you run more than a single job, you can merge the output using merge_TrackerOfflineValidation.C in Alignment/OfflineValidation/scripts. It is a modified version of the hadd root macro. Note, however, that this macro only works if the histograms on module level have been saved.

To obtain a set of DMR (distribution of the median of the residuals on module level) plots from the tree, there is a macro in Alignment/OfflineValidation/macros named runExtendedOfflineValidationPlots.C. To run the macro, the different output files of the offline validation must be configured. The first file is set like

 ///add file that you want to overlay following the syntax below
  ///(std::string "fileName", std::string legName="", int color=1, int style=1)
     PlotAlignmentValidation p("/afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN/CRAFT_Note/draft0_27may2009/data/FinalIdeal.root","Design",1,1);
   
All the following files that shall be overlaid are added by the method 'loadFileList':
p.loadFileList("/afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN/CRAFT_Note/draft0_27may2009/data/FinalHIP.root","HIP",3,1);

The tool allows flexible usage of the mean and median values stored in the tree for both local coordinates x and y (medianX, medianY, meanX, meanY). In addition, the minimal number of hits per module can be set by editing the line:

 p.plotDMR("medianX",30);

Using

 
root -l runExtendedOfflineValidationPlots.C 
the DMR plots are produced and saved in the $TMPDIR directory if no output directory is specified. Changing the output directory can be done by editing
 
 p.setOutputDir("$TMPDIR");

The PlotAlignmentValidation.C tool, which is configured by runExtendedOfflineValidationPlots.C to produce the DMR plots, can also be used standalone, following the description below. In addition to the DMR plots, one can also scan the tree for 'outlier' modules, e.g. with a large chi2 per dof on module level (stored in chi2PerDofX). To use it, just start root and do:

root [0] .L PlotAlignmentValidation.C++
root [1] PlotAlignmentValidation p("rfio:/castor/cern.ch/user/j/jdraeger/Validation/MC_Cosm225/Validation_MC_IDEAL.root","IDEAL",1,1)
root [3] p.plotOutlierModules("OutlierModules.ps", "chi2PerDofX", 8, 30)  // ("output.ps", "cut variable", max/cut value, number of entries)

The output is written to $TMPDIR/OutlierModules.ps.

Merging the output files of track based validation

As the information written to the tree is taken from the residual histograms on module level, it is essential to have these histograms in the root file by setting the option
moduleLevelHistsTransient = cms.bool(False)
in the config used to produce the offline validation results. Due to the hierarchical directory structure, the merging takes quite a while, so a trade-off between producing the single output files and merging has to be found. The macro written to merge the output is a hacked hadd version. The first part works simply like hadd, merging directories and histograms. The tree is skipped in the merging process and dealt with in the second part of the macro. The fixed variables written to the tree, like module id, positions and orientations, are simply copied. The mean, median and RMS values have to be recalculated from the merged histograms on module level. The macro can be found in:
Alignment/OfflineValidation/scripts/merge_TrackerOfflineValidation.C

There are detailed instructions/comments in the header of the file. The macro needs the

 #include "Alignment/OfflineValidation/interface/TkOffTreeVariables.h"

Thus, the following lines have to be added to the user's rootlogon.C, which is loaded when root is started:

#include "TSystem.h"
if (gSystem->Getenv("CMSSW_RELEASE_BASE") != 0) {
  printf("\nLoading CMSSW FWLite...\n");
  gSystem->Load("libFWCoreFWLite");
  AutoLibraryLoader::enable();
}

If the library is not loaded, there will be a segmentation violation after the copying of the tree variables:

Copy info from first TTree.
Copy entry 0

 *** Break *** segmentation violation

-- ErikButz - 16 Jun 2008

Topic revision: r17 - 2017-07-27 - TapioLampen
 