Alignment Validation with PV Unbiased Residuals Tool

The Primary Vertex (PV) Validation is a tool for Tracker Alignment Validation, intended to spot biases in the description of the Pixel geometry by using unbiased track residuals. It currently runs daily, with results reported on Marco Musich's CERN web page. Introductory material is linked in the last sections.


The PVValidation tool can be used with collision data samples (MinimumBias), both RECO and ALCARECO. From the track collection a list of "good" tracks fulfilling some quality cuts is extracted. Currently tracks must satisfy the following cuts:

Variable                       Cut
Track χ2/ndf                   < 5
Pixel hits                     > 2
SiStrip hits                   > 7
Track pT                       > 1 GeV
Track quality                  "any"
Hits in the first pixel layer  > 1

These cuts are implemented via the TkFilterParameters field of the configuration file:

TkFilterParameters = cms.PSet(algorithm = cms.string('filter'),
                              maxNormalizedChi2 = cms.double(5.0),
                              minSiliconLayersWithHits = cms.int32(7),
                              maxD0Significance = cms.double(1000000.0), # fake cut (threshold high enough to disable the selection)
                              minPt = cms.double(1.0),
                              trackQuality = cms.string("any")
                              )
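For illustration only, the "good track" selection above could be expressed as a plain Python predicate. The field names (chi2ndf, pixel_hits, strip_hits, pt, hits_bpix1) are hypothetical, not the actual CMSSW track interface, and the thresholds are taken verbatim from the cut table above:

```python
def is_good_track(track):
    """Return True if a track passes the PV Validation quality cuts.

    'track' is a plain dict standing in for a reconstructed track;
    the key names are illustrative. Track quality "any" means no
    quality-flag requirement is applied.
    """
    return (track["chi2ndf"] < 5.0        # track chi2/ndf < 5
            and track["pixel_hits"] > 2   # pixel hits > 2
            and track["strip_hits"] > 7   # SiStrip hits > 7
            and track["pt"] > 1.0         # pT > 1 GeV
            and track["hits_bpix1"] > 1)  # hits in the first pixel layer > 1

good = is_good_track({"chi2ndf": 2.0, "pixel_hits": 3, "strip_hits": 8,
                      "pt": 2.0, "hits_bpix1": 2})
soft = is_good_track({"chi2ndf": 2.0, "pixel_hits": 3, "strip_hits": 8,
                      "pt": 0.5, "hits_bpix1": 2})
```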

Since in CMS many Primary Vertices (PV) can occur in each bunch crossing, due to the high interaction rate, vertex fitting algorithms need to be fed with clusters of tracks that are likely to come from the same PV, in order to give sensible vertex fits.

Two track clusterizer algorithms are available in CMS:

  • the Deterministic Annealing Algorithm (DA)
  • the Gap Algorithm (GAP)
The current vertex reconstruction in CMS uses the DA algorithm. The DA algorithm can be configured in order to optimize track clusterization in different PU conditions. The current implementation of the clusterizer in the Primary Vertex Validation Tool uses:

TkClusParameters = cms.PSet(algorithm = cms.string("DA"),
                            TkDAClusParameters = cms.PSet(
                                coolingFactor = cms.double(0.8),
                                Tmin = cms.double(9.),          # end of annealing
                                vertexSize = cms.double(0.05),  # ~ resolution
                                d0CutOff = cms.double(3.),      # downweight high-IP tracks
                                dzCutOff = cms.double(4.)       # outlier rejection after freeze-out
                            )
                            )
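The coolingFactor and Tmin parameters above define the annealing schedule: the clusterizer starts at a high temperature and repeatedly multiplies it by coolingFactor until it reaches Tmin. A minimal sketch of that cooling loop (the starting temperature 256.0 is an arbitrary illustrative value; the real algorithm also updates the track-to-cluster assignments at every step):

```python
def cooling_schedule(t0, t_min, cooling_factor):
    """Return the list of annealing temperatures from t0 down to t_min."""
    temps = []
    t = t0
    while t > t_min:
        temps.append(t)
        t *= cooling_factor  # coolingFactor = 0.8 in the config above
    temps.append(t_min)      # annealing ends at Tmin (9. in the config)
    return temps

schedule = cooling_schedule(256.0, 9.0, 0.8)
```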

Then, for each of the "good" tracks within a track cluster, an unbiased refitted Primary Vertex is built using all the other tracks in the cluster's good-track list. Once the unbiased vertex is fit, the track residuals w.r.t. the refit PV are evaluated and finally plotted vs. the probe track parameters (η, φ).
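The unbiased-residual procedure can be sketched as follows. For simplicity the "vertex fit" here is just an inverse-variance weighted mean of the other tracks' longitudinal impact points, standing in for the full CMS vertex fitter used by the actual tool:

```python
def unbiased_dz_residual(tracks, probe_index):
    """dz residual of a probe track w.r.t. a vertex refit from the others.

    tracks: list of (z, sigma_z) pairs for the good tracks of one cluster.
    probe_index: which track is the probe; it is excluded from the refit,
    so the resulting vertex is unbiased with respect to the probe.
    """
    others = [t for i, t in enumerate(tracks) if i != probe_index]
    weights = [1.0 / sz**2 for _, sz in others]
    z_vtx = sum(w * z for (z, _), w in zip(others, weights)) / sum(weights)
    z_probe = tracks[probe_index][0]
    return z_probe - z_vtx

# Probe track at z = 0.1 cm; the other two tracks average to z_vtx = 0.1 cm,
# so the residual vanishes.
res = unbiased_dz_residual([(0.0, 0.1), (0.2, 0.1), (0.1, 0.1)], probe_index=2)
```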

  • Cartoon illustrating PV Validation principle:

Recipe to run the tool in CMSSW_5_2_3

Set up a CMSSW_5_2_3 release, check out from CVS the appropriate Alignment/OfflineValidation package and compile:

scramv1 p CMSSW CMSSW_5_2_3
cd CMSSW_5_2_3/src
addpkg Alignment/OfflineValidation
cp /afs/ Alignment/OfflineValidation/test
cp /afs/ Alignment/OfflineValidation/test
cp /afs/  Alignment/OfflineValidation/test
cp /afs/  Alignment/OfflineValidation/test
cp /afs/  Alignment/OfflineValidation/test
cp /afs/  Alignment/OfflineValidation/test
cp /afs/ Alignment/OfflineValidation/test
cp /afs/ Alignment/OfflineValidation/plugins
cp /afs/ Alignment/OfflineValidation/plugins
scramv1 b

You will need to copy the attached files and MultiPVValidation.h into the Alignment/OfflineValidation/plugins folder of your installation (if you follow the recipe above, this is already done).

The attached files PVValidationSubmitter.csh and InputSource.dat need to be copied into the Alignment/OfflineValidation/test folder of your installation.

Finally, the file PlotPVValidation.C has to be copied into the Alignment/OfflineValidation/macros folder of your installation. Then you can compile again with scramv1 b.

And you're ready to validate your geometry.


To run the tool you need to configure it to select the data sample, track collection, alignment objects/errors, etc.

Go to the Alignment/OfflineValidation/test folder and copy the InputSource.dat file attached to this twiki. It has several fields to be configured:

Name            Meaning
jobname         name of the job used on lxbatch
isda            whether to use the DA algorithm
applybows       whether to apply the tracker module bows
datasetpath     name of the file containing the dataset
maxevents       maximum number of events to process
globaltag       GlobalTag to be used in the refit
alignobj        AFS location of the geometry payload
taggeom         name of the tag of the geometry
bowsobj         AFS location of the bows payload
tagbows         name of the tag of the bows
apeobj          AFS location of the error payload
tagape          name of the tag of the APEs
validationtype  chooses the validation module; please always use MultiPVValidation
tracktype       name of the track collection in the dataset
outfile         name of the output file

The attached InputSource.dat file, for example, is configured to run on 2012 data using inflated (= default) APEs:

jobname TestPVVal
isda True
applybows True
datasetpath ALCARECOTkAlMinBias_cff
maxevents 50000
globaltag GR_E_V25::All
alignobj frontier://PromptProd/CMS_COND_31X_ALIGNMENT
taggeom TrackerAlignment_2009_v1_express
bowsobj frontier://PromptProd/CMS_COND_310X_ALIGN
tagbows TrackerSurfaceDeformations_v1_express
apeobj sqlite_file:/afs/
tagape AlignmentErrors
validationtype MultiPVValidation
tracktype ALCARECOTkAlMinBias
outfile TestPVVal.root
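The steering file is a simple whitespace-separated key/value list, so a minimal parser for it (illustrative only; the actual submission script PVValidationSubmitter.csh is written in csh) could look like:

```python
def parse_steering(text):
    """Parse 'key value' lines such as the InputSource.dat example above."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition(" ")
        params[key] = value.strip()
    return params

example = """jobname TestPVVal
isda True
maxevents 50000"""
config = parse_steering(example)
```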

Starting jobs

To start a job, go to your Alignment/OfflineValidation/test folder and, after configuring the job via the steering file, run:

 ./PVValidationSubmitter.csh <TagFile.dat> <options>  

or, to submit it to lxbatch:

bsub -o <logfile> -q <queue> PVValidationSubmitter.csh <TagFile.dat> <options>  

The only option (so far) is --dryRun, which produces the cfg files and scripts to run but does not submit the job.

The output .log and .root files will be saved in a PVValResults directory, while the used cfg files will go in another directory, submittedCfg, in your CMSSW area (both directories are created if they do not exist yet).

Producing plots

To produce plots of the residuals vs. track parameters (φ, η) you need to compile the PlotPVValidation.C macro attached to this twiki:

$ root -l
root [0] .L PlotPVValidation.C++
root [1] PlotPVValidation("filename1.root=legendentry1,filename2.root=legendentry2", ..., number_of_files, "yourCut")

where yourCut is a ROOT TCut on the quality of the tracks used for filling the residual histograms.
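To illustrate what such a cut does, here is a simple Python analogue of applying a selection before histogramming (the real macro applies a ROOT TCut string when filling the residual histograms; the variable names and the cut representation here are hypothetical):

```python
def apply_cut(tracks, cut):
    """Keep only tracks passing a simple 'variable OP threshold' cut.

    'cut' is a (variable, operator, threshold) triple, a stand-in for
    the ROOT TCut string passed to PlotPVValidation.
    """
    var, op, threshold = cut  # e.g. ("pt", ">", 3.0)
    ops = {">": lambda a, b: a > b, "<": lambda a, b: a < b}
    return [t for t in tracks if ops[op](t[var], threshold)]

selected = apply_cut([{"pt": 1.5}, {"pt": 4.0}], ("pt", ">", 3.0))
```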

Usage of the script for historic time trend

In your validation area you simply need to run the script from Andrew Whitbeck, which automatically creates the jobs split by run or by day, passing it five arguments:


The first two arguments can be either a date or a run. The third should be the name of the dataset to use. The fourth must be either "run" or "date", depending on whether you want to validate run by run or day by day (arguments 1 and 2 should match this choice). The last is just a tag for all of the relevant files which will be created. The script needs two auxiliary inputs, including runreg.cfg.
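As an illustration of the run-by-run vs. day-by-day splitting, the grouping could be sketched as below (the data layout and grouping logic are assumptions for illustration, not the actual script's implementation):

```python
from collections import defaultdict

def split_jobs(entries, mode):
    """Group (run, date) entries into validation jobs by run or by day."""
    if mode not in ("run", "date"):
        raise ValueError("mode must be 'run' or 'date'")
    jobs = defaultdict(list)
    for run, date in entries:
        key = run if mode == "run" else date
        jobs[key].append((run, date))
    return dict(jobs)

# Illustrative run numbers and dates, not real validation inputs.
entries = [(190456, "2012-04-05"), (190457, "2012-04-05"), (190500, "2012-04-06")]
by_day = split_jobs(entries, "date")
by_run = split_jobs(entries, "run")
```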

Running on MC

Since the configuration to run on MC differs from the one for data (trigger bits and the Physics Declared bit), you can find cfg files and scripts to run on MC as well (in the example, an ideally aligned detector) in /afs/

In /afs/ a set of systematic misalignments is available to test the tool.

Some Results on misaligned MC

  • Output of the Validation tool on some Systematic Pixel Misalignment:

Introductory presentations

Presentations at the Tracker Alignment Meeting:

Presentation to the Tracker DPG

-- MarcoMusich - 10-Mar-2010
Topic attachments

  • InputSource.dat (0.5 K, 2012-09-24, MarcoMusich): files needed to run the Primary Vertex Validation
  • .cc file (24.1 K, 2012-09-24, MarcoMusich): files needed to run the Primary Vertex Validation
  • MultiPVValidation.h (6.0 K, 2012-09-24, MarcoMusich): files needed to run the Primary Vertex Validation
  • .txt file (11.1 K, 2012-09-24, MarcoMusich): files needed to run the Primary Vertex Validation
  • PVValidationSubmitter.csh (4.7 K, 2012-09-24, MarcoMusich): files needed to run the Primary Vertex Validation
  • PlotPVValidation.C (42.5 K, 2012-09-24, MarcoMusich): files needed to run the Primary Vertex Validation
  • .sh file (4.9 K, 2012-09-24, MarcoMusich): script from Andrew Whitbeck to automatically create the jobs split by run/day
  • .sh file (5.1 K, 2012-09-24, MarcoMusich): auxiliary script
  • runreg.cfg (1.7 K, 2012-09-24, MarcoMusich): auxiliary config
Topic revision: r11 - 2013-05-10 - PalHidas