PDF Systematics Study at HU Berlin

This page is under construction

This TWiki page gives more or less detailed information about the analysis performed to estimate the systematic uncertainties coming from the choice of parton distribution functions (PDFs) in the Pythia6 generator used to produce signal samples in a search for 4th generation SM-like heavy quarks, b' and t'.


Parton distribution functions (PDFs), together with parton-parton interaction cross sections, determine hadronic cross sections of processes in proton-proton collisions according to the formula

\[ \sigma^{pp\rightarrow X,Y,...}(...)=\sum_{i_A,j_B=q(g),\bar{q}(g)} \int dx_1dx_2 f^A_i(fl_1,x_1,Q^2) f^B_j(fl_2,x_2,Q^2)\sigma^{i_A j_B \rightarrow X,Y,...}(x_1,x_2,\alpha_s,...). \]

where $\sigma^{i_A j_B\rightarrow X,Y,...}$ is the parton-parton cross section induced by the interaction of partons $i_A$ and $j_B$ in the incoming protons A and B, respectively, and $f_i^A(...)$ and $f_j^B(...)$ are the $i$-th and $j$-th parton distribution functions (PDFs) in protons A and B, correspondingly; $fl_1$ and $fl_2$ are the flavours of the interacting partons, and $x_1$ and $x_2$ their momentum fractions in the incoming protons.
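As a toy numerical illustration of the factorization formula above, the sketch below evaluates the double convolution with invented stand-ins for the PDFs and the partonic cross section (neither is a real PDF nor a real matrix element, and the flavour sum and scale dependence are omitted):

```python
def f_toy(x):
    """Toy valence-like parton density, normalized so that the
    integral of f(x) over [0, 1] equals one."""
    return 6.0 * x * (1.0 - x)

def sigma_hat(x1, x2):
    """Toy partonic cross section (arbitrary units, invented shape)."""
    return 1.0 / (1.0 + x1 * x2)

def sigma_pp(n=400):
    """Midpoint-rule evaluation of the double convolution
    int dx1 dx2 f(x1) f(x2) sigma_hat(x1, x2)."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x1 = (i + 0.5) * h
        for j in range(n):
            x2 = (j + 0.5) * h
            total += f_toy(x1) * f_toy(x2) * sigma_hat(x1, x2) * h * h
    return total
```

In a real calculation the toy density would be replaced by the flavour-summed product of proton PDFs at a given $Q^2$, and the toy $\hat{\sigma}$ by the perturbative parton-parton cross section.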

Since PDFs are not calculable from first principles but are determined experimentally with some uncertainty (coming from the experimental measurements, the theoretical models used to extract the PDFs, etc.), including any PDF in a cross-section calculation introduces an additional systematic error. In this study we therefore estimate the effect of a PDF change in the Pythia6 generator (used to produce the signal samples) on the signal event selection efficiency.

Analysis methods

Our analysis to estimate the PDF uncertainties is done using two different methods, a reweighting method and a direct method. The idea of each method is briefly outlined below.

Reweighting method

Suppose we have MC signal events generated using PDF1 in a generator. If we apply selection cuts to these events, then the signal event selection efficiency, $\epsilon^1$, is by definition

\[ \epsilon^1 = \frac{N^1_{cuts}}{N^1_{gen}}, \]

where $N^1_{gen}$ is the total number of events generated before applying any cut and $N^1_{cuts}$ is the number of selected signal events after applying all selection criteria. If an event filter with efficiency $\epsilon^1_{gen}$ is applied at generator level, the event selection efficiency becomes

\[ \epsilon^1 = \epsilon^1_{gen} \frac{N^1_{cuts}}{N^1_{gen}}. \]

Now, had these events been generated with the same generator but with another PDF (call it PDF2), then, as is obvious from the formula above, the relative probability of producing a particular event $e_i$ from an interaction of two partons with the same flavours, momentum fractions and energy scale as in the PDF1 case is given by the relative PDF weight, $w^i$, defined as

\[ w^i=\frac{f_{PDF2}(fl_1,x_1,Q^2)}{f_{PDF1}(fl_1,x_1,Q^2)} \times \frac{f_{PDF2}(fl_2,x_2,Q^2)}{f_{PDF1}(fl_2,x_2,Q^2)}. \]

Then $N^2_{cuts}$ and $N^2_{gen}$ for PDF2 are given by $N^2_{cuts} = \sum^{N^1_{cuts}}_{i=1}w^i $ and $N^2_{gen} = \sum^{N^1_{gen}}_{i=1}w^i $, and the event selection efficiency becomes

\[ \epsilon^2 = \epsilon^2_{gen} \frac{N^2_{cuts}}{N^2_{gen}} = \epsilon^2_{gen} \frac{ \sum^{N^1_{cuts}}_{i=1}w^i }{ \sum^{N^1_{gen}}_{i=1}w^i}. \]

The generator filter efficiencies $\epsilon^1_{gen}$ and $\epsilon^2_{gen}$ have to be determined by separate runs of the generator with both PDFs, PDF1 and PDF2.
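The bookkeeping in the last formula can be sketched as follows. This is a minimal illustration, not the actual analysis code; the per-event weights $w^i$ and the filter efficiency $\epsilon^2_{gen}$ are assumed to be already available:

```python
def reweighted_efficiency(weights, passed, eff_gen2=1.0):
    """epsilon^2 = eff_gen2 * (sum of w_i over selected events)
                            / (sum of w_i over all generated events).

    weights : per-event PDF weights w_i for all N^1_gen events
    passed  : parallel list of booleans, True if the event passes the cuts
    eff_gen2: generator filter efficiency for PDF2, from a separate run
    """
    num = sum(w for w, ok in zip(weights, passed) if ok)
    den = sum(weights)
    return eff_gen2 * num / den
```

With all weights equal to one (PDF2 = PDF1) this reduces to the plain efficiency $N^1_{cuts}/N^1_{gen}$, as expected.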

So, the role of the PDF reweighting tool is to:

  • extract the PDF info, $f_{PDF1}(fl_1,x_1,Q^2)$ and $f_{PDF1}(fl_2,x_2,Q^2)$, from the available sample(s);
  • extract the new PDF info, $f_{PDF2}(fl_1,x_1,Q^2)$ and $f_{PDF2}(fl_2,x_2,Q^2)$, from a new PDF set (available at http://lhapdf.hepforge.org/pdfsets );
  • calculate the relative event weight, $w^i$.
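The weight calculation in the last step can be sketched as below. The PDF evaluators are passed in as generic callables so the ratio logic is self-contained; in an actual implementation they would come from LHAPDF (e.g. its `xfxQ2(flavour, x, Q2)` evaluators, whose ratios equal ratios of $f$ since the overall factor of $x$ cancels):

```python
def event_weight(xfx1, xfx2, fl1, x1, fl2, x2, q2):
    """Relative PDF weight for one event:
    w = [f2(fl1,x1,Q2)/f1(fl1,x1,Q2)] * [f2(fl2,x2,Q2)/f1(fl2,x2,Q2)].

    xfx1, xfx2 : callables (flavour, x, Q2) -> PDF value for PDF1 and PDF2
    """
    return ((xfx2(fl1, x1, q2) / xfx1(fl1, x1, q2))
            * (xfx2(fl2, x2, q2) / xfx1(fl2, x2, q2)))
```

Identical evaluators give $w^i = 1$ for every event, so reweighting a sample onto its own PDF leaves the efficiency unchanged.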

Applying the weights to the events in the sample(s) to be reweighted is usually not done within the PDF-reweighting tool itself; it is left to the person doing the reweighting.

From the technical point of view there are three (ATLAS??) tools available to implement the PDF-reweighting method; they can be found following the links:

Some additional relevant information on the reweighting method can be found at

In our analysis we use the TopPdfUncertainty tool from the Top group for the PDF reweighting.

Direct method

In the direct method one has to generate new signal event samples, including in the generator not only other PDF(s) (different from the default one) but also the tunes corresponding to these new PDF(s), in order to account for the Underlying Event (UE), which in general depends on the PDF used to run the generator. One then has to go through the whole sample production chain, i.e. simulation, digitization, reconstruction and production of D3PDs for the final analysis. The advantage of this method over reweighting is that it accounts for the possible effects coming from the UE, in contrast to the previous method. On the other hand, its main disadvantage is that it is time and CPU consuming.

PDFs and tunes

This PDF systematic uncertainty study is done using MC samples produced within the MC11b ATLAS production setup. In this production, MC samples generated with the Pythia6 generator have been obtained using MRST LO** (20651) as the default PDF, along with its recent "ATLAS Underlying Event Tune 2B" (AUET2B).

In both methods outlined above we have used the following PDFs to estimate the uncertainty:

  • CTEQ6L1 (10042) - LO with LO $\alpha_s$;
  • MSTW2008LO (21000);
  • MRST LO* (20650);
  • CT09MC2 (10772) - 1-loop $\alpha_s$; momentum sum rule violation.
For more details on the PDFs go to LHAPDF.

Reason for choosing the above PDFs: MRST LO** is a LO PDF, so the PDFs chosen for the PDF systematics study should also be LO.

Pythia6 parameters tuned in the AUET2B tuning campaign are as follows:

Parameters PARP(62), PARP(64) and PARP(72) control the parton-shower parametrization, namely:

  • PARP(62) defines the initial state radiation (ISR) $p_T$ cut-off;
  • PARP(64) - the ISR scale factor on the $\alpha_s$ evaluation scale and;
  • PARP(72) - $\Lambda_{QCD}$ for FSR showering from ISR parton emissions.

Parameters PARP(77), PARP(78), PARP(82), PARP(84) and PARP(90) define MPI model parameters, with

  • PARP(77) being responsible for a suppression of colour reconnection for high-$p_T$ strings;
  • PARP(78) - for the strength of colour reconnection;
  • PARP(82) - MPI $p_T$ cutoff at the nominal reference energy of 1800 GeV;
  • PARP(84) - fractional radius of core part of double-Gaussian transverse proton matter distribution and;
  • PARP(90) - exponent governing the rate of increase of the $p_{T0}$ MPI cutoff as a function of $\sqrt{s}$.
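In an Athena job-option fragment such Pythia6 tune parameters are typically passed to the generator as text commands addressing the PYPARS common block. The snippet below is only a hypothetical sketch of the mechanism; the numeric values are placeholders, NOT the actual AUET2B tune values, which are given in ATL-PHYS-PUB-2011-009:

```python
# Hypothetical Athena job-options fragment (placeholder values,
# NOT the AUET2B tune numbers); topAlg.Pythia is assumed configured.
topAlg.Pythia.PythiaCommand += [
    "pypars parp 62 1.0",   # ISR pT cut-off
    "pypars parp 64 1.0",   # ISR alpha_s scale factor
    "pypars parp 72 0.25",  # Lambda_QCD for FSR off ISR partons
    "pypars parp 82 2.0",   # MPI pT cutoff at 1800 GeV reference energy
    "pypars parp 90 0.24",  # energy-scaling exponent of the MPI cutoff
]
```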

For more info on the AUET2B tune have a look at the ATLAS note ATL-PHYS-PUB-2011-009.

Sample Production

MC signal samples (4th generation b' quark pair production) used in this study have been produced with the ATLAS MC production setup MC11c, using the tag combination e972_s1310_s1300_r3043_r2993_p834. The samples have been produced privately, following the official production instructions.

Athena releases used for different production steps are the following:

  • Event generation - AtlasProduction 16.6.7.X (tag e972);
  • G4 simulation - AtlasProduction 16.6.7.X (tag s1310, merging s1300);
  • Digitization and reconstruction - AtlasProduction and newer (tag r3043, merging r2993);
  • TopD3PD production - AtlasPhysics (tag p834).

For more information on the MC11 production setup have a look at the link.


The following samples have been used to obtain the default efficiencies (i.e. with the default MRST LO** PDF) for both the direct and reweighting methods, and were also used in the reweighting method to obtain the shifted efficiencies:

Mass point of 400 GeV:

  • mc11_7TeV.119314.Pythia_d4PairToWtWtbar_400_1LepIncl.merge.NTUP_TOP.e972_s1310_s1300_r3043_r2993_p834
Mass point of 500 GeV:
  • user.mandry.MC11.115114.Pythia_d4PairToWtWtbar_500_1LepIncl.ntup_top.s130_r3043_p834.17.01.2012/

The following samples have been produced with the non-default PDFs listed above and used to extract the efficiency shifts for the corresponding PDFs for the b' mass point of 500 GeV:

  • user.mandry.MC11.100001.Pythia_d4PairToWtWtbar_500_1LepIncl_CTEQ6L1.ntup_top.s130_r3043_p834.24.01.2012/
  • user.mandry.MC11.100001.Pythia_d4PairToWtWtbar_500_1LepIncl_CT09MC2.ntup_top.s130_r3043_p834.24.01.2012/
  • user.mandry.MC11.100001.Pythia_d4PairToWtWtbar_500_1LepIncl_MSTW2008LO.ntup_top.s130_r3043_p834.24.01.2012/
  • user.mandry.MC11.100001.Pythia_d4PairToWtWtbar_500_1LepIncl_MRSTLOStar.ntup_top.s130_r3043_p834.24.01.2012/

Extraction of efficiencies

Reweighting method

To implement this method we use the default signal heavy b' ?? samples and re-weight them.


Direct method

Discussion of the results

Topic revision: r19 - 2012-07-10 - MalikAliev