Inclusive Single Diffraction at ATLAS


  1. To measure the total inclusive single proton dissociation cross-section.
  2. To measure the differential inclusive single proton dissociation cross-section as a function of:
    • mass of the dissociated system (M_X)
    • fractional proton energy loss (ξ)


Single proton dissociation is a process with a relatively high cross-section at the LHC, yet it is not well constrained. There have been no measurements of the differential cross-section as a function of the mass of the dissociated system (M_X), the fractional energy loss of the scattered proton (ξ), or the four-momentum transfer to the non-dissociated proton (t).

Measurements with rapidity gaps can only be done in data with no pile-up. The correlation between the rapidity gap size and ξ is not sufficiently well understood to unfold the cross-section to ξ. Moreover, rapidity gap measurements suffer from large uncertainties caused by double dissociative and non-diffractive backgrounds.

This is the first measurement of inclusive single proton dissociation with proton tagging at ATLAS, which should provide a more precise measurement of the total cross-section than the rapidity gap method. Additionally, the differential cross-sections (in M_X and ξ) will be measured for the first time.


The analysis will use data from run 206881. Basic run information is presented in the table below.

run no.      206881
energy       E = 8 TeV
luminosity   L = 24.11 nb⁻¹
ALFA pots    at 9.5 σ (about ±8 mm)

Data from run 191373 (7 TeV) will not be used because of its smaller luminosity and different trigger configuration. Because of the different trigger configurations and detector positions the two runs would require separate analyses, and the 8 TeV run seems more promising as far as diffractive analysis is concerned.


Measure scattered proton

Identify scattered proton with ALFA detector

Develop or adopt proton identification algorithm
Implement the chosen algorithm
Verify the identification algorithm in diffractive interactions

Reconstruct scattered proton momentum

Develop or adopt proton momentum reconstruction algorithm
Implement the algorithm
Verify the algorithm in diffractive interactions

Measure activity in the central detector

Identify and select signal events

Remove events with noise
Verify signal selection

Combine information from central detector and ALFA

Combine information from both detectors for each bunch crossing

Find a method of assigning ALFA to ATLAS data
Verify the combination method

Verify if the reconstructed proton originates from the same interaction as signal in central detector

Develop a method of proton assignment to a specific vertex (is it needed?)

Develop an algorithm
Implement the algorithm
Verify the algorithm

Select single diffractive events with signal in central detector and tagged proton

Develop needed selection criteria

Verify the selection criteria

Investigate possible backgrounds

Measure total inclusive cross-section for single diffraction

Investigate systematic uncertainties of total cross-section

To Do List

Set up software and test it with the analysis of rapidity gaps

Repeating the rapidity gap measurement with and without proton tagging on ALFA data will be a good test of the software and should allow a more precise single proton dissociation measurement by reducing the uncertainties caused by double dissociative and non-diffractive backgrounds.

Data (TODO)

The analysis will use run 206881 (see the data description in the main analysis part). The D3PDs prepared earlier by the MinBias group and described on the ALFA data and MC samples page can be used for tests, but they may contain wrong conditions settings, which is why new samples are needed.

Generate new data samples (TODO)

Petr Hamal probably has Oldrich Kepka's scripts for making D3PDs.


There is no MC sample with ALFA. It should be generated based on the MC used in the rapidity gap analysis, but with the beam energy changed.

The MC can probably be generated with scripts prepared by Oldrich Kepka. Petr Hamal has the scripts and knows how to run them; he will give us instructions.

Generate test MC with Oldrich's scripts according to Petr's instructions (TODO)

Software (TODO)

ALERT! WARNING! ALERT! This is a run 1 D3PD analysis, so use RootCore Base version 1.5.9, not newer. The newer versions are for xAOD analyses.


Set up software environment for the first time

Set up ATLAS environment

The ATLAS environment is needed. If CVMFS is accessible, it can be set up with the following commands:

export ATLAS_LOCAL_ROOT_BASE=/cvmfs/
source /cvmfs/
#export CERN_USER="user_name"

Set up RootCore

Use the command:
rcSetup Base,1.5.9

Checkout packages from GIT

with a Kerberos ticket:
git clone

with LDAP authorisation:

git clone

Set up software environment for everyday work

Set up RootCore

Enter the directory where RootCore was installed (the one containing the file) and source the script:



TChain and ALFAReader

In order to use ALFAReader with a TChain, the entry number corresponding to the current tree (not the whole chain) must be used. It can be obtained with the code below:

TChain t_data ("MinBiasTree");
Long64_t nEvents = t_data.GetEntries();
for (Long64_t eventIndex = 0; eventIndex < nEvents; ++eventIndex) {
  // local entry number within the current tree -- pass this to ALFAReader
  Long64_t eventIndexTree = t_data.LoadTree (eventIndex);
}
A full example is available in the test directory of ALFAReader. Separate macros are provided to run the example with bare ROOT and with RootCore.

Set up SVN repository for tests (DONE)

A repository can be downloaded from:

svn co svn+ssh://

ALERT! WARNING! ALERT! Check out the package with SVN on the same platform that will be used to compile and run RootCore. SL6 has an old version of SVN, which will not work with files checked out with a newer version.

Set up empty analysis on test data sample (IN PROGRESS)

Sample handler
SampleHandler's directory scanner seems not to work locally. Files can be added to a SampleHandler with:
#include <SampleHandler/SampleHandler.h>
#include <SampleHandler/DiskListLocal.h>
#include <SampleHandler/ToolsDiscovery.h>
SH::SampleHandler sh;
SH::DiskListLocal list ("/path/to/data/");
SH::scanFiles (sh, list);

Proper TTree name must be set:

sh.setMetaString ("nc_tree", "CollectionTree"); 

It seems that a D3PDReader should be generated. This can be done after setting up the athena environment, e.g.

asetup 19.2.0

and then calling the application that generates the D3PDReader, d3pdReadersFromFile.exe:

d3pdReadersFromFile.exe -f ../user.sedwardg.000493.EXT0._00124.NTUP_MINBIAS.root -n DiffractiveEvent

ALERT! WARNING! ALERT! Do NOT name the packages "EVENT" or "D3PDReader", because they will be interpreted by RootCore as the standard D3PDReader, which contains more information than the diffractive reader. This will cause compilation problems.

With the generated source files, a RootCore package can easily be created with: -p DiffractiveReader *.h *.icc *.cxx

ALERT! WARNING! ALERT! Two files must be manually substituted with:

svn export svn+ssh:// .
svn export svn+ssh:// .
The files go into the "D3PDReader" and "Root" directories. Moreover, the file Root/D3PDPerfStats.cxx must be edited: the paths of the two includes at the top of the file must be changed to the name of the diffractive D3PDReader.

ALERT! WARNING! ALERT! Important technical comments and remarks:

  1. Automatically generated classes such as PrimaryVertexD3PDObject produce a memory error, probably on destruction. Objects of these classes should therefore be removed from the DiffractiveEvent.
  2. In the analysis class the reader object must be dynamically allocated (it must be a pointer) and it cannot be deleted. It is also impossible to use std::auto_ptr. Other members, e.g. histograms, can however be ordinary (non-pointer) objects.

Add rapidity gap tools to the analysis (TODO)

Add ALFA proton tagging (TODO)

Set up new RootCore package (TODO)

Design proton reconstruction package (TODO)

Features (TODO)

  1. Provide information about the number of reconstructed protons in different quality categories.
  2. Reconstruct proton momenta.

Implement proton reconstruction (TODO)

Analysis (TODO)

Follow the rapidity gap analysis model.



-- GrzegorzPawelGach - 02 Oct 2014

Topic revision: r15 - 2014-11-07 - GrzegorzPawelGach