Difference: IctpTutorial (1 vs. 32)

Revision 31 - 2008-12-12 - KerimSuruliz

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 582 to 582
 

Running jobs on the grid with Ganga

Changed:
<
<
First set up Athena in the usual way. Then from the run directory of UserAnalysis type:
>
>
First set up Athena in the usual way. Then from the run directory of Z_Analysis type:
 
source /afs/cern.ch/sw/ganga/install/etc/setup-atlas.sh  (or .csh)
Line: 600 to 600
j.application.exclude_from_user_area=["*.o","*.root*","*.exe"]
j.application.prepare(athena_compile=False)
j.application.atlas_release='14.2.21'
Changed:
<
<
j.application.option_file='$HOME/testarea/14.2.21/PhysicsAnalysis/AnalysisCommon/UserAnalysis/share/AnalysisSkeleton_topOptions_localAOD.py' j.application.max_events='1000000'
>
>
j.application.option_file='$HOME/testarea/14.2.21/IctpTutorial/run/jobOptions_Z_Analysis.py' j.application.max_events='100000'
j.splitter=AthenaSplitterJob()
j.splitter.numsubjobs=20
j.inputdata=DQ2Dataset()
Changed:
<
<
j.inputdata.dataset="mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r541"
>
>
j.inputdata.dataset="valid1.005144.PythiaZee.recon.AOD.e322_s412_r583"
 j.inputdata.type='DQ2_LOCAL'

j.outputdata=DQ2OutputDataset()

Changed:
<
<
j.outputdata.outputdata=['AnalysisSkeleton.aan.root']
>
>
j.outputdata.outputdata=['ZAnalysis_ntuple.root']
 j.outputdata.datasetname='Zjets_v14_test'

j.backend=LCG()
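For orientation, the fragments in this section assemble into a single gangascript.py along these lines. This is a sketch, not a verbatim copy: the Job()/Athena() construction and the final submit() call are assumed from the standard Ganga workflow, and the script only makes sense inside a Ganga session, where Job, Athena, AthenaSplitterJob, DQ2Dataset, DQ2OutputDataset and LCG are predefined:

```python
# gangascript.py -- sketch assembled from the fragments in this section.
# Runs only inside a Ganga session; Job, Athena, AthenaSplitterJob,
# DQ2Dataset, DQ2OutputDataset and LCG are provided by Ganga itself.
j = Job()                      # assumed: standard Ganga job construction
j.application = Athena()       # assumed: Athena application plugin
j.application.exclude_from_user_area = ["*.o", "*.root*", "*.exe"]
j.application.prepare(athena_compile=False)
j.application.atlas_release = '14.2.21'
j.application.option_file = '$HOME/testarea/14.2.21/IctpTutorial/run/jobOptions_Z_Analysis.py'
j.application.max_events = '100000'
j.splitter = AthenaSplitterJob()
j.splitter.numsubjobs = 20
j.inputdata = DQ2Dataset()
j.inputdata.dataset = "valid1.005144.PythiaZee.recon.AOD.e322_s412_r583"
j.inputdata.type = 'DQ2_LOCAL'
j.outputdata = DQ2OutputDataset()
j.outputdata.outputdata = ['ZAnalysis_ntuple.root']
j.outputdata.datasetname = 'Zjets_v14_test'
j.backend = LCG()
j.submit()                     # assumed: submits the job and its subjobs
```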

Line: 623 to 623
 The lines
j.outputdata=DQ2OutputDataset()
Changed:
<
<
j.outputdata.outputdata=['AnalysisSkeleton.aan.root']
>
>
j.outputdata.outputdata=['ZAnalysis_ntuple.root']
j.outputdata.datasetname='Zjets_v14_test'
tell ganga to save the output in a dq2 dataset, which will be called
user08.<yourUserName>.Zjets_v14_test
Line: 632 to 632
 j.backend.requirements.sites= ['NAPOLI']
Changed:
<
<
Now copy/paste this script into your favourite editor and save the file as
gangascript.py
in the
run
subdirectory of
UserAnalysis
. Then run ganga as before and once it's started up, type:
>
>
Now copy/paste this script into your favourite editor and save the file as gangascript.py in the run subdirectory of Z_Analysis. Then run ganga as before and once it's started up, type:
 
execfile('gangascript.py')
Line: 651 to 652
 dq2-get -H /tmp/ user08..Zjets_v14_test
Changed:
<
<
dq2 will now retrieve the
AnalysisSkeleton.aan.root
files from each individual subjob and put them
>
>
dq2 will now retrieve the
ZAnalysis_ntuple.root
files from each individual subjob and put them
 in the local /tmp/ directory. The log files from the run, including the athena log files for each separate subjob, may be found in the directory
$HOME/UserName/Local/<job number>/output/
.
Line: 660 to 661
 

Running jobs on the grid with pathena/PANDA

First set up Athena as usual. Then type:
Changed:
<
<
cd ${HOME}/scratch0/IctpTutorial/14.2.21
>
>
cd ${HOME}/IctpTutorial/14.2.21
cmt co PhysicsAnalysis/DistributedAnalysis/PandaTools
cd PhysicsAnalysis/DistributedAnalysis/PandaTools/cmt
cmt config
Line: 680 to 681
 Now, to run a job with pathena, we run the command:
Changed:
<
<
pathena --split 20 --inDS mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r541 --outDS user08..ttbar_v14_pathenatest AnalysisSkeleton_topOptions_localAOD.py
>
>
pathena --split 20 --inDS valid1.005144.PythiaZee.recon.AOD.e322_s412_r583 --outDS user08..Zjets_v14_pathenatest jobOptions_Z_Analysis.py
 

By default, jobs are sent to the BNL site, ANALY_BNL_ATLAS_1. This can be changed by adding a

--site=SITE
option.

Revision 29 - 2008-12-10 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 264 to 264
 if(name == "") continue;
// _trigDec->isPassed(name) tells you whether the trigger signature called "name" has been fired (it returns a boolean)
}
Added:
>
>
// Get all configured chain names from the trigger configuration
const std::vector< const TrigConf::HLTChain * > confChains = _trigDec->getConfigurationChains();
std::vector< const TrigConf::HLTChain * >::const_iterator iter;
for (iter = confChains.begin(); iter != confChains.end(); ++iter) {
  if (!(*iter)) continue;   // skip null chain pointers
  std::string name = (*iter)->chain_name();
  if (name == "") continue;
  std::string tmp_trigLevel = name.substr(0,3);   // "L2_" or "EF_"
  float prescale = (*iter)->prescale();
  if (tmp_trigLevel == "L2_") {

  } else if (tmp_trigLevel == "EF_") {

  }
}

 
Added:
>
>
  • in the jobOption file, add:

############################# Set up trigger configuration service and the metadata service it relies on, for analysis jobs without RecExCommon
from AthenaCommon.GlobalFlags import GlobalFlags
GlobalFlags.DetGeo.set_atlas()
import IOVDbSvc.IOVDb
from IOVDbSvc.CondDB import conddb
conddb.addFolder("TRIGGER","/TRIGGER/HLT/Menu <tag>HEAD</tag>")
conddb.addFolder("TRIGGER","/TRIGGER/HLT/HltConfigKeys <tag>HEAD</tag>")
conddb.addFolder("TRIGGER","/TRIGGER/LVL1/Lvl1ConfigKey <tag>HEAD</tag>")
conddb.addFolder("TRIGGER","/TRIGGER/LVL1/Menu <tag>HEAD</tag>")
conddb.addFolder("TRIGGER","/TRIGGER/LVL1/Prescales <tag>HEAD</tag>")


## set up trigger decision tool
from TrigDecision.TrigDecisionConf import TrigDec__TrigDecisionTool
tdt = TrigDec__TrigDecisionTool()
ToolSvc += tdt
from RecExConfig.RecFlags  import rec
rec.readAOD=True
from TriggerJobOpts.TriggerFlags import TriggerFlags
TriggerFlags.doTriggerConfigOnly = True
TriggerFlags.configurationSourceList = ['ds']

## setup configuration service
from TrigConfigSvc.TrigConfigSvcConfig import DSConfigSvc
dscfg=DSConfigSvc()
from TrigConfigSvc.TrigConfigSvcConfig import SetupTrigConfigSvc
trigcfg = SetupTrigConfigSvc()
trigcfg.SetStates("ds")
trigcfg.InitialiseSvc()

################################## END of trigger setup
 
Added:
>
>
 

Session 3: Basic Tools Used in a Physics Analysis (2)

Basic Objects (Outputs of the reconstruction)

Revision 28 - 2008-12-10 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 21 to 21
 
  • Create a directory called cmthome: " mkdir cmthome"
  • Create a directory called IctpTutorial: "
    mkdir IctpTutorial
  • Create a directory under IctpTutorial which corresponds to the ATLAS release being used:
    mkdir IctpTutorial/14.2.21
    . This will hold your Algorithms and any checked out code .
Changed:
<
<
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirements ${HOME}/cmthome/
    where the requirements file is needed to define athena release version together with the necessary properties:
>
>
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirements ${HOME}/cmthome/
    where the requirements file is needed to define the athena release version together with the necessary environment (CAUTION: you need to make sure that $HOME/scratch0 is replaced by $HOME in requirements and the script run_setup.sh):
 
 
   set CMTSITE CERN
   set SITEROOT /afs/cern.ch
Line: 58 to 58
 Let's look for a particular dataset: valid1.005144.PythiaZee.recon.AOD.e322_s412_r583, a validation dataset. Now, to see what containers are stored in the dataset, one can use the python file "checkFile.py", which can be obtained using get_files:
 get_files checkFile.py  
.
Changed:
<
<
Get the file for CASTOR, rfcp /castor/cern.ch/user/c/chaouki/IctpTutorial/AOD.029110._00001.pool.root.1 /tmp/$USER/.
>
>
Get the file from CASTOR, rfcp /castor/cern.ch/user/c/chaouki/IctpTutorial/AOD.029110._00001.pool.root.1 /tmp/$USER/.
 Then, run it:
 ./checkFile.py /tmp/$USER/AOD.029110._00001.pool.root.1 
Line: 162 to 162
 The output would be a root file which contains a tree with one leaf representing the Z invariant mass. thumbs up
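The quantity stored in that leaf is the e+e- invariant mass, m^2 = (E1+E2)^2 - |p1+p2|^2. As a standalone illustration in plain Python, with made-up four-vectors (this is a sketch of the formula, not code taken from the Z_Analysis package):

```python
import math

def invariant_mass(p1, p2):
    """p = (E, px, py, pz) in MeV; returns sqrt(max(E^2 - |p|^2, 0))."""
    e = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    m2 = e * e - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))

# Two back-to-back 45.6 GeV electrons (massless approximation) give m_Z:
el1 = (45600.0, 45600.0, 0.0, 0.0)
el2 = (45600.0, -45600.0, 0.0, 0.0)
print(invariant_mass(el1, el2))  # prints 91200.0
```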
Added:
>
>
The AOD file to run on is specified in the jobOption file:

import AthenaPoolCnvSvc.ReadAthenaPool
ServiceMgr.EventSelector.InputCollections = ["rfio:/castor/cern.ch/user/c/chaouki/IctpTutorial/AOD.029110._00001.pool.root.1"]

 

Session 2: Basic Tools used in a Physics Analysis (1)

Monte-Carlo Information

Line: 243 to 252
 ToolHandle<TrigDec::TrigDecisionTool> _trigDec;
Changed:
<
<
  • In the execute() method for each event:
>
>
  • In the .cxx file, add the following in the Z_ee() function (which is called in the execute() method):
 
sc = _trigDec.retrieve();
// retrieve all TriggerDecision objects

Revision 26 - 2008-12-09 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 21 to 21
 
  • Create a directory called cmthome: " mkdir cmthome"
  • Create a directory called IctpTutorial: "
    mkdir IctpTutorial
  • Create a directory under IctpTutorial which corresponds to the ATLAS release being used:
    mkdir IctpTutorial/14.2.21
    . This will hold your Algorithms and any checked out code .
Changed:
<
<
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirement ${HOME}/cmthome/
    where the requirements file is needed to define athena release version together with the necessary properties:
>
>
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirements ${HOME}/cmthome/
    where the requirements file is needed to define athena release version together with the necessary properties:
 
 
   set CMTSITE CERN
   set SITEROOT /afs/cern.ch
Line: 58 to 58
 Let's look for a particular dataset: valid1.005144.PythiaZee.recon.AOD.e322_s412_r583, this is a validation dataset. Now, to retrieve what containers are stored in the dataset, one can use the python file "checkFile.py" which can be obtained using get_file:
 get_files checkFile.py  
.
Added:
>
>
Get the file for CASTOR, rfcp /castor/cern.ch/user/c/chaouki/IctpTutorial/AOD.029110._00001.pool.root.1 /tmp/$USER/.
 Then, run it:
Changed:
<
<
 ./checkFile.py AOD.029110._00001.pool.root.1 
>
>
 ./checkFile.py /tmp/$USER/AOD.029110._00001.pool.root.1 
 

General Information about ATLAS data

Line: 145 to 146
 

Start Running a simple job (Z(ee) reconstruction):

Now that we know the objects of interest, we will use the Z --> ee MC sample and reconstruct the resonance invariant mass.
Changed:
<
<
You can copy the following code ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/scratch0/IctpTutorial/14.2.21.
>
>
You can copy the following code ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/IctpTutorial/14.2.21.
 Then follow these steps to configure and compile the code:
  • In Z_Analysis/cmt, do the following:

Revision 25 - 2008-12-09 - RachidMazini

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 73 to 73
 
  • Full output of reconstruction in object (POOL/ROOT) format:
    • Tracks (and their hits), Calo Clusters, Calo Cells, combined reconstruction objects etc.
  • Nominal size 1 MB/event initially, to decrease as the understanding of the detector improves
Changed:
<
<
    • Compromise between “being able to do everything on the ESD” and “not enough disk space to store too large events”
>
>
    • Compromise between “being able to do everything on the ESD and not enough disk space to store too large events.
  AOD (Analysis Object Data):
  • Summary of event reconstruction with “physics” (POOL/ROOT) objects:
Line: 132 to 132
  * Write down the tag in the configurationTag field * Click on Interpret
  • Using the Atlas Production page: click on Production Tags, then write the tag in the field, and click Go.
Changed:
<
<
  • Using the Panda Monitor: do a Tasks - search, write down the tag in the Configuration Tag field, click on any task ID that will appear. The first line is something like:
Transformation tags: e364 s462 Interpret tags and show transformation configuration (active link). Then Click on the active link.
>
>
  • Using the Panda Monitor: do a Tasks - search, write down the tag in the Configuration Tag field, click on any task ID that will appear. Then click on the active link to get info about the production configuration.
 

Using AMI

Revision 24 - 2008-12-09 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 12 to 12
 
  • Setup a particular version of the ATLAS software releases.
  • Understand various physics analysis objects (how to access more information about each object).
  • Use the physics objects in an AOD analysis.
Changed:
<
<
  • Use the GRID (PAnda, Ganga) to generate MC samples, analyze data, etc...
>
>
  • Use the GRID (PAnda, Ganga) to analyze data, etc...
 

Session 1:

Setup your work area on lxplus

Changed:
<
<
In your scratch directory (${HOME}/scratch0/):
>
>
In your home directory (${HOME}/):
 
  • Create a directory called cmthome: " mkdir cmthome"
  • Create a directory called IctpTutorial: "
    mkdir IctpTutorial
  • Create a directory under IctpTutorial which corresponds to the ATLAS release being used:
    mkdir IctpTutorial/14.2.21
    . This will hold your Algorithms and any checked out code .
Changed:
<
<
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirement ${HOME}/scratch0/cmthome/
    where the requirements file is needed to define athena release version together with the necessary properties:
>
>
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirement ${HOME}/cmthome/
    where the requirements file is needed to define athena release version together with the necessary properties:
 
 
   set CMTSITE CERN
   set SITEROOT /afs/cern.ch
   macro ATLAS_DIST_AREA ${SITEROOT}/atlas/software/dist
   # use optimised version by default
Changed:
<
<
macro ATLAS_TEST_AREA "${HOME}/scratch0/IctpTutorial/14.2.21" 14.2.21 "${HOME}/scratch0/IctpTutorial/14.2.21"
>
>
macro ATLAS_TEST_AREA "${HOME}/IctpTutorial/14.2.21" 14.2.21 "${HOME}/IctpTutorial/14.2.21"
  use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)
  • Copy and run the script cmt_version.sh, which sets up CMT.

Changed:
<
<
cp ~chaouki/scratch0/cmthome/cmt_version.sh ${HOME}/scratch0/cmthome/ cd ${HOME}/scratch0/cmthome/
>
>
cp ~chaouki/scratch0/cmthome/cmt_version.sh ${HOME}/cmthome/ cd ${HOME}/cmthome/
  source cmt_version.sh TIP this command needs to be done only once.
  • Finally copy and run the script run_setup.sh to setup the release environment

Changed:
<
<
cp ~chaouki/scratch0/cmthome/run_setup.sh ${HOME}/scratch0/cmthome/
>
>
cp ~chaouki/scratch0/cmthome/run_setup.sh ${HOME}/cmthome/
  source run_setup.sh TIP this command needs to be done every time you open a new shell/xterm.
Line: 503 to 503
 Your assignment is to include more variables in the tree and to clean up the invariant mass plot using a combination of cuts. The files to modify are:
  • Z_Analysis/src/Z_Analysis.cxx
  • Z_Analysis/Z_Analysis/Z_Analysis.h
Deleted:
<
<

Extract the PID Efficiency

 

Session 4: Running jobs on the grid with GANGA and pathena/PANDA

Note: following this session will require you to have a valid grid certificate.

Revision 23 - 2008-12-09 - RachidMazini

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 63 to 63
 

General Information about ATLAS data

Added:
>
>

ATLAS data Model

ATLAS uses several data formats, designed to handle efficiently a large and diverse amount of information, ranging from the raw data collected with the detector to the final reconstructed objects used in physics analysis.

RAW : * ByteStream format, ~1.6 MB/event

ESD (Event Summary Data):

  • Full output of reconstruction in object (POOL/ROOT) format:
    • Tracks (and their hits), Calo Clusters, Calo Cells, combined reconstruction objects etc.
  • Nominal size 1 MB/event initially, to decrease as the understanding of the detector improves
    • Compromise between “being able to do everything on the ESD” and “not enough disk space to store too large events”

AOD (Analysis Object Data):

  • Summary of event reconstruction with “physics” (POOL/ROOT) objects:
    • Contains electrons, muons, jets, etc.
  • Nominal size 100 kB/event (now 200 kB/event including MC truth)

DPD (Derived Physics Data):

  • Skimmed/slimmed/thinned events + other useful “user” data derived from AODs and conditions data
  • Nominally 10 kB/event on average
    • Large variations depending on physics channels

TAG :

  • Database (or ROOT files) used to quickly select events in AOD and/or ESD files
 

Dataset Naming convention

Changed:
<
<
A run number is unique and refers to a sample that contains particular physics. It can be assigned to either a Monte Carlo sample or a data sample are it cannot be recycled.
>
>
A run number (or dataset number) is unique and refers to a sample that contains particular physics. It can be assigned to either a Monte Carlo sample or a data sample, and it cannot be recycled.

Datasets have the following name format: Project.datasetNumber.physicsShort.prodStep.dataType.TAG. Example: mc08.106300.PythiaH120zz4l.recon.AOD.e352_s462_r541

Project is a string that indicates a production or processing series, such as mc08, valid1, fdr08_run2...

datasetNumber is a 6-digit number corresponding to the run number.

PhysicsShort is a string that provides a short description of the data.

prodStep describes the production step; it can have one of the following values: * evgen corresponding to event generation * simul corresponding to events processed by Geant-4 simulation * digit corresponding to digitized events * recon corresponding to outputs from reconstruction

 
Changed:
<
<
Datasets have the following name format: SEQ.AAAAAA.PhysShort.TYPE.TAGS . Example: mc08.106300.PythiaH120zz4l.recon.e352_s462_r541
>
>
dataType describes the data format produced at a particular production step: * EVNT: at event generation * HITS: at detector simulation * RDO: at event digitization * ESD, AOD, DPD: at reconstruction * TAG: at event data tag building * log: logfile produced with each step.
 
Changed:
<
<
SEQ is a string that indicates a production or processing serie, such as mc08, valid1, fdr08_run2... AAAAAA is a 6 digit number assigned to the run number. PhysShort is a string that provides a short discription of the data. TYPE is one of the following (there may be others)
  • evgen.EVNT or evgen.log corresponding to output from event generation.
  • simul.HITS or simul.log corresponding events processed by Geant-4 simulation
  • digit.RDO or digit.log corresponding to digitized events
  • reconESD, recon.AOD or recon.log for outputs from reconstruction
TAGS is a serie of tag describing each step of the production. For MonteCarlo data they are defined as follow:
>
>
TAG is a series of tags describing each step of the production. For Monte Carlo data they are defined as follows:
 
  • e XXX: evgen configuration
  • s XXX: simulation configuration
  • d XXX: digitization configuration
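The naming scheme above is regular enough to parse mechanically. Here is a minimal sketch in plain Python (the field names follow the text, but the regex and function are an illustration of the convention, not an official tool):

```python
import re

# Project.datasetNumber.physicsShort.prodStep.dataType.TAG
# (field names taken from the text; the regex itself is an assumption)
DATASET_RE = re.compile(
    r"^(?P<project>[^.]+)\."
    r"(?P<datasetNumber>\d{6})\."
    r"(?P<physicsShort>[^.]+)\."
    r"(?P<prodStep>[^.]+)\."
    r"(?P<dataType>[^.]+)\."
    r"(?P<tags>.+)$"
)

def parse_dataset_name(name):
    """Split a dataset name into its fields; tags become a list like ['e352', 's462', 'r541']."""
    m = DATASET_RE.match(name)
    if m is None:
        raise ValueError("not a recognised dataset name: %s" % name)
    fields = m.groupdict()
    fields["tags"] = fields["tags"].split("_")
    return fields

# The example dataset from the text:
fields = parse_dataset_name("mc08.106300.PythiaH120zz4l.recon.AOD.e352_s462_r541")
print(fields["project"], fields["prodStep"], fields["tags"])
```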
Line: 101 to 137
 

Using AMI

Changed:
<
<
AMI can also be used to retrieve dataset files and get the full path to access them using DQ tools such as dq-get. As an example, retrieving AOD files in mc08.106300.PythiaH120zz4l.recon.XXX.e352_s462_r541 sample. Go to AMI then to Dataset Search . Fill in mc08.106300.PythiaH120zz4l.recon.%.e352_s462_r541of the dataset in the "datasets search" field and click on it.
>
>
AMI can also be used to retrieve dataset files and get the full path to access them using DQ2 tools such as dq2-get. As an example, to retrieve the AOD files in the mc08.106300.PythiaH120zz4l.recon.XXX.e352_s462_r541 sample, go to AMI then to Dataset Search. Fill in mc08.106300.PythiaH120zz4l.recon.%.e352_s462_r541 in the "datasets search" field and click on it.
 
Changed:
<
<
NB: Use % for wildcarding: example "mc08.106300.PythiaH120zz4l.%" to search for all run 106300 samples produced in the mc08 serie, including event generation, simulation, digitization, reconstruction...
>
>
NB: Use % for wildcard: example "mc08.106300.PythiaH120zz4l.%" to search for all run 106300 samples produced in the mc08 series, including event generation, simulation, digitization, reconstruction...
  Please refer to the AMI Tutorial for more relevant details.

Revision 22 - 2008-12-09 - RachidMazini

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 88 to 88
  The digits following the letter indicate a unique software configuration. For example: using a different athena release, or using a different geometry tag, or using a different set of jobOptions fragments requires a new tag.
Changed:
<
<

Searching for ATLAS Dataset and interprettng TAGs

>
>

Searching for ATLAS Dataset and interpreting TAGs

 Tags can be interpreted in three possible ways:

  • Using AMI (recommended). From the main page:
Line: 105 to 105
  NB: Use % for wildcarding: example "mc08.106300.PythiaH120zz4l.%" to search for all run 106300 samples produced in the mc08 serie, including event generation, simulation, digitization, reconstruction...
Added:
>
>
Please refer to the AMI Tutorial for more relevant details.
 

Start Running a simple job (Z(ee) reconstruction):

Now, that we know the objects of interests, we will use the Z -->

ee MC sample and reconstruct the resonance invariant mass.

Revision 21 - 2008-12-09 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 61 to 61
 Then, run it:
 ./checkFile.py AOD.029110._00001.pool.root.1 
Deleted:
<
<

Start Running a simple job (Z(ee) reconstruction):

Now, that we know the objects of interests, we will use the Z -->

ee MC sample and reconstruct the resonance invariant mass. You can copy the following code ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/scratch0/IctpTutorial/14.2.21. Then follow the following steps to configure and compile the code:
  • In Z_Analysis/cmt, do the following:
    cmt config
    source setup.sh
    make
       
You need to repeat the last two commands whenever you change the code to recompile it.
  • In Z_Analysis/run, do the following:
    athena jobOptions_Z_Analysis.py
       
The output would be a root file which contains a tree with one leaf representing the Z invariant mass. thumbs up
 

General Information about ATLAS data

Changed:
<
<

Dataset Naming convention

>
>

Dataset Naming convention

  A run number is unique and refers to a sample that contains particular physics. It can be assigned to either a Monte Carlo sample or a data sample are it cannot be recycled.

Datasets have the following name format: SEQ.AAAAAA.PhysShort.TYPE.TAGS .

Line: 107 to 88
  The digits following the letter indicate a unique software configuration. For example: using a different athena release, or using a different geometry tag, or using a different set of jobOptions fragments requires a new tag.
Changed:
<
<

Searching for ATLAS Dataset and interprettng TAGs

>
>

Searching for ATLAS Dataset and interprettng TAGs

 Tags can be interpreted in three possible ways:

  • Using AMI (recommended). From the main page:
Line: 118 to 99
 
  • Using the Panda Monitor: do a Tasks - search, write down the tag in the Configuration Tag field, click on any task ID that will appear. The first line is something like:
Transformation tags: e364 s462 Interpret tags and show transformation configuration (active link). Then Click on the active link.
Changed:
<
<

Using AMI

>
>

Using AMI

 AMI can also be used to retrieve dataset files and get the full path to access them using DQ tools such as dq-get. As an example, retrieving AOD files in mc08.106300.PythiaH120zz4l.recon.XXX.e352_s462_r541 sample. Go to AMI then to Dataset Search . Fill in mc08.106300.PythiaH120zz4l.recon.%.e352_s462_r541of the dataset in the "datasets search" field and click on it.

NB: Use % for wildcarding: example "mc08.106300.PythiaH120zz4l.%" to search for all run 106300 samples produced in the mc08 serie, including event generation, simulation, digitization, reconstruction...

Added:
>
>

Start Running a simple job (Z(ee) reconstruction):

Now, that we know the objects of interests, we will use the Z -->

ee MC sample and reconstruct the resonance invariant mass. You can copy the following code ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/scratch0/IctpTutorial/14.2.21. Then follow the following steps to configure and compile the code:
  • In Z_Analysis/cmt, do the following:
    cmt config
    source setup.sh
    make
       
You need to repeat the last two commands whenever you change the code to recompile it.
  • In Z_Analysis/run, do the following:
    athena jobOptions_Z_Analysis.py
       
The output would be a root file which contains a tree with one leaf representing the Z invariant mass. thumbs up
 

Session 2: Basic Tools used in a Physics Analysis (1)

Revision 20 - 2008-12-09 - RachidMazini

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 80 to 80
 The output would be a root file which contains a tree with one leaf representing the Z invariant mass. thumbs up
Added:
>
>

General Information about ATLAS data

Dataset Naming convention

A run number is unique and refers to a sample that contains particular physics. It can be assigned to either a Monte Carlo sample or a data sample are it cannot be recycled.

Datasets have the following name format: SEQ.AAAAAA.PhysShort.TYPE.TAGS . Example: mc08.106300.PythiaH120zz4l.recon.e352_s462_r541

SEQ is a string that indicates a production or processing serie, such as mc08, valid1, fdr08_run2... AAAAAA is a 6 digit number assigned to the run number. PhysShort is a string that provides a short discription of the data. TYPE is one of the following (there may be others)

  • evgen.EVNT or evgen.log corresponding to output from event generation.
  • simul.HITS or simul.log corresponding events processed by Geant-4 simulation
  • digit.RDO or digit.log corresponding to digitized events
  • reconESD, recon.AOD or recon.log for outputs from reconstruction
TAGS is a serie of tag describing each step of the production. For MonteCarlo data they are defined as follow:
  • e XXX: evgen configuration
  • s XXX: simulation configuration
  • d XXX: digitization configuration
  • r XXX: reconstruction configuration
  • a XXX: atlfast (either I or II) configuration
  • t XXX: tag production configuration
  • b XXX: bytestream production configuration

The digits following the letter indicate a unique software configuration. For example: using a different athena release, or using a different geometry tag, or using a different set of jobOptions fragments requires a new tag.

Searching for ATLAS Dataset and interprettng TAGs

Tags can be interpreted in three possible ways:

  • Using AMI (recommended). From the main page: * Click on Nomenclature * Write down the tag in the configurationTag field * Click on Interpret
  • Using the Atlas Production page: click on Production Tags, then write the tag in the field, and click Go.
  • Using the Panda Monitor: do a Tasks - search, write down the tag in the Configuration Tag field, click on any task ID that will appear. The first line is something like:
Transformation tags: e364 s462 Interpret tags and show transformation configuration (active link). Then Click on the active link.

Using AMI

AMI can also be used to retrieve dataset files and get the full path to access them using DQ tools such as dq-get. As an example, retrieving AOD files in mc08.106300.PythiaH120zz4l.recon.XXX.e352_s462_r541 sample. Go to AMI then to Dataset Search . Fill in mc08.106300.PythiaH120zz4l.recon.%.e352_s462_r541of the dataset in the "datasets search" field and click on it.

NB: Use % for wildcarding: example "mc08.106300.PythiaH120zz4l.%" to search for all run 106300 samples produced in the mc08 serie, including event generation, simulation, digitization, reconstruction...

 

Session 2: Basic Tools used in a Physics Analysis (1)

Monte-Carlo Information

Line: 175 to 219
 }
Deleted:
<
<

General Information about ATLAS data

File Naming convention

A run number is unique and refers to a sample that contains particular physics. It can be assigned to either a Monte Carlo sample or a data sample are it cannot be recycled.

Datasets have the following name format: SEQ.AAAAAA.PhysShort.TYPE.TAGS . Example: mc08.106300.PythiaH120zz4l.recon.e352_s462_r541

SEQ is a string that indicates a production or processing serie, such as mc08, valid1... AAAAAA is a 6 digit number assigned to the run number. PhysShort is a string that provides a short discription of the data. TYPE is one of the following (there may be others)

  • evgen corresponding to generated events
  • simul corresponding events processed by Geant-4
  • digit (or sometimes digi) corresponding to digitized events
  • recon is an output from reconstruction
TAGS is a serie of tag describing each step of the production. For MonteCarlo data they are defined as follow:
  • e XXX: evgen configuration
  • s XXX: simulation configuration
  • d XXX: digitization configuration
  • r XXX: reconstruction configuration
  • a XXX: atlfast (either I or II) configuration
  • t XXX: tag production configuration
  • b XXX: bytestream production configuration

The digits following the letter indicate a unique software configuration. For example: using a different athena release, or using a different geometry tag, or using a different set of jobOptions fragments requires a new tag.

 

Session 3: Basic Tools Used in a Physics Analysis (2)

Revision 19 - 2008-12-09 - RachidMazini

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 175 to 175
 }
Added:
>
>

General Information about ATLAS data

File Naming convention

A run number is unique and refers to a sample that contains particular physics. It can be assigned to either a Monte Carlo sample or a data sample are it cannot be recycled.

Datasets have the following name format: SEQ.AAAAAA.PhysShort.TYPE.TAGS . Example: mc08.106300.PythiaH120zz4l.recon.e352_s462_r541

SEQ is a string that indicates a production or processing serie, such as mc08, valid1... AAAAAA is a 6 digit number assigned to the run number. PhysShort is a string that provides a short discription of the data. TYPE is one of the following (there may be others)

  • evgen corresponding to generated events
  • simul corresponding events processed by Geant-4
  • digit (or sometimes digi) corresponding to digitized events
  • recon is an output from reconstruction
TAGS is a serie of tag describing each step of the production. For MonteCarlo data they are defined as follow:
  • e XXX: evgen configuration
  • s XXX: simulation configuration
  • d XXX: digitization configuration
  • r XXX: reconstruction configuration
  • a XXX: atlfast (either I or II) configuration
  • t XXX: tag production configuration
  • b XXX: bytestream production configuration

The digits following the letter indicate a unique software configuration. For example: using a different athena release, or using a different geometry tag, or using a different set of jobOptions fragments requires a new tag.

 

Session 3: Basic Tools Used in a Physics Analysis (2)

Revision 18 - 2008-12-08 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 72 to 72
 source setup.sh make
Added:
>
>
You need to repeat the last two commands whenever you change the code to recompile it.
 
  • In Z_Analysis/run, do the following:
    athena jobOptions_Z_Analysis.py
Line: 96 to 97
  In the header file:
Added:
>
>
#include "McParticleEvent/TruthParticleContainer.h"

......

 std::string _truthParticleContainerName; const TruthParticleContainer *_truthList ;
Line: 109 to 116
 
// --- Retrieve truth list ---
sc=_storeGate->retrieve( _truthList, _truthParticleContainerName);
Added:
>
>
double truth_px, truth_py, truth_pz, truth_e; int truth_id, truth_stat, truth_bcode, truth_ndg;
 for (int i=0 ; i<_truthList->size() ; i++) { const TruthParticle* tp = (*_truthList)[i] ; // --- Kinematics info of this truth particle ---
Line: 142 to 153
 As an exercise: extract the LVL1 trigger signatures of the Z(ee) process. You might need to add the following code into the Z_Analysis package:
  • In the header file:
Added:
>
>
#include "TrigDecision/TrigDecisionTool.h"

.......

 /** get a handle to the TrigDecision helper */ ToolHandle<TrigDec::TrigDecisionTool> _trigDec;

Revision 17 - 2008-12-08 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 19 to 19
  In your scratch directory (${HOME}/scratch0/):
  • Create a directory called cmthome: " mkdir cmthome"
Changed:
<
<
  • Create a directory called IctpTutorial: "
    mkdir IctpTutorial
  • Create a directory under IctpTutorial which corresponds to the ATLAS release being used:
    mkdir IctpTutorial/14.2.21
    . This will hold your Algorithms and any checked out code .
>
>
  • Create a directory called IctpTutorial: "
    mkdir IctpTutorial
  • Create a directory under IctpTutorial which corresponds to the ATLAS release being used:
    mkdir IctpTutorial/14.2.21
    . This will hold your Algorithms and any checked out code .
 
  • Copy the requirement file from my home area:
    cp ~chaouki/scratch0/cmthome/requirement ${HOME}/scratch0/cmthome/
    where the requirements file is needed to define athena release version together with the necessary properties:
      
       set CMTSITE CERN
Line: 81 to 81
 

Session 2: Basic Tools used in a Physics Analysis (1)

Changed:
<
<

Monte-Carlo Information:

Trigger:

>
>

Monte-Carlo Information

When looking at real data, it is often useful to compare with theoretical models by simulating the detector response to their predictions. This is an important step in guiding expected discoveries: theory and experiment must work together to establish physics results beyond reasonable doubt. One can therefore generate MC data samples and study the specific features of the available models. The process of generating MC has 4 main steps:

  • Generate events using a particular model.
  • Simulate the detector response by generating Hits in the different sub-detectors; here we need to know the detector geometry and possible misalignments.
  • The digitization phase then comes to simulate the electronics' response to the Hits.
  • Finally, using the digitization information one can reconstruct the basic quantities described later (this is the only step one has to run on real data).
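The four steps above form a chain, where each stage consumes the previous stage's output. A purely illustrative Python skeleton (the function names are invented placeholders, not real Athena job transforms):

```python
# Each stage is a placeholder that just wraps its input, to show the data flow.
def evgen(model):        return {"events": model}   # step 1: event generation
def simulate(events):    return {"hits": events}    # step 2: needs detector geometry
def digitize(hits):      return {"digits": hits}    # step 3: electronics response
def reconstruct(digits): return {"aod": digits}     # step 4: also run on real data

aod = reconstruct(digitize(simulate(evgen("PythiaZee"))))
print(aod)
```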

The benefit of MC samples is that the underlying/initial information is stored: you know for sure whether an identified object matches an electron, a muon, a tau or a jet by using the truth information (hence the name "truth"). Exercise: try to access this information by adding the following lines to the Z_Analysis package:

In the header file:

std::string _truthParticleContainerName;
const TruthParticleContainer *_truthList ;

Then in the cxx file, add the following:

  • In the ctor:
declareProperty("MCParticleContainer", _truthParticleContainerName = "SpclMC");
  • Then for each event in the execute() method, you can retrieve the truth information as follows:
// --- Retrieve truth list ---
sc=_storeGate->retrieve( _truthList, _truthParticleContainerName);
for (int i=0 ; i<_truthList->size() ; i++) {
  const TruthParticle* tp = (*_truthList)[i] ;
  // --- Kinematics info of this truth particle ---
  truth_px = tp->px() ;
  truth_py = tp->py() ;
  truth_pz = tp->pz() ;
  truth_e = tp->e() ;
  truth_id = tp->pdgId() ;
  truth_stat = tp->genParticle()->status() ;
  truth_bcode = tp->genParticle()->barcode() ;
 
  // --- Daughter information ---
  truth_ndg = tp->nDecay() ;  
  for (int j=0 ; j< truth_ndg ; j++) {   
    const TruthParticle* truth_dg = tp->child(j);
  }
}
Your assignment is to print out the truth information of the events generated in the Z(ee) simulation.

Trigger

This is another very crucial part of the data-taking process. Based on the effectiveness of the trigger, you might or might not discover your beloved signal. Try to picture this (the numbers are not real, but reality must be of the same order of magnitude): almost 99% of the expected events are not of interest to new physics, so you are left with roughly 1% of potentially interesting data. Now, if something in the trigger does not work well, you might lose part of that 1% (and maybe all of it). Hence, it is of great importance to make sure that the trigger is highly efficient. In ATLAS, we have two main trigger levels, namely:
  • Level 1 trigger: LVL1.
  • High Level Trigger (HLT) which in turn is divided into two parts:
    • Level 2 trigger: LVL2.
    • Event Filter: EF.

For early data, the aim is to have a fast decision based on the information gathered at 40 MHz at LVL1, which would allow data to be recorded at ~200 Hz after the EF (equivalent to storing 300 MB/s).
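The quoted rates imply a very large overall rejection factor; a quick back-of-the-envelope check in Python, using only the numbers stated above:

```python
# Trigger arithmetic from the rates quoted above.
bunch_crossing_rate_hz = 40e6   # LVL1 input rate (40 MHz)
ef_output_rate_hz = 200.0       # rate recorded after the Event Filter

# Overall rejection: only 1 in 200,000 bunch crossings is kept.
rejection = bunch_crossing_rate_hz / ef_output_rate_hz
print(f"overall rejection factor: {rejection:.0f}")

# 300 MB/s written at 200 Hz implies an average event size of 1.5 MB.
bandwidth_mb_s = 300.0
event_size_mb = bandwidth_mb_s / ef_output_rate_hz
print(f"average event size: {event_size_mb} MB")
```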

As an exercise: extract the LVL1 trigger signatures of the Z(ee) process. You might need to add the following code into the Z_Analysis package:

  • In the header file:
/** get a handle to the TrigDecision helper */
ToolHandle<TrigDec::TrigDecisionTool> _trigDec;

  • In the execute() method for each event:
sc = _trigDec.retrieve();
// retrieve all TriggerDecision objects
const std::vector<const LVL1CTP::Lvl1Item*> L1Items = _trigDec->getL1Items();
std::vector<const LVL1CTP::Lvl1Item*>::const_iterator itItem;   
for(itItem= L1Items.begin(); itItem!= L1Items.end(); ++itItem) {
   if (!*itItem) continue;
   std::string name = (*itItem)->name(); 
   if(name == "") continue;
//   _trigDec->isPassed(name) ===> this method will tell you if the corresponding trigger signature called "name" has been fired or not (it has boolean type)
}
 

Session 3: Basic Tools Used in a Physics Analysis (2)

Changed:
<
<

Basic Objects (Outputs of the reconstruction):

>
>

Basic Objects (Outputs of the reconstruction)

 
Changed:
<
<
At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets which would help you understand the process of retrieving the object's properties/reconstruction variables. In general, for each object we can access the kinematic information namely (charge is available for all charged objects):
>
>
At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets which would help you understand the process of retrieving the object's properties/reconstruction variables. In general, for each object we can access the kinematics information namely (charge is available for all charged objects):
 
Variable Implementation
px (*Itr)->hlv().x()
Line: 107 to 184
 

then retrieve different basic objects, which are defined below:

Changed:
<
<

egamma objects:

>
>

egamma objects

These are objects identified in the Liquid Argon calorimeter. They are a mixture of electrons and photons. One can distinguish between the two species using information provided by the inner detector, namely track-shower matching. Based on this information, egamma objects are further separated into photon and electron collections.
Changed:
<
<
  • electrons: the electron objects, which are stored in the ElectronContainer, satisfy very loose criteria and can be accessed in the following way.
>
>
  • electrons: the electron objects, which are stored in the ElectronContainer, satisfy very loose criteria and can be accessed in the following way.
 
const ElectronContainer *_elecList ;   
sc=_storeGate->retrieve( _elecList, _electronContainerName);   
Line: 137 to 214
 
Et (Cone 0.30) trkmatch->parameter(egammaParameters::etconoise30)
Changed:
<
<
  • Et isolation in a ring 0.1< DeltaR < D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
>
>
  • Et isolation in a ring 0.1< DeltaR < D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
 
Et (Cone 0.20) trkmatch->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) trkmatch->parameter(egammaParameters::etconoise30)
Line: 175 to 252
 
Et (Cone 0.40) p_EMShower->parameter(egammaParameters::etcone40)
Changed:
<
<
  • Et isolation in a ring 0.1<DeltaR< D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
>
>
  • Et isolation in a ring 0.1 < DeltaR < D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
 
Et (Cone 0.20) p_EMShower->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) p_EMShower->parameter(egammaParameters::etconoise30)
Line: 186 to 263
 const EMShower* p_EMShower = (*photItr)->detail(_egDetailContainerName);
Changed:
<
<

Muon objects:

>
>

Muon objects

 There are a variety of muon identification algorithms which led to two different muon containers: Stacomuons and Muidmuons.
Changed:
<
<
The Stacomuons (from StacoMuonCollection) are muon candidates found by combining the information from the Inner Detector (ID) and MuonSpectrometer (MS) at the Interaction Point (IP). The packages involved are:
>
>
The Stacomuons (from StacoMuonCollection) are muon candidates found by combining the information from the Inner Detector (ID) and MuonSpectrometer (MS) at the Interaction Point (IP). The packages involved are:
 
  • Muonboy which is a muon spectrometer "standalone" track reconstruction code.
Changed:
<
<
  • MuTag: is an algorithm to tag low Pt muons (starts from the ID tracks at the IP).
>
>
  • MuTag: is an algorithm to tag low Pt muons (starts from the ID tracks at the IP).
 
  • STACO for STAtistical COmbination.
Changed:
<
<
MuidMuons (from MuidMuonCollection) are muon candidates found by global re-fit of the hits from the ID and the MS. These muons are found the following packages:
>
>
MuidMuons (from MuidMuonCollection) are muon candidates found by a global re-fit of the hits from the ID and the MS. These muons are found by the following packages:
 
Changed:
<
<
  • MOORE (Muon Object Oriented REconstruction): A track fit is performed on the collection of hits recorded and a separate package (MuIDStandalone) is used to provide back propagation of the MOORE track through the calorimeter to the IP.
  • MuGirl (similar to MuTag): it associates an inner detector track to muon spectrometer. It uses pattern recognition algorithm based on Hough transforms and incorporates reasonable assumptions about MDT low level performance.
>
>
  • MOORE (Muon Object Oriented REconstruction): A track fit is performed on the collection of hits recorded and a separate package (MuIDStandalone) is used to provide back propagation of the MOORE track through the calorimeter to the IP.
  • MuGirl (similar to MuTag): it associates an inner detector track with the muon spectrometer. It uses a pattern recognition algorithm based on Hough transforms and incorporates reasonable assumptions about MDT low-level performance.
 
  • MUIDCombined: this algorithm performs a global fit of all hits associated to tracks, unlike STACO which statistically merges the two independently found tracks.
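The idea behind a statistical combination can be illustrated in one dimension with an inverse-variance weighted average. This is a deliberate simplification for illustration only: the real STACO algorithm combines full five-parameter tracks with their covariance matrices.

```python
# Toy 1-D analogue of statistically combining an ID and an MS measurement:
# weight each measurement by the inverse of its variance.
def combine(x1, var1, x2, var2):
    """Return the inverse-variance weighted average and its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    return x, 1.0 / (w1 + w2)

# e.g. pt measured as 20 GeV (var 1) in the ID and 22 GeV (var 4) in the MS:
pt, var = combine(20.0, 1.0, 22.0, 4.0)
print(pt, var)
```

Note that the combined variance is always smaller than either input variance, which is why combining the two detectors improves the momentum resolution.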
Line: 235 to 312
 
Et (Cone 0.40) (*muonItr)->parameter(egammaParameters::etcone40)
Changed:
<
<
  • Et isolation in a ring DeltaR < D (0.1 or 0.2 or 0.3 or 0.4), above 3 sigma of total noise:
>
>
  • Et isolation in a ring DeltaR < D (0.1 or 0.2 or 0.3 or 0.4), above 3 sigma of total noise:
 
Et (Cone 0.10) (*muonItr)->parameter(egammaParameters::etconoise10)
Et (Cone 0.20) (*muonItr)->parameter(egammaParameters::etconoise20)
Line: 247 to 324
 Muons are divided into three categories:
  • Combined muons: isCombinedMuon()
  • Standalone muons: isStandAloneMuon()
Changed:
<
<
  • LowPt muons: isLowPtReconstructedMuon()
>
>
  • LowPt muons: isLowPtReconstructedMuon()
 
Changed:
<
<

Jet objects:

In general there are two main jet algorithms: Cone and Kt algorithms. These two algorithms have completely different ways of associating CaloClusers to jets. Jets are formed by nearby objects, where nearby refers to a distance. This distance can be either angular delta_R=sqrt(delta_eta**2 + delta_phi**2) for Cone algorithms or the relative transverse
>
>

Jet objects

In general there are two main jet algorithms: Cone and Kt algorithms. These two algorithms have completely different ways of associating CaloClusters to jets. Jets are formed from nearby objects, where nearby refers to a distance. This distance can be either the angular distance delta_R=sqrt(delta_eta**2 + delta_phi**2) for Cone algorithms or the relative transverse
momentum K_T for the Kt algorithms. The Cone algorithm successively merges pairs of nearby objects within a cone of size delta_R in order of decreasing pt. To avoid double counting of energy, a merging/splitting method is employed. Still, the Cone algorithms are neither infrared nor collinear safe.

The Kt algorithm successively merges pairs of nearby objects in order of increasing relative transverse momentum. A single parameter "D" determines when this merging stops

Changed:
<
<
("D" characterizes the size of the resulting jets). There are two modes of the Kt algorithms: inclusive mode and exculsive mode.
>
>
("D" characterizes the size of the resulting jets). There are two modes of the Kt algorithms: inclusive mode and exclusive mode.
 
const JetCollection* JetList ;
Line: 271 to 348
 JetCollection::const_iterator jetItrE = _JetList->end() ;
Changed:
<
<

Tau objects:

>
>

Tau objects

 
// --- Retrieve taujet list ---
Line: 285 to 362
 
  • Algorithm 1: tauRec for which one can extract the following information:
Variable Implementation
Changed:
<
<
(*tauItr)->parameter(TauJetParameters::logLikelihoodRatio)
(*tauItr)->parameter(TauJetParameters::lowPtTauJetDiscriminant)
(*tauItr)->parameter(TauJetParameters::lowPtTauEleDiscriminant)
(*tauItr)->parameter(TauJetParameters::tauJetNeuralnetwork)
(*tauItr)->parameter(TauJetParameters::tauENeuralNetwork)
>
>
(*tauItr)->parameter(TauJetParameters::logLikelihoodRatio)
(*tauItr)->parameter(TauJetParameters::lowPtTauJetDiscriminant)
(*tauItr)->parameter(TauJetParameters::lowPtTauEleDiscriminant)
(*tauItr)->parameter(TauJetParameters::tauJetNeuralnetwork)
(*tauItr)->parameter(TauJetParameters::tauENeuralNetwork)
 
  • Algorithm 2: tau1P3P
Variable Implementation
Changed:
<
<
(*tauItr)->parameter(TauJetParameters::annularIsolationFraction)
(*tauItr)->parameter(TauJetParameters::etCaloAtEMScale)
(*tauItr)->parameter(TauJetParameters::etChargedHadCells )
(*tauItr)->parameter(TauJetParameters::etOtherEMCells )
(*tauItr)->parameter(TauJetParameters::etOtherHadCells)
(*tauItr)->parameter(TauJetParameters::discriminant )
>
>
(*tauItr)->parameter(TauJetParameters::annularIsolationFraction)
(*tauItr)->parameter(TauJetParameters::etCaloAtEMScale)
(*tauItr)->parameter(TauJetParameters::etChargedHadCells )
(*tauItr)->parameter(TauJetParameters::etOtherEMCells )
(*tauItr)->parameter(TauJetParameters::etOtherHadCells)
(*tauItr)->parameter(TauJetParameters::discriminant )
 

The algorithms are accessed through the following method:

Line: 307 to 384
 author = (*tauItr)->author(); // can be either TauJetParameters::tauRec or TauJetParameters::tau1P3P
Changed:
<
<

MissingEt objects:

>
>

MissingEt objects

To get information about missing ET measurement, you need to extract the MissingET object with the key: "MET_RefFinal" at the ctor level:
 
Added:
>
>
declareProperty("MissingEtObject_RefFinal", _missingEtObjectNameRefFinal = "MET_RefFinal");
 
Added:
>
>
And then, in the execute() method, add the following lines:

// --- Retrieve missing ET RefFinal object ---
sc = _storeGate->retrieve(_etMissRefFinal,_missingEtObjectNameRefFinal);
MissingEt = _etMissRefFinal->sumet();
 

Modify the Z_Analysis package, Display and Analyse the Results (Histograms and Ntuples using Root)

Changed:
<
<

Extract the PID Efficiency:

>
>

Modify the code

The actual code loops over objects in the electron container and makes combinations of two objects. The invariant mass of each pair is then stored in the root file.
Changed:
<
<
Your assignament is to include more variables into the tree, try to clean the invariant mass plot using a combination of cuts. The files to modify are:
>
>
Your assignment is to include more variables into the tree, try to clean the invariant mass plot using a combination of cuts. The files to modify are:
 
  • Z_Analysis/src/Z_Analysis.cxx
  • Z_Analysis/Z_Analysis/Z_Analysis.h
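For reference, the invariant mass the code computes reduces to the following standalone sketch, using plain (px, py, pz, E) four-vectors in GeV (the function name is illustrative, not taken from the package):

```python
import math

def inv_mass(p1, p2):
    """Invariant mass of two four-vectors given as (px, py, pz, E) tuples."""
    px, py, pz, e = (a + b for a, b in zip(p1, p2))
    m2 = e * e - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))  # clamp against rounding below zero

# Two back-to-back 45.6 GeV electrons (electron mass neglected)
# reconstruct to a mass near the Z pole.
m = inv_mass((45.6, 0.0, 0.0, 45.6), (-45.6, 0.0, 0.0, 45.6))
print(m)
```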
Added:
>
>

Extract the PID Efficiency

 

Session 4: Running jobs on the grid with GANGA and pathena/PANDA

Note: following this session will require you to have a valid grid certificate.
Line: 327 to 416
 rather than copying the data files locally which is impractical due to the size of AODs. It should be kept in mind that at the nominal luminosity, tens or hundreds of terabytes of AOD files will be produced in a single year. There are two ways of sending jobs to the grid: via GANGA and pathena (also called PANDA). Roughly speaking, GANGA is
Changed:
<
<
the framework for the European part of the grid while PANDA is its American counterpart. Ganga and PANDA are independent framworks. However, Europe-based users can still send jobs to PANDA and in fact sometimes this may be the preferred option, for example in case the required
>
>
the framework for the European part of the grid while PANDA is its American counterpart. Ganga and PANDA are independent frameworks. However, Europe-based users can still send jobs to PANDA, and in fact sometimes this may be the preferred option, for example when the required
 datasets are available at BNL and CERN only.

We start by discussing GANGA.

Revision 162008-12-07 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 61 to 61
 Then, run it:
 ./checkFile.py AOD.029110._00001.pool.root.1 
Added:
>
>

Start Running a simple job (Z(ee) reconstruction):

Now that we know the objects of interest, we will use the Z --> ee MC sample and reconstruct the resonance invariant mass. You can copy the code from ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/scratch0/IctpTutorial/14.2.21. Then follow these steps to configure and compile the code:
  • In Z_Analysis/cmt, do the following:
    cmt config
    source setup.sh
    make
       
  • In Z_Analysis/run, do the following:
    athena jobOptions_Z_Analysis.py
       
The output would be a root file which contains a tree with one leaf representing the Z invariant mass. thumbs up

Session 2: Basic Tools used in a Physics Analysis (1)

Monte-Carlo Information:

Trigger:

Session 3: Basic Tools Used in a Physics Analysis (2)

Basic Objects (Outputs of the reconstruction):

 
Deleted:
<
<

Session 2: Basic Objects used in a Physics Analysis

 At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets which would help you understand the process of retrieving the object's properties/reconstruction variables. In general, for each object we can access the kinematic information namely (charge is available for all charged objects):
Variable Implementation
Line: 82 to 107
 

then retrieve different basic objects, which are defined below:

Changed:
<
<

egamma objects:

>
>

egamma objects:

 These are objects identified in the Liquid Argon calorimeter. They are a mixture of electrons and photons. One can distinguish between the two species using the information that the inner detector would provide namely track-shower matching information. Therefore and based on this information, egamma objects are further separated into photon and electron collections.

  • electrons: the electron objects, which are stored in the ElectronContainer, satisfy very loose criteria and can be accessed in the following way.
Line: 161 to 186
 const EMShower* p_EMShower = (*photItr)->detail(_egDetailContainerName);
Changed:
<
<

Muon objects:

>
>

Muon objects:

 There are a variety of muon identification algorithms which led to two different muon containers: Stacomuons and Muidmuons.

The Stacomuons (from StacoMuonCollection) are muon candidates found by combining the information from the Inner Detector (ID) and MuonSpectrometer (MS) at the Interaction Point (IP). The packages involved are:

Line: 224 to 249
 
  • Standalone muons: isStandAloneMuon()
  • LowPt muons: isLowPtReconstructedMuon()
Changed:
<
<

Jet objects:

>
>

Jet objects:

In general there are two main jet algorithms: Cone and Kt algorithms. These two algorithms have completely different ways of associating CaloClusters to jets. Jets are formed from nearby objects, where nearby refers to a distance. This distance can be either the angular distance delta_R=sqrt(delta_eta**2 + delta_phi**2) for Cone algorithms or the relative transverse momentum K_T for the Kt algorithms. The Cone algorithm successively merges pairs of nearby objects within a cone of size delta_R in order of decreasing pt. To avoid double counting of energy, a merging/splitting
Line: 246 to 271
 JetCollection::const_iterator jetItrE = _JetList->end() ;
Changed:
<
<

Tau objects:

>
>

Tau objects:

 
Added:
>
>
// --- Retrieve taujet list ---
const Analysis::TauJetContainer *_tauList ;
sc=_storeGate->retrieve( _tauList, _taujetContainerName);
TauJetContainer::const_iterator tauItr = _tauList->begin();
TauJetContainer::const_iterator tauItrE = _tauList->end();
Changed:
<
<
For Taus, there are different reconstruction algorithms, where each algorithm has its own parameters which can be used to identify taus. For example we have:
>
>
For Taus, there are different reconstruction algorithms, where each algorithm has its own tau-identification parameters, for example we have:
 
Changed:
<
<
  • Algorithm 1: tauRec
>
>
  • Algorithm 1: tauRec for which one can extract the following information:
Variable Implementation
(*tauItr)->parameter(TauJetParameters::logLikelihoodRatio)
(*tauItr)->parameter(TauJetParameters::lowPtTauJetDiscriminant)
(*tauItr)->parameter(TauJetParameters::lowPtTauEleDiscriminant)
(*tauItr)->parameter(TauJetParameters::tauJetNeuralnetwork)
(*tauItr)->parameter(TauJetParameters::tauENeuralNetwork)
 
  • Algorithm 2: tau1P3P
Added:
>
>
Variable Implementation
(*tauItr)->parameter(TauJetParameters::annularIsolationFraction)
(*tauItr)->parameter(TauJetParameters::etCaloAtEMScale)
(*tauItr)->parameter(TauJetParameters::etChargedHadCells )
(*tauItr)->parameter(TauJetParameters::etOtherEMCells )
(*tauItr)->parameter(TauJetParameters::etOtherHadCells)
(*tauItr)->parameter(TauJetParameters::discriminant )
 
Added:
>
>
The algorithms are accessed through the following method:
 
Deleted:
<
<
ntrk = (*tauItr)->numTrack();
 author = (*tauItr)->author(); // can be either TauJetParameters::tauRec or TauJetParameters::tau1P3P
Deleted:
<
<
//--/ Additional tau-identification parameters that are algorithm-dependent if( (*tauItr)->author() == TauJetParameters::tauRec){ // tauRec tauRec_logLikelihoodRatio = (*tauItr)->parameter(TauJetParameters::logLikelihoodRatio) ; tauRec_lowPtTauJetDiscriminant = (*tauItr)->parameter(TauJetParameters::lowPtTauJetDiscriminant) ; tauRec_lowPtTauEleDiscriminant = (*tauItr)->parameter(TauJetParameters::lowPtTauEleDiscriminant) ; tauRec_tauJetNeuralnetwork = (*tauItr)->parameter(TauJetParameters::tauJetNeuralnetwork) ; tauRec_tauENeuralNetwork = (*tauItr)->parameter(TauJetParameters::tauENeuralNetwork) ; } if( (*tauItr)->author() == TauJetParameters::tau1P3P){ // Tau1P3P tau1p3p_annularIsolationFraction = (*tauItr)->parameter(TauJetParameters::annularIsolationFraction) ; tau1p3p_etCaloAtEMScale = (*tauItr)->parameter(TauJetParameters::etCaloAtEMScale) ; tau1p3p_etChargedHadCells = (*tauItr)->parameter(TauJetParameters::etChargedHadCells ) ; tau1p3p_etOtherEMCells = (*tauItr)->parameter(TauJetParameters::etOtherEMCells ) ; tau1p3p_etOtherHadCells = (*tauItr)->parameter(TauJetParameters::etOtherHadCells) ; tau1p3p_discriminant = (*tauItr)->parameter(TauJetParameters::discriminant ) ; }
 
Changed:
<
<

Session 3: Run a Job, Display and Analyse the Results (Histograms and Ntuples using Root)

Start Running a simple job (Z(ee) reconstruction):

>
>

MissingEt objects:

 
Deleted:
<
<
Now, that we know the objects of interests, we will use the Z -->
ee MC sample and reconstruct the resonance invariant mass. You can copy the following code ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/scratch0/IctpTutorial/14.2.21. Then follow the following steps to configure and compile the code:
  • In Z_Analysis/cmt, do the following:
    cmt config
    source setup.sh
    make
       
  • In Z_Analysis/run, do the following:
    athena jobOptions_Z_Analysis.py
       
The output would be a root file which contains a tree with one leaf representing the Z invariant mass. thumbs up
 
Changed:
<
<

Modify the source code:

>
>

Modify the Z_Analysis package, Display and Analyse the Results (Histograms and Ntuples using Root)

Extract the PID Efficiency:

The actual code loops over objects in the electron container and makes combinations of two objects. The invariant mass of each pair is then stored in the root file. Your assignment is to include more variables in the tree; try to clean the invariant mass plot using a combination of cuts. The files to modify are:
  • Z_Analysis/src/Z_Analysis.cxx

Revision 152008-12-06 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 76 to 76
 
charge (*Itr)->charge()
Changed:
<
<
The basic objects are:
>
>
And before retrieving the different objects, you need to get a handle on the store gate (_storeGate):
StatusCode sc = service("StoreGateSvc", _storeGate);

then retrieve different basic objects, which are defined below:

 

egamma objects:

These are objects identified in the Liquid Argon calorimeter. They are a mixture of electrons and photons. One can distinguish between the two species using the information that the inner detector would provide namely track-shower matching information. Therefore and based on this information, egamma objects are further separated into photon and electron collections.

  • electrons: the electron objects, which are stored in the ElectronContainer, satisfy very loose criteria and can be accessed in the following way.
Added:
>
>
const ElectronContainer *_elecList ;
sc=_storeGate->retrieve( _elecList, _electronContainerName);
ElectronContainer::const_iterator elecItr = _elecList->begin();
ElectronContainer::const_iterator elecItrE = _elecList->end();
Line: 118 to 125
 
  • photons: the photon objects, which are stored in the Photon Container, satisfy very loose criteria and can be accessed in the following way.
Added:
>
>
const PhotonContainer *_photonList ;
sc=_storeGate->retrieve( _photonList, _photonContainerName);
PhotonContainer::const_iterator photItr = _photonList->begin();
PhotonContainer::const_iterator photItrE = _photonList->end();
Line: 167 to 177
 
  • MUIDCombined: this algorithm performs a global fit of all hits associated to tracks, unlike STACO which statistically merges the two independently found tracks.
Added:
>
>
const Analysis::MuonContainer *_muidMuonList ;
sc=_storeGate->retrieve( _muidMuonList, _muidMuonContainerName);
MuonContainer::const_iterator muonItr = _muidMuonList->begin();
MuonContainer::const_iterator muonItrE = _muidMuonList->end();
OR
Added:
>
>
const Analysis::MuonContainer *_stacoMuonList ;
sc=_storeGate->retrieve( _stacoMuonList, _stacoMuonContainerName);
MuonContainer::const_iterator muonItr = _stacoMuonList->begin();
MuonContainer::const_iterator muonItrE = _stacoMuonList->end();
Line: 220 to 234
 ("D" characterizes the size of the resulting jets). There are two modes of the Kt algorithms: inclusive mode and exculsive mode.
Changed:
<
<
JetCollection::const_iterator jetItr = currentJetList->begin() ; JetCollection::const_iterator jetItrE = currentJetList->end() ;
>
>
const JetCollection* JetList ;
// --- Retrieve the cone jet list ---
sc=_storeGate->retrieve( _jetList, _jetContainerName);
// --- OR Retrieve the cone4 jet list ---
sc=_storeGate->retrieve( _jetList, _jet4ContainerName);
// --- OR Retrieve the kt jet list ---
sc=_storeGate->retrieve( _jetList, _jetkContainerName);

JetCollection::const_iterator jetItr = _JetList->begin() ;
JetCollection::const_iterator jetItrE = _JetList->end() ;

 

Tau objects:

Revision 142008-12-03 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 10 to 10
 We will cover the following main areas:
  • Setup the directory structure.
  • Setup a particular version of the ATLAS software releases.
Deleted:
<
<
  • A brief explanation of the process to generate particular MC samples.
 
  • Understand various physics analysis objects (how to access more information about each object).
  • Use the physics objects in an AOD analysis.
  • Use the GRID (PAnda, Ganga) to generate MC samples, analyze data, etc...
Line: 56 to 55
 
  • dq2-get: used to retrieve the data over the GRID and store it locally.
For more information about the DQ2 tools, you can look at the following page: Atlas DDM page
Changed:
<
<
Let's look for a particular dataset
>
>
Let's look for a particular dataset: valid1.005144.PythiaZee.recon.AOD.e322_s412_r583; this is a validation dataset. Now, to see what containers are stored in the dataset, one can use the python file "checkFile.py", which can be obtained using get_files:
 get_files checkFile.py  
. Then, run it:
 ./checkFile.py AOD.029110._00001.pool.root.1 
 

Session 2: Basic Objects used in a Physics Analysis

At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets which would help you understand the process of retrieving the object's properties/reconstruction variables. In general, for each object we can access the kinematic information namely (charge is available for all charged objects):
Line: 258 to 257
 

Session 3: Run a Job, Display and Analyse the Results (Histograms and Ntuples using Root)

Changed:
<
<

Start Running a simple job:

>
>

Start Running a simple job (Z(ee) reconstruction):

 
Added:
>
>
Now that we know the objects of interest, we will use the Z --> ee MC sample and reconstruct the resonance invariant mass. You can copy the code from ~chaouki/scratch0/IctpTutorial/14.2.21/Z_Analysis to your area $HOME/scratch0/IctpTutorial/14.2.21. Then follow these steps to configure and compile the code:
  • In Z_Analysis/cmt, do the following:
    cmt config
    source setup.sh
    make
       
  • In Z_Analysis/run, do the following:
    athena jobOptions_Z_Analysis.py
       
The output would be a root file which contains a tree with one leaf representing the Z invariant mass.
 thumbs up
Added:
>
>

Modify the source code:

The actual code loops over objects in the electron container and makes combinations of two objects. The invariant mass of each pair is then stored in the root file. Your assignment is to include more variables in the tree; try to clean the invariant mass plot using a combination of cuts. The files to modify are:
  • Z_Analysis/src/Z_Analysis.cxx
  • Z_Analysis/Z_Analysis/Z_Analysis.h
 

Session 4: Running jobs on the grid with GANGA and pathena/PANDA

Note: following this session will require you to have a valid grid certificate.

Revision 132008-12-01 - KerimSuruliz

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 262 to 262
  thumbs up
Changed:
<
<

Session 4:

>
>

Session 4: Running jobs on the grid with GANGA and pathena/PANDA

Note: following this session will require you to have a valid grid certificate.

Having successfully tested the analysis code on a local AOD file, the code may be run on large datasets on the grid. The ATLAS policy is that jobs should be run on the data over the grid so that the job goes to the data where it's stored, rather than copying the data files locally which is impractical due to the size of AODs. It should be kept in mind that at the nominal luminosity, tens or hundreds of terabytes of AOD files will be produced in a single year. There are two ways of sending jobs to the grid: via GANGA and pathena (also called PANDA). Roughly speaking, GANGA is the framework for the European part of the grid while PANDA is its American counterpart. Ganga and PANDA are independent framworks. However, Europe-based users can still send jobs to PANDA and in fact sometimes this may be the preferred option, for example in case the required datasets are available at BNL and CERN only.

We start by discussing GANGA.

Running jobs on the grid with Ganga

First set up Athena in the usual way. Then from the run directory of UserAnalysis type:

source /afs/cern.ch/sw/ganga/install/etc/setup-atlas.sh  (or .csh)
ganga

Ganga is now set up and running - it can be exited with Ctrl-D. In order to run our job we have to write a small script which tells Ganga which job options to use, which dataset to run on, how many subjobs to split the job into and which site to execute on. A simple example of a Ganga script is:

j = Job()
j.name='MyGridAnalysis'
j.application=Athena()
j.application.exclude_from_user_area=["*.o","*.root*","*.exe"]
j.application.prepare(athena_compile=False)
j.application.atlas_release='14.2.21'
j.application.option_file='$HOME/testarea/14.2.21/PhysicsAnalysis/AnalysisCommon/UserAnalysis/share/AnalysisSkeleton_topOptions_localAOD.py'
j.application.max_events='1000000' 
j.splitter=AthenaSplitterJob()
j.splitter.numsubjobs=20
j.inputdata=DQ2Dataset()
j.inputdata.dataset="mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r541"
j.inputdata.type='DQ2_LOCAL'

j.outputdata=DQ2OutputDataset()
j.outputdata.outputdata=['AnalysisSkeleton.aan.root']
j.outputdata.datasetname='Zjets_v14_test'

j.backend=LCG()
j.backend.requirements=AtlasLCGRequirements()
j.backend.requirements.sites= ['NAPOLI']
j.submit()
The important parts are:
j.application.max_events
which specifies how many events to run on,
j.splitter.numsubjobs
which indicates how many subjobs the job is split into - remember that running over 100,000 events takes a very long time on a single grid node, so it is preferable to split the job into (for example) 20 subjobs, each of which will then run over 5000 events. The lines
j.outputdata=DQ2OutputDataset()
j.outputdata.outputdata=['AnalysisSkeleton.aan.root']
j.outputdata.datasetname='Zjets_v14_test'
tell ganga to save the output in a dq2 dataset, which will be called
user08.<yourUserName>.Zjets_v14_test
Lastly, we specify which Tier-2 site to run on in the line:
j.backend.requirements.sites= ['NAPOLI']

Now copy/paste this script into your favourite editor and save the file as

gangascript.py
in the
run
subdirectory of
UserAnalysis
. Then run ganga as before and once it's started up, type:
execfile('gangascript.py')

You can now type

jobs
and there will be one job, whose status will eventually change from running to completed. In case of problems, the job will fail. This may be due to a number of reasons, the most frequent being buggy code or the grid site not having been set up properly, in which case you should try sending the job to a different location.

Once the job completes successfully, we can retrieve the output by setting up dq2 and issuing the following command:

dq2-get -H /tmp/<yourUserName> user08.<yourUserName>.Zjets_v14_test

dq2 will now retrieve the

AnalysisSkeleton.aan.root
files from each individual subjob and put them in the local /tmp/ directory. The log files from the run, including the athena log files for each separate subjob, may be found in the directory
$HOME/UserName/Local/<job number>/output/
.

More information on GANGA can be found at https://twiki.cern.ch/twiki/bin/view/Atlas/GangaTutorial5.

Running jobs on the grid with pathena/PANDA

First set up Athena as usual. Then type:
cd ${HOME}/scratch0/IctpTutorial/14.2.21
cmt co PhysicsAnalysis/DistributedAnalysis/PandaTools
cd PhysicsAnalysis/DistributedAnalysis/PandaTools/cmt
cmt config
source setup.sh
make
rehash  # (if you are using zsh/csh/tcsh)

pathena should be set up now. Typing

pathena
should result in:
ERROR : no outDS
   pathena [--inDS input] --outDS output myJobO.py

Every time you log in and set up Athena from then on, the pathena command should be recognised automatically.

Now, to run a job with pathena, we run the command:

pathena --split 20 --inDS mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r541 --outDS user08.<yourUserName>.ttbar_v14_pathenatest AnalysisSkeleton_topOptions_localAOD.py

By default, jobs are sent to the BNL site, ANALY_BNL_ATLAS_1. This can be changed by adding a

--site=SITE
option. The number of subjobs is controlled by the
--split
option. To control how many input files (and hence events) each subjob processes, use the
--nFilesPerJob
option.

There are two ways to monitor the progress of PANDA jobs. One is to use the pandamonitor website, http://gridui02.usatlas.bnl.gov:25880/server/pandamon/query. The other is to run

pathena_util
Then issuing the command
show()
gives a list of jobs, which nodes they were assigned to, their JobID and other useful information. Now the command
status()
may be used to see the status of an individual job.

Conveniently, PANDA sends an email out to your CERN address once the job execution is finished, with the status (completed/failed) and the names of input and output datasets.

More help on PANDA is available at http://gridui02.usatlas.bnl.gov:25880/server/pandamon/query.

  -- ChaoukiB - 24 Nov 2008

Revision 122008-11-29 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 139 to 139
 
Et (Cone 0.20) p_EMShower->parameter(egammaParameters::etcone20)
Et (Cone 0.30) p_EMShower->parameter(egammaParameters::etcone30)
Et (Cone 0.40) p_EMShower->parameter(egammaParameters::etcone40)
Deleted:
<
<
Et (Cone 0.20) p_EMShower->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) p_EMShower->parameter(egammaParameters::etconoise30)
 

  • Et isolation in a ring 0.1<DeltaR< D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
Line: 186 to 184
 fChi2 = (*muonItr)->fitChi2(); mChi2 = (*muonItr)->matchChi2() ; bestM = (*muonItr)->bestMatch();
Deleted:
<
<
coneiso20 = (*muonItr)->parameter(MuonParameters::nucone20) ; coneiso30 = (*muonItr)->parameter(MuonParameters::nucone30) ; coneiso40 = (*muonItr)->parameter(MuonParameters::nucone40) ; etiso20 = (*muonItr)->parameter(MuonParameters::etcone20) ; etiso30 = (*muonItr)->parameter(MuonParameters::etcone30) ; etiso40 = (*muonItr)->parameter(MuonParameters::etcone40) ; etiso10 = (*muonItr)->parameter(MuonParameters::etcone10) ; coneiso10 = (*muonItr)->parameter(MuonParameters::nucone10) ;
 
Added:
>
>
Where one can get the following calorimeter isolation information for muons:

  • Et isolation:
Variable Implementation
Et (Cone 0.10) (*muonItr)->parameter(MuonParameters::etcone10)
Et (Cone 0.20) (*muonItr)->parameter(MuonParameters::etcone20)
Et (Cone 0.30) (*muonItr)->parameter(MuonParameters::etcone30)
Et (Cone 0.40) (*muonItr)->parameter(MuonParameters::etcone40)

  • Et isolation in a ring DeltaR < D (0.1 or 0.2 or 0.3 or 0.4), above 3 sigma of total noise:
Et (Cone 0.10) (*muonItr)->parameter(egammaParameters::etconoise10)
Et (Cone 0.20) (*muonItr)->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) (*muonItr)->parameter(egammaParameters::etconoise30)
Et (Cone 0.40) (*muonItr)->parameter(egammaParameters::etconoise40)
 Muons are divided into three categories:
  • Combined muons: isCombinedMuon()
  • Standalone muons: isStandAloneMuon()
Line: 221 to 231
 TauJetContainer::const_iterator tauItr = _tauList->begin(); TauJetContainer::const_iterator tauItrE = _tauList->end();
Changed:
<
<
where one can extract all possible information for example:
>
>
For taus, there are different reconstruction algorithms; each algorithm has its own parameters which can be used to identify taus. For example we have:

  • Algorithm 1: tauRec
  • Algorithm 2: tau1P3P
 
ntrk   = (*tauItr)->numTrack();
author = (*tauItr)->author(); // can be either TauJetParameters::tauRec or TauJetParameters::tau1P3P

Revision 112008-11-29 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 92 to 92
 piWeight = (*elecItr)->egammaID(egammaPID::BgWeight) ;
Changed:
<
<
The EM/Track matching information is given as follows:
>
>
From the EM/Track matching, one can access the following information:
 
Changed:
<
<
E/p trkmatch->parameter(egammaParameters::EoverP)
const EMTrackMatch* trkmatch = (*elecItr)->detail<EMTrackMatch>(_trkMatchContainerName);
>
>
  • E/p and Et isolation:
Variable Implementation
E/p trkmatch->parameter(egammaParameters::EoverP)
Et (Cone 0.45) trkmatch->parameter(egammaParameters::etcone)
Et (Cone 0.20) trkmatch->parameter(egammaParameters::etcone20)
Et (Cone 0.30) trkmatch->parameter(egammaParameters::etcone30)
Et (Cone 0.40) trkmatch->parameter(egammaParameters::etcone40)
Et (Cone 0.20) trkmatch->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) trkmatch->parameter(egammaParameters::etconoise30)
 
Changed:
<
<
eoverp = trkmatch->parameter(egammaParameters::EoverP) ;
>
>
  • Et isolation in a ring 0.1< DeltaR < D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
Et (Cone 0.20) trkmatch->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) trkmatch->parameter(egammaParameters::etconoise30)
 
Changed:
<
<
etcone = trkmatch->parameter(egammaParameters::etcone) ; etcone20 = trkmatch->parameter(egammaParameters::etcone20) ; etcone30 = trkmatch->parameter(egammaParameters::etcone30) ; etcone40 = trkmatch->parameter(egammaParameters::etcone40) ; etconoise20 = trkmatch->parameter(egammaParameters::etconoise20) ; etconoise30 = trkmatch->parameter(egammaParameters::etconoise30) ;
>
>
where trkmatch is obtained as follows:
const EMTrackMatch* trkmatch = (*elecItr)->detail<EMTrackMatch>(_trkMatchContainerName);
 

  • photons: the photon objects, which are stored in the Photon Container, satisfy very loose criteria and can be accessed in the following way.
Line: 121 to 130
 piwgt = (*photItr)->egammaID(egammaPID::BgWeight) ;
Added:
>
>
From the EM, one can access the following information:

  • E/p and Et isolation:
Variable Implementation
Et (Cone 0.45) p_EMShower->parameter(egammaParameters::etcone)
Et (Cone 0.20) p_EMShower->parameter(egammaParameters::etcone20)
Et (Cone 0.30) p_EMShower->parameter(egammaParameters::etcone30)
Et (Cone 0.40) p_EMShower->parameter(egammaParameters::etcone40)
Et (Cone 0.20) p_EMShower->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) p_EMShower->parameter(egammaParameters::etconoise30)

  • Et isolation in a ring 0.1<DeltaR< D (0.2 or 0.3), above 3 sigma of total noise, available from rel 14.0.0:
Et (Cone 0.20) p_EMShower->parameter(egammaParameters::etconoise20)
Et (Cone 0.30) p_EMShower->parameter(egammaParameters::etconoise30)

where p_EMShower is obtained from:

 
const EMShower* p_EMShower = (*photItr)->detail<EMShower>(_egDetailContainerName);
Deleted:
<
<
/* Isolation energy (transverse) in a cone with half-opening angle 0.45 */ etcone = p_EMShower->parameter(egammaParameters::etcone) ; etcone20 = p_EMShower->parameter(egammaParameters::etcone20) ; etcone30 = p_EMShower->parameter(egammaParameters::etcone30) ; etcone40 = p_EMShower->parameter(egammaParameters::etcone40) ; /* isolation transverse energy in a ring 0.1<Dr<0.2, above 3 sigma of total noise, available from rel 14.0.0 */ etconoise20 = p_EMShower->parameter(egammaParameters::etconoise20) ; etconoise30 = p_EMShower->parameter(egammaParameters::etconoise30) ;
 

Muon objects:

Revision 102008-11-28 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 59 to 59
 Let's look for a particular dataset

Session 2: Basic Objects used in a Physics Analysis

Changed:
<
<
At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets which would help you understand the process of retrieving the object's properties/reconstruction variables. The basic objects are:
>
>
At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets that will help you understand how to retrieve each object's properties and reconstruction variables. In general, for each object we can access the following kinematic information (charge is available for all charged objects):
Variable Implementation
px (*Itr)->hlv().x()
py (*Itr)->hlv().y()
pz (*Itr)->hlv().z()
energy (*Itr)->hlv().t()
pT (*Itr)->hlv().perp()
eta (*Itr)->hlv().eta()
phi (*Itr)->hlv().phi()
charge (*Itr)->charge()

The basic objects are:

 

egamma objects:

These are objects identified in the Liquid Argon calorimeter. They are a mixture of electrons and photons. One can distinguish between the two species using information provided by the inner detector, namely track-shower matching. Based on this information, egamma objects are further separated into photon and electron collections.
Line: 69 to 84
 ElectronContainer::const_iterator elecItrE = _elecList->end(); where one can extract all possible information for example:
Deleted:
<
<
px = (*elecItr)->hlv().x() ;
py = (*elecItr)->hlv().y() ;
pz = (*elecItr)->hlv().z() ;
e = (*elecItr)->hlv().t() ;
pt = (*elecItr)->hlv().perp() ;
eta = (*elecItr)->hlv().eta() ;
phi  = (*elecItr)->hlv().phi() ;
charge = (*elecItr)->charge() ;
 
Added:
>
>
 elecAuthor = (*elecItr)->author(); IsEM = (*elecItr)->isem() ; emWeight = (*elecItr)->egammaID(egammaPID::ElectronWeight) ;
Line: 108 to 115
  where one can extract all possible information for example:
Deleted:
<
<
px = (*photItr)->hlv().x() ; py = (*photItr)->hlv().y() ; pz = (*photItr)->hlv().z() ; e = (*photItr)->hlv().t() ; pt = (*photItr)->hlv().perp() ; eta = (*photItr)->hlv().eta() ; phi = (*photItr)->hlv().phi() ;
 author = (*photItr)->author() ; IsEM = (*photItr)->pid()->isEM() ; emwgt = (*photItr)->egammaID(egammaPID::ElectronWeight) ;
Line: 159 to 159
  where one can extract all possible information for example:
Deleted:
<
<
px = (*muonItr)->hlv().x() ; py = (*muonItr)->hlv().y() ; pz = (*muonItr)->hlv().z() ; e = (*muonItr)->hlv().t() ; pt = (*muonItr)->hlv().perp() ; eta = (*muonItr)->hlv().eta() ; phi = (*muonItr)->hlv().phi() ; charge = (*muonItr)->charge() ;
 author = (*muonItr)->author() ; fChi2OverDoF = (*muonItr)->fitChi2OverDoF() ; mChi2OverDoF = (*muonItr)->matchChi2OverDoF() ;
Line: 201 to 193
 JetCollection::const_iterator jetItr = currentJetList->begin() ; JetCollection::const_iterator jetItrE = currentJetList->end() ;
Deleted:
<
<
where one can extract all possible information for example:
px = (*jetItr)->hlv().x() ;
py = (*jetItr)->hlv().y() ;
pz = (*jetItr)->hlv().z() ;
e   = (*jetItr)->hlv().t() ;
pt  = (*jetItr)->hlv().perp() ;
eta = (*jetItr)->hlv().eta() ;
phi = (*jetItr)->hlv().phi() ;
 

Tau objects:

Line: 220 to 202
  where one can extract all possible information for example:
Deleted:
<
<
px = (*tauItr)->hlv().x() ; py = (*tauItr)->hlv().y() ; pz = (*tauItr)->hlv().z() ; e = (*tauItr)->hlv().t() ; pt = (*tauItr)->hlv().perp() ; eta = (*tauItr)->hlv().eta() ; phi = (*tauItr)->hlv().phi() ;
 ntrk = (*tauItr)->numTrack();
Deleted:
<
<
charge = (*tauItr)->charge() ;
 author = (*tauItr)->author(); // can be either TauJetParameters::tauRec or TauJetParameters::tau1P3P //--/ Additional tau-identification parameters that are algorithm-dependent if( (*tauItr)->author() == TauJetParameters::tauRec){ // tauRec

Revision 92008-11-28 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 86 to 86
 

The EM/Track matching information is given as follows:

Added:
>
>
E/p trkmatch->parameter(egammaParameters::EoverP)
 
const EMTrackMatch* trkmatch = (*elecItr)->detail<EMTrackMatch>(_trkMatchContainerName);
Line: 116 to 119
 IsEM = (*photItr)->pid()->isEM() ; emwgt = (*photItr)->egammaID(egammaPID::ElectronWeight) ; piwgt = (*photItr)->egammaID(egammaPID::BgWeight) ;
Added:
>
>
const EMShower* p_EMShower = (*photItr)->detail<EMShower>(_egDetailContainerName); /* Isolation energy (transverse) in a cone with half-opening angle 0.45 */ etcone = p_EMShower->parameter(egammaParameters::etcone) ;

Revision 82008-11-28 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 65 to 63
 

egamma objects:

These are objects identified in the Liquid Argon calorimeter. They are a mixture of electrons and photons. One can distinguish between the two species using information provided by the inner detector, namely track-shower matching. Based on this information, egamma objects are further separated into photon and electron collections.
Changed:
<
<
  • electrons
>
>
  • electrons: the electron objects, which are stored in the ElectronContainer, satisfy very loose criteria and can be accessed in the following way.
 
ElectronContainer::const_iterator elecItr  = _elecList->begin();
ElectronContainer::const_iterator elecItrE = _elecList->end();
Changed:
<
<
for (; elecItr = elecItrE; ++elecItr) {
>
>
where one can extract all possible information for example:
  px = (*elecItr)->hlv().x() ; py = (*elecItr)->hlv().y() ; pz = (*elecItr)->hlv().z() ;
Line: 78 to 78
  eta = (*elecItr)->hlv().eta() ; phi = (*elecItr)->hlv().phi() ;
Deleted:
<
<
/** Electron Track and Electron Cluster */ const Rec::TrackParticle * track = (*elecItr)->trackParticle(); const CaloCluster * cluster = (*elecItr)->cluster(); const EMTrackMatch* trkmatch = (*elecItr)->detail(_trkMatchContainerName);
  charge = (*elecItr)->charge() ; elecAuthor = (*elecItr)->author(); IsEM = (*elecItr)->isem() ;
Added:
>
>
emWeight = (*elecItr)->egammaID(egammaPID::ElectronWeight) ; piWeight = (*elecItr)->egammaID(egammaPID::BgWeight) ;

The EM/Track matching information is given as follows:

const EMTrackMatch* trkmatch = (*elecItr)->detail<EMTrackMatch>(_trkMatchContainerName);
 
Deleted:
<
<
if ( trkmatch ) {
  eoverp = trkmatch->parameter(egammaParameters::EoverP) ; etcone = trkmatch->parameter(egammaParameters::etcone) ; etcone20 = trkmatch->parameter(egammaParameters::etcone20) ;
Line: 94 to 96
  etcone40 = trkmatch->parameter(egammaParameters::etcone40) ; etconoise20 = trkmatch->parameter(egammaParameters::etconoise20) ; etconoise30 = trkmatch->parameter(egammaParameters::etconoise30) ;
Deleted:
<
<
emWeight = (*elecItr)->egammaID(egammaPID::ElectronWeight) ; piWeight = (*elecItr)->egammaID(egammaPID::BgWeight) ; } if ( track ) { z0vtx = track->measuredPerigee()->localPosition()[Trk::z0] ; d0vtx = track->measuredPerigee()->localPosition()[Trk::d0] ; } }
 
Changed:
<
<
  • photons
>
>
  • photons: the photon objects, which are stored in the Photon Container, satisfy very loose criteria and can be accessed in the following way.
 
PhotonContainer::const_iterator photItr  = _photonList->begin();
PhotonContainer::const_iterator photItrE = _photonList->end();
Changed:
<
<
for (; photItr = photItrE; ++photItr) { const EMShower* p_EMShower = (*photItr)->detail(_egDetailContainerName);
>
>
where one can extract all possible information for example:
  px = (*photItr)->hlv().x() ; py = (*photItr)->hlv().y() ; pz = (*photItr)->hlv().z() ;
Line: 120 to 114
  phi = (*photItr)->hlv().phi() ; author = (*photItr)->author() ; IsEM = (*photItr)->pid()->isEM() ;
Changed:
<
<
etcone = p_EMShower->parameter(egammaParameters::etcone) ; // isolation energy (transverse) in a cone with half-opening angle 0.45
>
>
emwgt = (*photItr)->egammaID(egammaPID::ElectronWeight) ; piwgt = (*photItr)->egammaID(egammaPID::BgWeight) ; const EMShower* p_EMShower = (*photItr)->detail<EMShower>(_egDetailContainerName); /* Isolation energy (transverse) in a cone with half-opening angle 0.45 */ etcone = p_EMShower->parameter(egammaParameters::etcone) ;
  etcone20 = p_EMShower->parameter(egammaParameters::etcone20) ; etcone30 = p_EMShower->parameter(egammaParameters::etcone30) ; etcone40 = p_EMShower->parameter(egammaParameters::etcone40) ;
Changed:
<
<
etconoise20 = p_EMShower->parameter(egammaParameters::etconoise20) ; //isolation transverse energy in a ring 0.1<Dr<0.2, above 3 sigma of total noise, available from rel 14.0.0
>
>
/* isolation transverse energy in a ring 0.1<Dr<0.2, above 3 sigma of total noise, available from rel 14.0.0 */ etconoise20 = p_EMShower->parameter(egammaParameters::etconoise20) ;
  etconoise30 = p_EMShower->parameter(egammaParameters::etconoise30) ;
Deleted:
<
<
emwgt = (*photItr)->egammaID(egammaPID::ElectronWeight) ; piwgt = (*photItr)->egammaID(egammaPID::BgWeight) ; }
 

Muon objects:

Added:
>
>
There are a variety of muon identification algorithms, which lead to two different muon containers: Stacomuons and Muidmuons.

The Stacomuons (from StacoMuonCollection) are muon candidates found by combining the information from the Inner Detector (ID) and MuonSpectrometer (MS) at the Interaction Point (IP). The packages involved are:

  • Muonboy: a muon spectrometer "standalone" track reconstruction code.
  • MuTag: an algorithm to tag low-pT muons (it starts from the ID tracks at the IP).
  • STACO: STAtistical COmbination of the independently found ID and MS tracks.

MuidMuons (from MuidMuonCollection) are muon candidates found by a global refit of the hits from the ID and the MS. These muons are found by the following packages:

  • MOORE (Muon Object Oriented REconstruction): A track fit is performed on the collection of hits recorded and a separate package (MuIDStandalone) is used to provide back propagation of the MOORE track through the calorimeter to the IP.
  • MuGirl (similar to MuTag): it associates an inner detector track with the muon spectrometer. It uses a pattern recognition algorithm based on Hough transforms and incorporates reasonable assumptions about MDT low-level performance.
  • MUIDCombined: this algorithm performs a global fit of all hits associated to tracks, unlike STACO which statistically merges the two independently found tracks.
 
MuonContainer::const_iterator muonItr  = _muidMuonList->begin();
MuonContainer::const_iterator muonItrE = _muidMuonList->end();
Added:
>
>
  OR
Added:
>
>
 MuonContainer::const_iterator muonItr = _stacoMuonList->begin(); MuonContainer::const_iterator muonItrE = _stacoMuonList->end();
Changed:
<
<
for (; muonItr = muonItrE; ++muonItr) {
>
>
where one can extract all possible information for example:
  px = (*muonItr)->hlv().x() ; py = (*muonItr)->hlv().y() ; pz = (*muonItr)->hlv().z() ;
Line: 147 to 160
  pt = (*muonItr)->hlv().perp() ; eta = (*muonItr)->hlv().eta() ; phi = (*muonItr)->hlv().phi() ;
Deleted:
<
<
/** Muon Author - hight or low Pt reconstruction algorithms */ double muonAuthor = -1.0; if ( (*muonItr)->isHighPt() && (*muonItr)->hasMuonExtrapolatedTrackParticle() ) muonAuthor = 0.0; else if ( (*muonItr)->isHighPt() && !(*muonItr)->hasMuonExtrapolatedTrackParticle() ) muonAuthor = 1.0; else if ( (*muonItr)->isLowPt() ) muonAuthor = 2.0; /** Muon Status - standalone, combined, lowPt */ double muonStatus = -1.0; if ( (*muonItr)->hasCombinedMuonTrackParticle() && !(*muonItr)->bestMatch() ) muonStatus = 0.0; // has a combined track but it is not the best matched else if ( (*muonItr)->isCombinedMuon() ) muonStatus = 1.0; // combined muons else if ( (*muonItr)->isStandAloneMuon() ) muonStatus = 2.0; // standalone muon else if ( (*muonItr)->isLowPtReconstructedMuon() ) muonStatus = 3.0; // lowPt reconsrtructed muon

/** Muon PoverP and access to the muon associated trackParticles You can also access the associated Inner TrackParticle in case of lowPt reconstructed or combined Track */ if ( (*muonItr)->hasCombinedMuonTrackParticle() ) { const Rec::TrackParticle * muonSpectrometerTrack = (*muonItr)->muonSpectrometerTrackParticle(); onst Rec::TrackParticle * combinedMuonTrack = (*muonItr)->combinedMuonTrackParticle(); const Rec::TrackParticle * extrapolatedTrack = (*muonItr)->muonExtrapolatedTrackParticle(); z0vtx = (*muonItr)->combinedMuonTrackParticle()->measuredPerigee()->localPosition()[Trk::z0] ; d0vtx = (*muonItr)->combinedMuonTrackParticle()->measuredPerigee()->localPosition()[Trk::d0] ; }

  charge = (*muonItr)->charge() ; author = (*muonItr)->author() ;
Deleted:
<
<
  fChi2OverDoF = (*muonItr)->fitChi2OverDoF() ; mChi2OverDoF = (*muonItr)->matchChi2OverDoF() ; fChi2 = (*muonItr)->fitChi2();
Line: 189 to 175
  etiso40 = (*muonItr)->parameter(MuonParameters::etcone40) ; etiso10 = (*muonItr)->parameter(MuonParameters::etcone10) ; coneiso10 = (*muonItr)->parameter(MuonParameters::nucone10) ;
Deleted:
<
<
hascombi = (*muonItr)->hasCombinedMuon() ; hasidtp = (*muonItr)->hasInDetTrackParticle() ; hasmetp = (*muonItr)->hasMuonExtrapolatedTrackParticle() ; hascmtp = (*muonItr)->hasCombinedMuonTrackParticle() ; }
 
Added:
>
>
Muons are divided into three categories:
  • Combined muons: isCombinedMuon()
  • Standalone muons: isStandAloneMuon()
  • LowPt muons: isLowPtReconstructedMuon()
 

Jet objects:

In general there are two main jet algorithms: Cone and Kt algorithms. These two algorithms have completely different ways of associating CaloClusters to jets. Jets are formed from nearby objects, where "nearby" refers to a distance. This distance can be either the angular distance delta_R=sqrt(delta_eta**2 + delta_phi**2) for Cone algorithms or the relative transverse momentum K_T for Kt algorithms.
Line: 208 to 194
 
JetCollection::const_iterator jetItr = currentJetList->begin() ;
JetCollection::const_iterator jetItrE = currentJetList->end() ;
Changed:
<
<
for (; jetItr = jetItrE; ++jetItr) { const Jet * jet = (*jetItr); px[ijet] = (*jetItr)->hlv().x() ; py[ijet] = (*jetItr)->hlv().y() ; pz[ijet] = (*jetItr)->hlv().z() ; e[ijet] = (*jetItr)->hlv().t() ; pt[ijet] = (*jetItr)->hlv().perp() ; eta[ijet] = (*jetItr)->hlv().eta() ; phi[ijet] = (*jetItr)->hlv().phi() ; }
>
>
where one can extract all possible information for example:
px = (*jetItr)->hlv().x() ;
py = (*jetItr)->hlv().y() ;
pz = (*jetItr)->hlv().z() ;
e   = (*jetItr)->hlv().t() ;
pt  = (*jetItr)->hlv().perp() ;
eta = (*jetItr)->hlv().eta() ;
phi = (*jetItr)->hlv().phi() ;
 

Tau objects:

Line: 225 to 211
 
TauJetContainer::const_iterator tauItr  = _tauList->begin();
TauJetContainer::const_iterator tauItrE = _tauList->end();
Changed:
<
<
for (; tauItr = tauItrE; ++tauItr) {
>
>
where one can extract all possible information for example:
  px = (*tauItr)->hlv().x() ; py = (*tauItr)->hlv().y() ; pz = (*tauItr)->hlv().z() ;
Line: 254 to 240
  tau1p3p_etOtherHadCells = (*tauItr)->parameter(TauJetParameters::etOtherHadCells) ; tau1p3p_discriminant = (*tauItr)->parameter(TauJetParameters::discriminant ) ; }
Deleted:
<
<
}
 

Session 3: Run a Job, Display and Analyse the Results (Histograms and Ntuples using Root)

Revision 72008-11-28 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 197 to 197
 

Jet objects:

Added:
>
>
In general there are two main jet algorithms: Cone and Kt algorithms. These two algorithms have completely different ways of associating CaloClusters to jets. Jets are formed from nearby objects, where "nearby" refers to a distance. This distance can be either the angular distance delta_R=sqrt(delta_eta**2 + delta_phi**2) for Cone algorithms or the relative transverse momentum K_T for Kt algorithms. The Cone algorithm successively merges pairs of nearby objects within a cone of size delta_R in order of decreasing pt. To avoid double counting of energy, a merging/splitting step is employed; even so, Cone algorithms are neither infrared nor collinear safe.

The Kt algorithm successively merges pairs of nearby objects in order of increasing relative transverse momentum. A single parameter "D" determines when this merging stops ("D" characterizes the size of the resulting jets). There are two modes of the Kt algorithm: inclusive mode and exclusive mode.

 
JetCollection::const_iterator jetItr = currentJetList->begin() ;
JetCollection::const_iterator jetItrE = currentJetList->end() ;

Revision 62008-11-27 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Line: 59 to 59
 For more information about the DQ2 tools, you can look at the following page: Atlas DDM page

Let's look for a particular dataset

Added:
>
>

Session 2: Basic Objects used in a Physics Analysis

At the AOD level, the user will be able to access the basic information about each object needed for physics analysis. Below, we show code snippets which would help you understand the process of retrieving the object's properties/reconstruction variables. The basic objects are:

egamma objects:

These are objects identified in the Liquid Argon calorimeter. They are a mixture of electrons and photons. One can distinguish between the two species using information provided by the inner detector, namely track-shower matching. Based on this information, egamma objects are further separated into photon and electron collections.

  • electrons
ElectronContainer::const_iterator elecItr  = _elecList->begin();
ElectronContainer::const_iterator elecItrE = _elecList->end();
for (; elecItr != elecItrE; ++elecItr) {
   px = (*elecItr)->hlv().x() ;
   py = (*elecItr)->hlv().y() ;
   pz = (*elecItr)->hlv().z() ;
   e = (*elecItr)->hlv().t() ;
   pt = (*elecItr)->hlv().perp() ;
   eta = (*elecItr)->hlv().eta() ;
   phi  = (*elecItr)->hlv().phi() ;

   /** Electron Track and Electron Cluster */
    const Rec::TrackParticle * track = (*elecItr)->trackParticle();
    const CaloCluster * cluster = (*elecItr)->cluster();
    const EMTrackMatch* trkmatch = (*elecItr)->detail<EMTrackMatch>(_trkMatchContainerName);
    charge = (*elecItr)->charge() ;
    elecAuthor = (*elecItr)->author(); 
    IsEM = (*elecItr)->isem() ;

    if ( trkmatch ) {
      eoverp = trkmatch->parameter(egammaParameters::EoverP) ;
      etcone = trkmatch->parameter(egammaParameters::etcone) ;
      etcone20 = trkmatch->parameter(egammaParameters::etcone20) ;
      etcone30 = trkmatch->parameter(egammaParameters::etcone30) ;
      etcone40 = trkmatch->parameter(egammaParameters::etcone40) ;
      etconoise20 = trkmatch->parameter(egammaParameters::etconoise20) ;
      etconoise30 = trkmatch->parameter(egammaParameters::etconoise30) ;
      emWeight = (*elecItr)->egammaID(egammaPID::ElectronWeight) ;
      piWeight = (*elecItr)->egammaID(egammaPID::BgWeight) ;
    }
    if ( track ) {
      z0vtx = track->measuredPerigee()->localPosition()[Trk::z0] ;
      d0vtx = track->measuredPerigee()->localPosition()[Trk::d0] ;
    }
}

  • photons
PhotonContainer::const_iterator photItr  = _photonList->begin();
PhotonContainer::const_iterator photItrE = _photonList->end();

for (; photItr != photItrE; ++photItr) {
   const EMShower* p_EMShower = (*photItr)->detail<EMShower>(_egDetailContainerName);
   px    = (*photItr)->hlv().x() ;
   py    = (*photItr)->hlv().y() ;
   pz    = (*photItr)->hlv().z() ;
   e     = (*photItr)->hlv().t() ;
   pt    = (*photItr)->hlv().perp() ;
   eta   = (*photItr)->hlv().eta() ;
   phi   = (*photItr)->hlv().phi() ;
   author   = (*photItr)->author() ;
   IsEM     =  (*photItr)->pid()->isEM() ;
   etcone   =  p_EMShower->parameter(egammaParameters::etcone) ; //  isolation energy (transverse) in a cone with half-opening angle 0.45 
   etcone20 =  p_EMShower->parameter(egammaParameters::etcone20) ;
   etcone30 =  p_EMShower->parameter(egammaParameters::etcone30) ;
   etcone40 =  p_EMShower->parameter(egammaParameters::etcone40) ;
   etconoise20 =  p_EMShower->parameter(egammaParameters::etconoise20) ; //isolation transverse energy in a ring 0.1<Dr<0.2, above 3 sigma of total noise, available from rel 14.0.0
   etconoise30 =  p_EMShower->parameter(egammaParameters::etconoise30) ;
   emwgt    =  (*photItr)->egammaID(egammaPID::ElectronWeight) ;
   piwgt    =  (*photItr)->egammaID(egammaPID::BgWeight) ;
}

Muon objects:

MuonContainer::const_iterator muonItr  = _muidMuonList->begin();
MuonContainer::const_iterator muonItrE = _muidMuonList->end();
       OR
MuonContainer::const_iterator muonItr  = _stacoMuonList->begin();
MuonContainer::const_iterator muonItrE = _stacoMuonList->end();

for (; muonItr != muonItrE; ++muonItr) {
   px       = (*muonItr)->hlv().x() ;
   py       = (*muonItr)->hlv().y() ;
   pz       = (*muonItr)->hlv().z() ;
   e        = (*muonItr)->hlv().t() ;
   pt       = (*muonItr)->hlv().perp() ;
   eta      = (*muonItr)->hlv().eta() ;
   phi      = (*muonItr)->hlv().phi() ;

   /** Muon Author - high or low Pt reconstruction algorithms */
   double muonAuthor = -1.0;
   if ( (*muonItr)->isHighPt() && (*muonItr)->hasMuonExtrapolatedTrackParticle() ) muonAuthor = 0.0;
   else if ( (*muonItr)->isHighPt() && !(*muonItr)->hasMuonExtrapolatedTrackParticle() ) muonAuthor = 1.0;
   else if ( (*muonItr)->isLowPt() ) muonAuthor = 2.0;
   /** Muon Status - standalone, combined, lowPt */
   double muonStatus = -1.0;
   if ( (*muonItr)->hasCombinedMuonTrackParticle() && !(*muonItr)->bestMatch() ) 
      muonStatus = 0.0; // has a combined track but it is not the best matched
   else if ( (*muonItr)->isCombinedMuon() ) 
       muonStatus = 1.0; // combined muons
   else if ( (*muonItr)->isStandAloneMuon() )
       muonStatus = 2.0; // standalone muon
   else if ( (*muonItr)->isLowPtReconstructedMuon() ) 
       muonStatus = 3.0; // lowPt reconstructed muon

   /** Access to the muon's associated TrackParticles. The associated Inner Detector
        TrackParticle is also available for lowPt reconstructed or combined muons */
   if ( (*muonItr)->hasCombinedMuonTrackParticle() ) {
     const Rec::TrackParticle * muonSpectrometerTrack = (*muonItr)->muonSpectrometerTrackParticle();
     const Rec::TrackParticle * combinedMuonTrack     = (*muonItr)->combinedMuonTrackParticle();
     const Rec::TrackParticle * extrapolatedTrack     = (*muonItr)->muonExtrapolatedTrackParticle();
     z0vtx  =  (*muonItr)->combinedMuonTrackParticle()->measuredPerigee()->localPosition()[Trk::z0] ;
     d0vtx  =  (*muonItr)->combinedMuonTrackParticle()->measuredPerigee()->localPosition()[Trk::d0] ;
   }
   charge   = (*muonItr)->charge() ;
   author   = (*muonItr)->author() ;

   fChi2OverDoF     =  (*muonItr)->fitChi2OverDoF() ;
   mChi2OverDoF     = (*muonItr)->matchChi2OverDoF() ;
   fChi2     = (*muonItr)->fitChi2();
   mChi2     = (*muonItr)->matchChi2() ;
   bestM   = (*muonItr)->bestMatch();
   coneiso20 =  (*muonItr)->parameter(MuonParameters::nucone20) ;
   coneiso30 =  (*muonItr)->parameter(MuonParameters::nucone30) ;
   coneiso40 =  (*muonItr)->parameter(MuonParameters::nucone40) ;
   etiso20   =  (*muonItr)->parameter(MuonParameters::etcone20) ;
   etiso30   =  (*muonItr)->parameter(MuonParameters::etcone30) ;
   etiso40   =  (*muonItr)->parameter(MuonParameters::etcone40) ;
   etiso10   =  (*muonItr)->parameter(MuonParameters::etcone10) ;
   coneiso10 =  (*muonItr)->parameter(MuonParameters::nucone10) ;
   hascombi =  (*muonItr)->hasCombinedMuon() ;
   hasidtp  =  (*muonItr)->hasInDetTrackParticle() ;
   hasmetp  =  (*muonItr)->hasMuonExtrapolatedTrackParticle() ;
   hascmtp  =  (*muonItr)->hasCombinedMuonTrackParticle() ;
}

Jet objects:

JetCollection::const_iterator jetItr = currentJetList->begin() ;
JetCollection::const_iterator jetItrE = currentJetList->end() ;
for (; jetItr != jetItrE; ++jetItr) {
   const Jet * jet = (*jetItr);
   px[ijet]    = (*jetItr)->hlv().x() ;
   py[ijet]    = (*jetItr)->hlv().y() ;
   pz[ijet]    = (*jetItr)->hlv().z() ;
   e[ijet]     = (*jetItr)->hlv().t() ;
   pt[ijet]    = (*jetItr)->hlv().perp() ;
   eta[ijet]   = (*jetItr)->hlv().eta() ;
   phi[ijet] = (*jetItr)->hlv().phi() ;
}

Tau objects:

TauJetContainer::const_iterator tauItr  = _tauList->begin();
TauJetContainer::const_iterator tauItrE = _tauList->end();
for (; tauItr != tauItrE; ++tauItr) {
   px     = (*tauItr)->hlv().x() ;
   py     = (*tauItr)->hlv().y() ;
   pz     = (*tauItr)->hlv().z() ;
   e      = (*tauItr)->hlv().t() ;
   pt     = (*tauItr)->hlv().perp() ;
   eta    = (*tauItr)->hlv().eta() ;
   phi    = (*tauItr)->hlv().phi() ;
   ntrk   = (*tauItr)->numTrack();
   charge = (*tauItr)->charge() ;
   author = (*tauItr)->author(); // can be either TauJetParameters::tauRec or TauJetParameters::tau1P3P

   /// Additional tau-identification parameters that are algorithm-dependent

   if( (*tauItr)->author() == TauJetParameters::tauRec){  // tauRec
      tauRec_logLikelihoodRatio       = (*tauItr)->parameter(TauJetParameters::logLikelihoodRatio) ;
      tauRec_lowPtTauJetDiscriminant  = (*tauItr)->parameter(TauJetParameters::lowPtTauJetDiscriminant) ;
      tauRec_lowPtTauEleDiscriminant  = (*tauItr)->parameter(TauJetParameters::lowPtTauEleDiscriminant) ;
      tauRec_tauJetNeuralnetwork      = (*tauItr)->parameter(TauJetParameters::tauJetNeuralnetwork) ;
      tauRec_tauENeuralNetwork        = (*tauItr)->parameter(TauJetParameters::tauENeuralNetwork) ;
    }
    if( (*tauItr)->author()  == TauJetParameters::tau1P3P){  // Tau1P3P
      tau1p3p_annularIsolationFraction=  (*tauItr)->parameter(TauJetParameters::annularIsolationFraction) ;
      tau1p3p_etCaloAtEMScale         =  (*tauItr)->parameter(TauJetParameters::etCaloAtEMScale) ;         
      tau1p3p_etChargedHadCells       =  (*tauItr)->parameter(TauJetParameters::etChargedHadCells ) ;      
      tau1p3p_etOtherEMCells          =  (*tauItr)->parameter(TauJetParameters::etOtherEMCells ) ;         
      tau1p3p_etOtherHadCells         =  (*tauItr)->parameter(TauJetParameters::etOtherHadCells) ;         
      tau1p3p_discriminant            =  (*tauItr)->parameter(TauJetParameters::discriminant ) ;           
    }
}

Session 3: Run a Job, Display and Analyse the Results (Histograms and Ntuples using Root)

 

Start Running a simple job:

thumbs up

Deleted:
<
<

Session 2: Basic Objects used in a Physics Analysis

Session 3: Display and Analyse the Results (Histograms and Ntuples using ROOT)

 

Session 4:

-- ChaoukiB - 24 Nov 2008

Revision 32008-11-27 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Added:
>
>
 

Scope of the Tutorial:

The goal of this tutorial is to make sure that by the end of the workshop, the participants will be able to run ATLAS analysis on their own. There are two important requirements that need to be satisfied before the start of the tutorial, namely:

Line: 16 to 18
 
  • Use the GRID (PAnda, Ganga) to generate MC samples, analyze data, etc...

Session 1:

Changed:
<
<

Setup your work area

>
>

Setup your work area on lxplus

  In your scratch directory (${HOME}/scratch0/):
  • Create a directory called cmthome: " mkdir cmthome"
Line: 46 to 48
  TIP this command needs to be done every time you open a new shell/xterm.

Browse/Retrieve Datasets over the GRID

Changed:
<
<
roll eyes (sarcastic) Although, some of you do not have GRID, we will present the necessary steps to browse and retrieve the data.
>
>
Although some of you do not have a GRID certificate roll eyes (sarcastic) , we will present the necessary steps to browse and retrieve the data. Once the GRID certificate is granted (ATLAS VO registration), it should be placed in the directory:
${HOME}/.globus
After installing the certificate thumbs up , you can initiate a proxy session by typing:
grid-proxy-init
Next, you need to set up the environment required to use the DQ2 tools for browsing/retrieving data over the GRID, namely:
source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
Now you are ready to view ATLAS datasets, which are stored in multiple locations on the GRID. The most useful commands are:
  • dq2-ls: used to browse a particular dataset or datasets which contain particular string, example: dq2-ls '*.PythiaZee.*'
  • dq2-get: used to retrieve the data over the GRID and store it locally.
For more information about the DQ2 tools, you can look at the following page: Atlas DDM page
 

Start Running a simple job:

thumbs up

Changed:
<
<

Session 2: "Being Written"

>
>

Session 2:

Session 3:

Session 4:

  -- ChaoukiB - 24 Nov 2008

Revision 22008-11-26 - ChaoukiB

Line: 1 to 1
 
META TOPICPARENT name="ChaoukiB"
Changed:
<
<
The goal of this tutorial is to make sure that by the end of the workshop, the participants will be able to run ATLAS analysis on their own. There are two important requirements that need to be satisfied before the start of the tutorial, namely:
>
>

Scope of the Tutorial:

The goal of this tutorial is to make sure that by the end of the workshop, the participants will be able to run ATLAS analysis on their own. There are two important requirements that need to be satisfied before the start of the tutorial, namely:

 
  • Have an account on Lxplus.
  • Have a valid grid certificate to be able to run analysis using the GRID.
Line: 15 to 15
 
  • Use the physics objects in an AOD analysis.
  • Use the GRID (PAnda, Ganga) to generate MC samples, analyze data, etc...
Added:
>
>

Session 1:

Setup your work area

In your scratch directory (${HOME}/scratch0/):

  • Create a directory called cmthome: mkdir cmthome
  • Create a directory called IctpTutorial: mkdir IctpTutorial
  • Create a directory under IctpTutorial which corresponds to the ATLAS release being used: mkdir IctpTutorial/14.2.21. This will hold your Algorithms and any checked-out code.
  • Copy the requirements file from my home area:
    cp ~chaouki/scratch0/cmthome/requirement ${HOME}/scratch0/cmthome/
    The requirements file defines the Athena release version together with the necessary properties:
      
       set CMTSITE CERN
       set SITEROOT /afs/cern.ch
       macro ATLAS_DIST_AREA ${SITEROOT}/atlas/software/dist
       # use optimised version by default 
       macro ATLAS_TEST_AREA    "${HOME}/scratch0/IctpTutorial/14.2.21" \
       14.2.21            "${HOME}/scratch0/IctpTutorial/14.2.21"
       use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)
       
  • Copy and run the script cmt_version.sh, which sets up CMT.
       cp ~chaouki/scratch0/cmthome/cmt_version.sh ${HOME}/scratch0/cmthome/
       cd ${HOME}/scratch0/cmthome/ 
       source cmt_version.sh
       
    TIP this command needs to be done only once.
  • Finally copy and run the script run_setup.sh to setup the release environment
       cp ~chaouki/scratch0/cmthome/run_setup.sh ${HOME}/scratch0/cmthome/
       source run_setup.sh 
       
    TIP this command needs to be done every time you open a new shell/xterm.

Browse/Retrieve Datasets over the GRID

roll eyes (sarcastic) Although some of you do not have GRID certificates, we will present the necessary steps to browse and retrieve the data.

Start Running a simple job:

thumbs up

Session 2: "Being Written"

  -- ChaoukiB - 24 Nov 2008

Revision 12008-11-24 - ChaoukiB

Line: 1 to 1
Added:
>
>
META TOPICPARENT name="ChaoukiB"
The goal of this tutorial is to make sure that by the end of the workshop, the participants will be able to run ATLAS analysis on their own. There are two important requirements that need to be satisfied before the start of the tutorial, namely:

  • Have an account on Lxplus.
  • Have a valid grid certificate to be able to run analysis using the GRID.

We will cover the following main areas:

  • Setup the directory structure.
  • Setup a particular version of the ATLAS software releases.
  • A brief explanation of the process to generate particular MC samples.
  • Understand various physics analysis objects (how to access more information about each object).
  • Use the physics objects in an AOD analysis.
  • Use the GRID (PAnda, Ganga) to generate MC samples, analyze data, etc...

-- ChaoukiB - 24 Nov 2008

 