NTUPtoNTUPOld (2012-08-13, MichiruKaneda)
%CERTIFY%
---+!! <nop>%TOPIC%

%TOC%

%STARTINCLUDE%
---+ Introduction (17.2.3.1)

This page describes how to skim/slim NTUP (!D3PD) files in athena, with a view to running jobs in the central production system of ATLAS. See also the tutorial [[https://twiki.cern.ch/twiki/bin/viewauth/Atlas/SoftwareTutorialAnalyzingD3PDsInAthena][SoftwareTutorialAnalyzingD3PDsInAthena]] for the details of reading/writing !D3PDs in athena.

<!--
   * Set inputNTUP_SMWZFile = root://eosatlas//eos/atlas/atlaslocalgroupdisk/scratch/mkaneda/data/valid1.106046.PythiaZee_no_filter.merge.NTUP_SMWZ.e815_s1272_s1274_r3397_p948_tid752536_00/NTUP_SMWZ.752536._000068.root.1
   * Set inputNTUP_SMWZFile_2 = root://eosatlas//eos/atlas/atlaslocalgroupdisk/scratch/mkaneda/data/valid1.105200.T1_McAtNlo_Jimmy.merge.NTUP_SMWZ.e835_s1310_s1300_r3397_p948_tid752537_00/NTUP_SMWZ.752537._000074.root.1
-->

---+ Athena setup

Please follow these [[https://twiki.cern.ch/twiki/bin/viewauth/Atlas/SoftwareTutorialSoftwareBasics][instructions]]. On this page we use release AtlasPhysics-17.2.3.7.1.
Make a test area like:
<pre>
mkdir -p $HOME/testarea/17.2.3.7.1
cd $HOME/testarea/17.2.3.7.1
export AtlasSetup=/afs/cern.ch/atlas/software/dist/AtlasSetup
alias asetup='source $AtlasSetup/scripts/asetup.sh'
asetup AtlasPhysics,17.2.3.7.1, here
</pre>

---+ Checking out core package and example package

Check out and make the packages !NTUPtoNTUPCore and !NTUPtoNTUPExample:
<pre>
cd $TestArea
cmt co -r NTUPtoNTUPCore-00-00-01 PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPCore
cmt co -r NTUPtoNTUPExample-00-00-03 PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample
setupWorkArea.py
cd WorkArea/cmt/
cmt broadcast cmt config
cmt broadcast source setup.sh
cmt broadcast make
</pre>

There are two different methods of skimming/slimming:
   * *NTUPtoNTUP method*
      * Uses the root-reading interface in athena (AthenaRootComps).
      * Main script: !NTUPtoNTUPCore/scripts/NTUPtoNTUP_trf.py
      * Outputs are defined in: !NTUPtoNTUPCore/python/NTUPtoNTUPProdFlags.py
      * Can use many athena infrastructures for monitoring, debugging, etc.
      * Known problems:
         * !CutFlowTree can't be stored
         * It can't run on recent data
   * *SkimNTUP method*
      * Uses a non-athena event loop.
      * Main script: !NTUPtoNTUPCore/scripts/SkimNTUP_trf.py
      * Outputs are defined in: !NTUPtoNTUPCore/python/SkimNTUPProdFlags.py

---+ Running example

---++ Running example 1: NTUPtoNTUP method, using jobOptions file
<pre>
mkdir -p $TestArea/run/test1
cd $TestArea/run/test1
get_files NTUPtoMyNTUP.py
athena NTUPtoMyNTUP.py >log 2>&1
</pre>
Check that myNtup.root has been created. "NTUPtoNTUPExample/MyNTUP_prodJobOFragment.py" is the key file that sets up the output file.
---++ Running example 2: NTUPtoNTUP method, using job transform script
<pre>
mkdir -p $TestArea/run/test2
cd $TestArea/run/test2
NTUPtoNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYNTUPFile=myNtup.root outputNTUP_MYNTUP2File=myNtup2.root >log 2>&1
</pre>
Check that there are two output files: myNtup.root and myNtup2.root. "NTUPtoNTUPExample/MyNTUP_prodJobOFragment.py" and "NTUPtoNTUPExample/MyNTUP2_prodJobOFragment.py" are the key files that set up the output files.

---++ Running example 3: SkimNTUP method, using job transform script
<pre>
mkdir -p $TestArea/run/testSkim1
cd $TestArea/run/testSkim1
SkimNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYSKIMNTUPFile=myTestNtup.root >log 2>&1
</pre>
The script file for skimming/slimming is !PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/python/skim.py, which contains the !PyROOT function doSkim().

The !JobOFragment file is !PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/python/MySkimNTUP_prodJobOFragment.py, which calls doSkim().

To add a new type, add a new definition to !PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/python/SkimNTUP_ProdFlags.py, as described below for !NTUPtoNTUP.
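The SkimNTUP method above is just a plain event loop over the input tree. As a rough illustration of the doSkim() logic only (this is not the actual skim.py, which iterates a ROOT TTree; the branch names and cuts here are hypothetical):

```python
# Illustrative sketch of a doSkim()-style event loop (hypothetical; the real
# code in NTUPtoNTUPExample loops over a ROOT TTree with PyROOT).

def do_skim(events, pt_min=20.0):
    """Keep events with at least two electrons above pt_min (GeV),
    and slim each kept event down to the electrons passing the cut."""
    selected = []
    for ev in events:
        # Slim: keep only the electrons passing the pt cut
        good_pt = [pt for pt in ev["el_pt"] if pt > pt_min]
        if len(good_pt) >= 2:  # skim: event-level selection
            selected.append({"el_n": len(good_pt), "el_pt": good_pt})
    return selected

# Toy input: only the first event has two electrons above 20 GeV
events = [
    {"el_n": 3, "el_pt": [45.0, 30.0, 10.0]},
    {"el_n": 2, "el_pt": [25.0, 12.0]},
]
print(do_skim(events))  # -> [{'el_n': 2, 'el_pt': [45.0, 30.0]}]
```

The same pattern (per-event cut deciding whether the event is written, plus per-object filtering of what is written) is what the real doSkim() implements over a TTree.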
---+ Tutorial for NTUPtoNTUP method

---++ Adding new output definition

Add your output definition at the bottom of !NTUPtoNTUPExample/python/NTUPtoNTUPProdFlags.py:
<div style="background-color:#eee; padding:.1em">
%CODE{"python"}%
class WriteMyTestNTUP (JobProperty):
    """test NTUP"""
    statusOn = True
    allowedTypes = ['bool']
    StoredValue = False
    StreamName = 'StreamNTUP_MYTESTNTUP'
    FileName = ''
    NTUPScript = "MyTestNTUP/MyTestNTUP_prodJobOFragment.py"
    TreeNames = ['physics']
    SubSteps = ['n2n']
prodFlags.add_JobProperty (WriteMyTestNTUP)
listAllKnownNTUPtoNTUP.append (prodFlags.WriteMyTestNTUP)
%ENDCODE%
</div>
   * !MyTestNTUP/MyTestNTUP_prodJobOFragment.py: the package name and job fragment file, which will be made in the next section
   * !TreeNames: tree name of the ntuple
   * !StreamName: stream name. It is also used to define the argument name for !NTUPtoNTUP_trf.py; the argument name is "outputNTUP_MYTESTNTUPFile" in this case.
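For orientation, the branch lists passed to AddItem in the job fragments below behave like ordered add/remove glob rules. A small pure-Python sketch of that matching semantics (illustrative only; the real logic lives in !MultipleNTUPStreamManager):

```python
import fnmatch

def select_branches(all_branches, items):
    """Resolve AddItem-style patterns: plain names and globs add branches,
    a leading '-' removes matching branches again (in order)."""
    selected = set()
    for item in items:
        if item.startswith("-"):
            selected -= set(fnmatch.filter(selected, item[1:]))
        else:
            selected |= set(fnmatch.filter(all_branches, item))
    return sorted(selected)

branches = ["el_n", "el_pt", "el_eta", "el_Etcone15", "el_Etcone20"]
items = ["el_n", "el_pt", "el_Etcone*", "-el_Etcone15"]
print(select_branches(branches, items))
# -> ['el_Etcone20', 'el_n', 'el_pt']
```

So "el_Etcone*" pulls in every matching branch and "-el_Etcone15" then removes that one branch from the output.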
---++ Preparing new package and job fragment file

Make a new package to construct your new ntuple:
<pre>
cd $TestArea
acmd.py cmt new-pkg Tutorial/MyTestNTUP
</pre>
Make the jobFragment file Tutorial/MyTestNTUP/share/MyTestNTUP_prodJobOFragment.py:
<pre>
cd Tutorial/MyTestNTUP/share/
</pre>
   * !MyTestNTUP_prodJobOFragment.py (a simple example that copies a few branches from the original ntuple)
<div style="background-color:#eee; padding:.1em">
%CODE{"python"}%
## This jobO should not be included more than once:
include.block( "MyTestNTUP/MyTestNTUP_prodJobOFragment.py" )
## Common import(s):
from AthenaCommon.AlgSequence import AlgSequence
topSequence = AlgSequence()
from AthenaCommon.JobProperties import jobproperties
prodFlags = jobproperties.NTUPtoNTUPProdFlags
from PrimaryDPDMaker.PrimaryDPDHelpers import buildFileName

mytestntup=prodFlags.WriteMyTestNTUP

## Set up a logger:
from AthenaCommon.Logging import logging
MyTestNTUPStream_msg = logging.getLogger( 'MyTestNTUP_prodJobOFragment' )

## Construct the stream and file names for the test NTUP:
streamName = mytestntup.StreamName
fileName = buildFileName( mytestntup )
MyTestNTUPStream_msg.info( "Configuring MyTestNTUP with streamName '%s' and fileName '%s'" % \
    ( streamName, fileName ) )

## Set the input tree:
from AthenaCommon.JobProperties import jobproperties
ntupFlags=jobproperties.NTUPtoNTUPProdFlags
tree_name=ntupFlags.TreeName()

## Create the NTUP streams:
from NTUPtoNTUPExample.MultipleNTUPStreamManager import MNSMgr
MyTestNTUPStream = MNSMgr.NewNTUPStream( streamName, fileName, tree_name)
MyTestNTUPStream.AddItem([
    "el_n",          # add el_n
    "el_pt",         # add el_pt
    "el_Etcone*",    # add any branches matching el_Etcone*
    "-el_Etcone15",  # remove el_Etcone15
    ])
%ENDCODE%
</div>
The branches in the new ntuple are defined by the line "MyTestNTUPStream.AddItem([...])".

Compile the package:
<pre>
cd ../cmt/
sed -i "s/apply_pattern component_library/#apply_pattern component_library/g" requirements # comment out the library requirement for the moment
make
</pre>
And test the new output:
<pre>
mkdir -p $TestArea/run/test3
cd $TestArea/run/test3
NTUPtoNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYTESTNTUPFile=myTestNtup.root >log 2>&1
</pre>
Check the output file:
<pre>
acmd.py dump-root filtered.myTestNtup.root -t physics
</pre>

---++ Skimming/Adding new branches

This section shows an example of a Z->ee selection:
   * Select only electrons with pt > 20 GeV
   * Select events with at least two electrons and 60 GeV < m_Z(ee) < 120 GeV

---+++ Making new algorithm to skim events and add new branches
<pre>
cd $TestArea/Tutorial/MyTestNTUP/src
#Make skeleton files (MyTestZeeAlg.h/cxx)
acmd.py gen-klass --klass MyTestZeeAlg --pkg MyTestNTUP --type alg -o MyTestZeeAlg
#make load file
mkdir -p components
cat >| components/MyTestNTUP_load.cxx << EOF
#include "GaudiKernel/LoadFactoryEntries.h"
LOAD_FACTORY_ENTRIES( MyTestNTUP )
EOF
#make entries file
cat >| components/MyTestNTUP_entries.cxx << EOF
#include "GaudiKernel/DeclareFactoryEntries.h"
#include "../MyTestZeeAlg.h"
DECLARE_ALGORITHM_FACTORY (MyTestZeeAlg)
DECLARE_FACTORY_ENTRIES( MyTestNTUP ) {
  DECLARE_ALGORITHM(MyTestZeeAlg)
}
EOF
</pre>

---+++ Preparing !D3PDObject

Prepare !D3PDObject code from the !D3PD:
<pre>
cd $TestArea
mkdir -p run/d3pd
cd run/d3pd
atl-gen-athena-d3pd-reader %inputNTUP_SMWZFile%
</pre>
The generated code is in code/*D3PDObject.cxx/h.
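The Z->ee selection above cuts on the dielectron invariant mass. For massless electrons the standard formula is m^2 = 2 pT1 pT2 (cosh(eta1-eta2) - cos(phi1-phi2)); a standalone Python check of that formula (illustrative only, not the MyTestZeeAlg code):

```python
import math

def m_ee(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass (GeV) of two massless electrons from pt/eta/phi."""
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

# Two back-to-back 45.6 GeV electrons at eta=0 give m = 2*45.6 = 91.2 GeV
m = m_ee(45.6, 0.0, 0.0, 45.6, 0.0, math.pi)
print(round(m, 1))  # -> 91.2
in_window = 60.0 < m < 120.0  # the event-level mass window of the example
```

This is the quantity the algorithm stores as "my_mZ" and cuts on with MZMin/MZMax.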
Copy !ElectronD3PDObject to the !MyTestNTUP package:
<pre>
cd $TestArea/Tutorial/MyTestNTUP/src
cp $TestArea/run/d3pd/code/ElectronD3PDObject* .
</pre>

---+++ Writing Z->ee algorithm
<pre>
cd $TestArea/Tutorial/MyTestNTUP/src
</pre>
Edit !MyTestZeeAlg.* like: [[%ATTACHURL%/MyTestZeeAlg.cxx][MyTestZeeAlg.cxx]], [[%ATTACHURL%/MyTestZeeAlg.h][MyTestZeeAlg.h]]
   * RVar/WVar is used to read/write a single variable from/to !StoreGate
   * el_* variables are retrieved as an !ElectronD3PDObject: const !ElectronD3PDObject el("el_"), and can be used like el.pt(i)
   * New electrons are stored as "myEl_", by using !ElectronD3PDObject
   * In the function execute(), the setFilterPassed function is used to decide whether the event should be passed or not.
Go to the share directory and edit !MyTestNTUP_prodJobOFragment.py:
<pre>
cd $TestArea/Tutorial/MyTestNTUP/share
</pre>
Add "myEl_*" and "my_mZ" to the !AddItem list and add the filter algorithm:
<div style="background-color:#eee; padding:.1em">
%CODE{"python"}%
...
MyTestNTUPStream.AddItem([
    "el_n",          # add el_n
    "el_pt",         # add el_pt
    "el_Etcone*",    # add any branches matching el_Etcone*
    "-el_Etcone15",  # remove el_Etcone15
    "myEl_*",        # new electrons
    "my_mZ",         # Z mass
    ])
## Algorithm for the filter and new variables
from MyTestNTUP.MyTestNTUPConf import MyTestZeeAlg
ZeeAlg=MyTestZeeAlg("MyTestZeeAlg",
    MinNumberPassed = 2,
    PtMin = 20.*GeV,
    MZMin = 60.*GeV,
    MZMax = 160.*GeV,
    )
topSequence+=ZeeAlg
MyTestNTUPStream.AddRequireAlgs(ZeeAlg.getName())
%ENDCODE%
</div>
Go to the cmt directory, update the requirements file ([[%ATTACHURL%/requirements][requirements]]), and make.
<pre>
cd $TestArea/Tutorial/MyTestNTUP/cmt
make
</pre>
And test the new output:
<pre>
mkdir -p $TestArea/run/test4
cd $TestArea/run/test4
NTUPtoNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYTESTNTUPFile=myTestNtup.root >log 2>&1
</pre>
In the log file you can find the result of the filter:
<pre>
MyTestZeeAlg         INFO Finalizing MyTestZeeAlg...
MyTestZeeAlg         INFO ******Summary******
MyTestZeeAlg         INFO Number of processed events: 10000
MyTestZeeAlg         INFO Number of passed events: 7287
MyTestZeeAlg         INFO Average number of electrons (all) : 4.5943
MyTestZeeAlg         INFO Average number of electrons (rem): 2.04556
MyTestZeeAlg         INFO *******************
</pre>
And check that the output contains the correct values for "myEl_" and "my_mZ".

---+ Introduction (17.2.1)

This section describes how to skim/slim NTUP (!D3PD) files in athena with a view to running jobs in the central production system of ATLAS, for the older release 17.2.1. See also the tutorial [[https://twiki.cern.ch/twiki/bin/viewauth/Atlas/SoftwareTutorialAnalyzingD3PDsInAthena][SoftwareTutorialAnalyzingD3PDsInAthena]] for the details of reading/writing !D3PDs in athena.

<!--
   * Set inputNTUP_SMWZFile = root://castoratlas//castor/cern.ch/grid/atlas/atlt3/scratch/mkaneda/data/valid1.106046.PythiaZee_no_filter.merge.NTUP_SMWZ.e815_s1272_s1274_r3397_p948_tid752536_00/NTUP_SMWZ.752536._000068.root.1
   * Set inputNTUP_SMWZFile_2 = root://castoratlas//castor/cern.ch/grid/atlas/atlt3/scratch/mkaneda/data/valid1.105200.T1_McAtNlo_Jimmy.merge.NTUP_SMWZ.e835_s1310_s1300_r3397_p948_tid752537_00/NTUP_SMWZ.752537._000074.root.1
-->

---+ Athena setup

Please follow these [[https://twiki.cern.ch/twiki/bin/viewauth/Atlas/SoftwareTutorialSoftwareBasics][instructions]]. In this section we use release 17.2.1.
Make a test area like:
<pre>
mkdir -p $HOME/testarea/17.2.1
cd $HOME/testarea/17.2.1
export AtlasSetup=/afs/cern.ch/atlas/software/dist/AtlasSetup
alias asetup='source $AtlasSetup/scripts/asetup.sh'
asetup 17.2.1, here
</pre>
In addition, check out !PyUtils-00-12-06 for the scripts used to make !D3PDObjects below:
<pre>
cmt co -r PyUtils-00-12-06 Tools/PyUtils
cd Tools/PyUtils/cmt
make
</pre>

---+ Checking out main package and running examples

Check out the main package !NTUPtoNTUPExample:
<pre>
cd $TestArea
cmt co -r NTUPtoNTUPExample-00-00-03 -o svn+ssh://svn.cern.ch/reps/atlasusr/mkaneda/atlasoff/ PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample
cd PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/cmt/
make
</pre>
Running example 1: using jobOptions file
<pre>
mkdir -p $TestArea/run/test1
cd $TestArea/run/test1
get_files NTUPtoMyNTUP.py
athena NTUPtoMyNTUP.py >log 2>&1
</pre>
Check that myNtup.root has been created.

Running example 2: using job transform script
<pre>
mkdir -p $TestArea/run/test2
cd $TestArea/run/test2
NTUPtoNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYNTUPFile=myNtup.root outputNTUP_MYNTUP2File=myNtup2.root >log 2>&1
</pre>
Check that there are two output files: myNtup.root and myNtup2.root.

In both cases, "NTUPtoNTUPExample/MyNTUP_prodJobOFragment.py" is the key file that sets up the output file (myNtup.root).
---+ Adding new output definition

Add your output definition at the bottom of !NTUPtoNTUPExample/python/NTUPtoNTUPProdFlags.py:
<sticky>
<div style="background-color:#eee; padding:.1em">
%CODE{"python"}%
class WriteMyTestNTUP (JobProperty):
    """test NTUP"""
    statusOn = True
    allowedTypes = ['bool']
    StoredValue = False
    StreamName = 'StreamNTUP_MYTESTNTUP'
    FileName = ''
    NTUPScript = "MyTestNTUP/MyTestNTUP_prodJobOFragment.py"
    TreeNames = ['physics']
    SubSteps = ['n2n']
prodFlags.add_JobProperty (WriteMyTestNTUP)
listAllKnownNTUPtoNTUP.append (prodFlags.WriteMyTestNTUP)
%ENDCODE%
</div></sticky>
   * !MyTestNTUP/MyTestNTUP_prodJobOFragment.py: the package name and job fragment file, which will be made in the next section
   * !TreeNames: tree name of the ntuple
   * !StreamName: stream name. It is also used to define the argument name for !NTUPtoNTUP_trf.py; the argument name is "outputNTUP_MYTESTNTUPFile" in this case.

---+ Preparing new package and job fragment file

Make a new package to construct your new ntuple:
<pre>
cd $TestArea
acmd.py cmt new-pkg Tutorial/MyTestNTUP
</pre>
Make the jobFragment file Tutorial/MyTestNTUP/share/MyTestNTUP_prodJobOFragment.py:
<pre>
cd Tutorial/MyTestNTUP/share/
</pre>
   * !MyTestNTUP_prodJobOFragment.py (a simple example that copies a few branches from the original ntuple)
<sticky>
<div style="background-color:#eee; padding:.1em">
%CODE{"python"}%
## This jobO should not be included more than once:
include.block( "MyTestNTUP/MyTestNTUP_prodJobOFragment.py" )
## Common import(s):
from AthenaCommon.AlgSequence import AlgSequence
topSequence = AlgSequence()
from AthenaCommon.JobProperties import jobproperties
prodFlags = jobproperties.NTUPtoNTUPProdFlags
from PrimaryDPDMaker.PrimaryDPDHelpers import buildFileName

mytestntup=prodFlags.WriteMyTestNTUP

## Set up a logger:
from AthenaCommon.Logging import logging
MyTestNTUPStream_msg = logging.getLogger( 'MyTestNTUP_prodJobOFragment' )

## Construct the stream and file names for the test NTUP:
streamName = mytestntup.StreamName
fileName = buildFileName( mytestntup )
MyTestNTUPStream_msg.info( "Configuring MyTestNTUP with streamName '%s' and fileName '%s'" % \
    ( streamName, fileName ) )

## Set the input tree:
from AthenaCommon.JobProperties import jobproperties
ntupFlags=jobproperties.NTUPtoNTUPProdFlags
tree_name=ntupFlags.TreeName()

## Create the NTUP streams:
from NTUPtoNTUPExample.MultipleNTUPStreamManager import MNSMgr
MyTestNTUPStream = MNSMgr.NewNTUPStream( streamName, fileName, tree_name)
MyTestNTUPStream.AddItem([
    "el_n",          # add el_n
    "el_pt",         # add el_pt
    "el_Etcone*",    # add any branches matching el_Etcone*
    "-el_Etcone15",  # remove el_Etcone15
    ])
%ENDCODE%
</div></sticky>
The branches in the new ntuple are defined by the line "MyTestNTUPStream.AddItem([...])".

Compile the package:
<pre>
cd ../cmt/
sed -i "s/apply_pattern component_library/#apply_pattern component_library/g" requirements # comment out the library requirement for the moment
make
</pre>
And test the new output:
<pre>
mkdir -p $TestArea/run/test3
cd $TestArea/run/test3
NTUPtoNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYTESTNTUPFile=myTestNtup.root >log 2>&1
</pre>
Check the output file:
<pre>
acmd.py dump-root filtered.myTestNtup.root -t physics
</pre>

---+ Skimming/Adding new branches

This section shows an example of a Z->ee selection:
   * Select only electrons with pt > 20 GeV
   * Select events with at least two electrons and 60 GeV < m_Z(ee) < 120 GeV

---++ Making new algorithm to skim events and add new branches
<pre>
cd $TestArea/Tutorial/MyTestNTUP/src
#Make skeleton files (MyTestZeeAlg.h/cxx)
acmd.py gen-klass --klass MyTestZeeAlg --pkg MyTestNTUP --type alg -o MyTestZeeAlg
#make load file
mkdir -p components
cat >| components/MyTestNTUP_load.cxx << EOF
#include "GaudiKernel/LoadFactoryEntries.h"
LOAD_FACTORY_ENTRIES( MyTestNTUP )
EOF
#make entries file
cat >| components/MyTestNTUP_entries.cxx << EOF
#include "GaudiKernel/DeclareFactoryEntries.h"
#include "../MyTestZeeAlg.h"
DECLARE_ALGORITHM_FACTORY (MyTestZeeAlg)
DECLARE_FACTORY_ENTRIES( MyTestNTUP ) {
  DECLARE_ALGORITHM(MyTestZeeAlg)
}
EOF
</pre>

---++ Preparing !D3PDObject

Prepare !D3PDObject code from the !D3PD:
<verbatim>cd $TestArea
mkdir -p run/d3pd
cd run/d3pd
atl-gen-athena-d3pd-reader root://castoratlas//castor/cern.ch/grid/atlas/atlt3/scratch/mkaneda/data/valid1.105200.T1_McAtNlo_Jimmy.merge.NTUP_SMWZ.e835_s1310_s1300_r3397_p948_tid752537_00/NTUP_SMWZ.752537._000074.root.1</verbatim>
Another example input file: root://castoratlas//castor/cern.ch/grid/atlas/atlt3/scratch/mkaneda/data/valid1.106046.PythiaZee_no_filter.merge.NTUP_SMWZ.e815_s1272_s1274_r3397_p948_tid752536_00/NTUP_SMWZ.752536._000068.root

The generated code is in code/*D3PDObject.cxx/h.
Copy !ElectronD3PDObject to the !MyTestNTUP package:
<verbatim>cd $TestArea/Tutorial/MyTestNTUP/src
cp $TestArea/run/d3pd/code/ElectronD3PDObject* .</verbatim>

---++ Writing Z->ee algorithm
<pre>
cd $TestArea/Tutorial/MyTestNTUP/src
</pre>
Edit !MyTestZeeAlg.* like: [[%ATTACHURL%/MyTestZeeAlg.cxx][MyTestZeeAlg.cxx]], [[%ATTACHURL%/MyTestZeeAlg.h][MyTestZeeAlg.h]]
   * RVar/WVar is used to read/write a single variable from/to !StoreGate
   * el_* variables are retrieved as an !ElectronD3PDObject: const !ElectronD3PDObject el("el_"), and can be used like el.pt(i)
   * New electrons are stored as "myEl_", by using !ElectronD3PDObject
   * In the function execute(), the setFilterPassed function is used to decide whether the event should be passed or not.
Go to the share directory and edit !MyTestNTUP_prodJobOFragment.py:
<pre>
cd $TestArea/Tutorial/MyTestNTUP/share
</pre>
Add "myEl_*" and "my_mZ" to the !AddItem list and add the filter algorithm:
<sticky>
<div style="background-color:#eee; padding:.1em">
%CODE{"python"}%
...
MyTestNTUPStream.AddItem([
    "el_n",          # add el_n
    "el_pt",         # add el_pt
    "el_Etcone*",    # add any branches matching el_Etcone*
    "-el_Etcone15",  # remove el_Etcone15
    "myEl_*",        # new electrons
    "my_mZ",         # Z mass
    ])
## Algorithm for the filter and new variables
from MyTestNTUP.MyTestNTUPConf import MyTestZeeAlg
ZeeAlg=MyTestZeeAlg("MyTestZeeAlg",
    MinNumberPassed = 2,
    PtMin = 20.*GeV,
    MZMin = 60.*GeV,
    MZMax = 160.*GeV,
    )
topSequence+=ZeeAlg
MyTestNTUPStream.AddRequireAlgs(ZeeAlg.getName())
%ENDCODE%
</div></sticky>
Go to the cmt directory, update the requirements file ([[%ATTACHURL%/requirements][requirements]]), and make.
<pre>
cd $TestArea/Tutorial/MyTestNTUP/cmt
make
</pre>
And test the new output:
<verbatim>mkdir -p $TestArea/run/test4
cd $TestArea/run/test4
NTUPtoNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYTESTNTUPFile=myTestNtup.root >log 2>&1</verbatim>
In the log file you can find the result of the filter:
<verbatim>MyTestZeeAlg         INFO Finalizing MyTestZeeAlg...
MyTestZeeAlg         INFO ******Summary******
MyTestZeeAlg         INFO Number of processed events: 10000
MyTestZeeAlg         INFO Number of passed events: 7287
MyTestZeeAlg         INFO Average number of electrons (all) : 4.5943
MyTestZeeAlg         INFO Average number of electrons (rem): 2.04556
MyTestZeeAlg         INFO *******************</verbatim>
And check that the output contains the correct values for "myEl_" and "my_mZ".

---+ Another method of D3PD->D3PD

!NTUPtoNTUPExample has another method to skim/slim !D3PDs using a simple script. Run the example:
<pre>
mkdir -p $TestArea/run/test5
cd $TestArea/run/test5
SkimNTUP_trf.py maxEvents=-1 inputNTUP_SMWZFile=%inputNTUP_SMWZFile% outputNTUP_MYSKIMNTUPFile=myTestNtup.root >log 2>&1
</pre>
The script file for skimming/slimming is !PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/python/slim.py, which contains the !PyROOT function doSkim().

The !JobOFragment file is !PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/python/MySkimNTUP_prodJobOFragment.py, which calls doSkim().

To add a new type, add a new definition to !PhysicsAnalysis/NTUPtoNTUP/NTUPtoNTUPExample/python/SkimNTUP_ProdFlags.py, as described above for !NTUPtoNTUP.

---

*Major updates*:%BR%
-- Main.MichiruKaneda - 05-Apr-2012

%RESPONSIBLE% %REVINFO{"$wikiusername" rev="1.1"}% %BR%
%REVIEW% *Never reviewed*
%STOPINCLUDE%