Cosmic Simulation First Production

Details

  • After much discussion, and after some fixes and validation, we are launching the cosmic sim production on 16 April 2009
  • For this production, simulation and digitization will be done with release 14.5.1.4,AtlasProduction
  • Reconstruction will be done with a release like 14.5.2.10,AtlasProduction (the exact release still needs to be built) to be as consistent as possible with the data reprocessing
  • The transform commands that will be run are similar to those described at CosmicSimHowTo
  • As discussed in data preparation, the first production will produce the following statistics:
1M Pixel Volume - both fields on
1M ID Volume - both fields on
0.5M Muon volume - both fields on
1M Pixel Volume - both fields off
1M ID Volume - both fields off
0.5M Muon volume - both fields off
  • So 5 million events in all
  • The breakdown in terms of simulation and digitization jobs should be:
    • i) for pixel volume (w and w/o field)
input no of events/job in simulation = 25000
output no of events/job in simulation = 250
number of simulation jobs = 4000
total events out of simulation = 1M

input simulation jobs for 1 digitization job = 6
total number of digitization jobs = 667
output size of digitization (RDO) files ~2.1 GB
    • ii) ID Volume (w and w/o field)
input no of events/job in simulation = 50000
output no of events/job in simulation = 180
number of simulation jobs = 5556
total events out of simulation = 1M

input simulation jobs for 1 digitization job = 8
total number of digitization jobs = 695
output size of digitization (RDO) files ~2.1 GB
    • iii) Muon Volume (w and w/o field)
input no of events/job in simulation = 20000
output no of events/job in simulation = 1400
number of simulation jobs = 358
total events out of simulation = 0.5M

input simulation jobs for 1 digitization job = 1
total number of digitization jobs = 358
output size of digitization (RDO) files ~2.1 GB
  • These numbers were chosen for two reasons:
    • i) the simulation jobs don't take too long or use too many resources
    • ii) the output RDO files are ~2 GB in size
  • This gives us about 23000 jobs to run
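The job counts above can be re-derived with a few lines of arithmetic (a sketch; the per-volume numbers are taken from the bullets above):

```python
import math

# Per-volume splitting parameters from the breakdown above:
# filtered output events per sim job, total events wanted,
# and sim files merged into each digitization job.
volumes = {
    "Pixel": dict(out_per_job=250,  total=1_000_000, sim_per_digi=6),
    "ID":    dict(out_per_job=180,  total=1_000_000, sim_per_digi=8),
    "Muon":  dict(out_per_job=1400, total=500_000,   sim_per_digi=1),
}

counts = {}
grand_total = 0
for name, v in volumes.items():
    sim_jobs = math.ceil(v["total"] / v["out_per_job"])
    digi_jobs = math.ceil(sim_jobs / v["sim_per_digi"])
    counts[name] = (sim_jobs, digi_jobs)
    grand_total += 2 * (sim_jobs + digi_jobs)  # fields on + fields off

print(counts)       # Pixel: (4000, 667), ID: (5556, 695), Muon: (358, 358)
print(grand_total)  # 23268, i.e. "about 23000 jobs"
```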

Validation

  • Andreas Korn has been taking care of validation - please see his wiki here

Actual commands submitted


SIMULATION TASKS:
------------------------------
i) pixel volume (both fields on)

csc_cosmics_sim_trf.py outputHitsFile=???? maxEvents=25000 runNumber=??? firstEvent=??? randomSeed=??? geometryVersion=ATLAS-GEO-07-00-00 CosmicFilterVolume=TRT_Barrel CosmicFilterVolume2=Pixel jobConfig=Cosmics.py 
where you fill in the ???'s on a per-job basis (except runNumber, which should be set to the same value for all jobs in this task).
firstEvent should be n*25000, where n is the job number within this task.

number of jobs in this task should be: 4000

ii) pixel volume (both fields off)

csc_cosmics_sim_trf.py outputHitsFile=???? maxEvents=25000 runNumber=??? firstEvent=??? randomSeed=??? geometryVersion=ATLAS-GEONF-07-00-00 CosmicFilterVolume=TRT_Barrel CosmicFilterVolume2=Pixel jobConfig=Cosmics.py 
where you fill in the ???'s on a per-job basis (except runNumber, which should be set to the same value for all jobs in this task).
firstEvent should be n*25000, where n is the job number within this task.

number of jobs in this task should be: 4000

(everything the same as i) except that geometryVersion=ATLAS-GEONF-07-00-00 ).

iii) ID volume (both fields on)
csc_cosmics_sim_trf.py outputHitsFile=???? maxEvents=50000 runNumber=??? firstEvent=??? randomSeed=??? geometryVersion=ATLAS-GEO-07-00-00  CosmicFilterVolume=TRT_Barrel CosmicFilterVolume2=TRT_EC  jobConfig=Cosmics.py 
where you fill in the ???'s on a per-job basis (except runNumber, which should be set to the same value for all jobs in this task).
firstEvent should be n*50000, where n is the job number within this task.

number of jobs in this task should be: 5556 

iv) ID volume (both fields off)
csc_cosmics_sim_trf.py outputHitsFile=???? maxEvents=50000 runNumber=??? firstEvent=??? randomSeed=??? geometryVersion=ATLAS-GEONF-07-00-00  CosmicFilterVolume=TRT_Barrel CosmicFilterVolume2=TRT_EC  jobConfig=Cosmics.py 
where you fill in the ???'s on a per-job basis (except runNumber, which should be set to the same value for all jobs in this task).
firstEvent should be n*50000, where n is the job number within this task.

number of jobs in this task should be: 5556 
(everything the same as iii) except that geometryVersion=ATLAS-GEONF-07-00-00 ).

v) Muon volume (both fields on)
csc_cosmics_sim_trf.py outputHitsFile=???? maxEvents=20000 runNumber=??? firstEvent=??? randomSeed=??? geometryVersion=ATLAS-GEO-07-00-00  CosmicFilterVolume=Muon  jobConfig=Cosmics.py 
where you fill in the ???'s on a per-job basis (except runNumber, which should be set to the same value for all jobs in this task).
firstEvent should be n*20000, where n is the job number within this task.

number of jobs in this task should be: 358

 vi) Muon volume (both fields off)
csc_cosmics_sim_trf.py outputHitsFile=???? maxEvents=20000 runNumber=??? firstEvent=??? randomSeed=??? geometryVersion=ATLAS-GEONF-07-00-00  CosmicFilterVolume=Muon  jobConfig=Cosmics.py 
where you fill in the ???'s on a per-job basis (except runNumber, which should be set to the same value for all jobs in this task).
firstEvent should be n*20000, where n is the job number within this task.

number of jobs in this task should be: 358
(everything the same as v) except that geometryVersion=ATLAS-GEONF-07-00-00 ).
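The per-job command lines above can be generated mechanically. A sketch for simulation task i) (pixel volume, fields on); the output file pattern, run number, and seed scheme here are illustrative assumptions, not the production values:

```python
# Illustrative generator for the per-job csc_cosmics_sim_trf.py commands
# of simulation task i). Filenames / run number / seeds are made up.
N_JOBS = 4000
EVENTS_PER_JOB = 25000
RUN_NUMBER = 108865  # same run number for every job in the task (assumed)

commands = []
for n in range(N_JOBS):
    commands.append(
        "csc_cosmics_sim_trf.py"
        f" outputHitsFile=cosmics.{n:05d}.HITS.pool.root"
        f" maxEvents={EVENTS_PER_JOB}"
        f" runNumber={RUN_NUMBER}"
        f" firstEvent={n * EVENTS_PER_JOB}"  # n = job number within the task
        f" randomSeed={n + 1}"               # unique seed per job (assumed)
        " geometryVersion=ATLAS-GEO-07-00-00"
        " CosmicFilterVolume=TRT_Barrel CosmicFilterVolume2=Pixel"
        " jobConfig=Cosmics.py"
    )

print(commands[0])
```

Tasks ii) through vi) differ only in maxEvents, geometryVersion, filter volumes, and job count.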


DIGITIZATION TASKS
------------------------------
i) pixel volume (both fields on)
csc_digi_trf.py inputHitsFile=????1.root,????2.root outputRDOFile=???? maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEO-07-00-00 jobConfig=SimuJobTransforms/CosmicsDigitConfig.py digiSeedOffset1=0 digiSeedOffset2=0
where you fill in the ???'s. The input for each job should be 6 files output by simulation task i) (given as a comma-separated list of files in the inputHitsFile= argument).

number of jobs in this task = 667.

ii) pixel volume (both fields off)
csc_digi_trf.py inputHitsFile=????1.root,????2.root outputRDOFile=???? maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEONF-07-00-00 jobConfig=SimuJobTransforms/CosmicsDigitConfig.py digiSeedOffset1=0 digiSeedOffset2=0
where you fill in the ???'s. The input for each job should be 6 files output by simulation task ii) (given as a comma-separated list of files in the inputHitsFile= argument).

number of jobs in this task = 667.

(everything the same as i) except that geometryVersion=ATLAS-GEONF-07-00-00 ).

iii) ID volume (both fields on)
csc_digi_trf.py inputHitsFile=????1.root,????2.root outputRDOFile=???? maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEO-07-00-00 jobConfig=SimuJobTransforms/CosmicsDigitConfig.py digiSeedOffset1=0 digiSeedOffset2=0
where you fill in the ???'s. The input for each job should be 8 files output by simulation task iii) (given as a comma-separated list of files in the inputHitsFile= argument).

number of jobs in this task = 695.

iv) ID volume (both fields off)
csc_digi_trf.py inputHitsFile=????1.root,????2.root outputRDOFile=???? maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEONF-07-00-00 jobConfig=SimuJobTransforms/CosmicsDigitConfig.py digiSeedOffset1=0 digiSeedOffset2=0
where you fill in the ???'s. The input for each job should be 8 files output by simulation task iv) (given as a comma-separated list of files in the inputHitsFile= argument).

number of jobs in this task = 695.

(everything the same as iii) except that geometryVersion=ATLAS-GEONF-07-00-00 ).

v) Muon volume (both fields on)
csc_digi_trf.py inputHitsFile=????1.root  outputRDOFile=???? maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEO-07-00-00 jobConfig=SimuJobTransforms/CosmicsDigitConfig.py digiSeedOffset1=0 digiSeedOffset2=0
where you fill in the ???'s. The input for each job should be 1 file output by simulation task v).

number of jobs in this task = 358.

vi) Muon volume (both fields off)
csc_digi_trf.py inputHitsFile=????1.root  outputRDOFile=???? maxEvents=-1 skipEvents=0 geometryVersion=ATLAS-GEONF-07-00-00 jobConfig=SimuJobTransforms/CosmicsDigitConfig.py digiSeedOffset1=0 digiSeedOffset2=0
where you fill in the ???'s. The input for each job should be 1 file output by simulation task vi).

number of jobs in this task = 358.

(everything the same as v) except that geometryVersion=ATLAS-GEONF-07-00-00 ).
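The grouping of simulation outputs into digitization inputs (e.g. 6 HITS files per job for the pixel volume) can be sketched as follows; the file names are made up for illustration:

```python
# Sketch: split the list of sim HITS files into consecutive groups,
# one group per digitization job (6 files/job for the pixel volume).
def chunk(files, per_job):
    """Split a file list into consecutive groups of at most per_job files."""
    return [files[i:i + per_job] for i in range(0, len(files), per_job)]

hits_files = [f"pixel.{n:05d}.HITS.pool.root" for n in range(4000)]
digi_inputs = chunk(hits_files, 6)

print(len(digi_inputs))          # 667 digitization jobs, as quoted above
print(",".join(digi_inputs[0]))  # comma-separated inputHitsFile= value
```

Note that 4000 is not a multiple of 6, so the last digitization job gets only 4 input files.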

Submitted tasks

Simulation tasks

  • MC8.108867.CosSimIDVolSolOnTorOn.py
    • tag: s533 http://www-f9.ijs.si/atlpy/atlprod/prodtag/1893/
    • task: 59222
    • http://panda.cern.ch:25880/server/pandamon/query?mode=showtask0&reqid=59222
    • ~1882/5600 jobs done as of morning 22 April 2009 (bug 49362 submitted for this)
    • This bug (which is understood, and will be fixed by CosmicGenerator-00-00-35, going into 14.5.1.5) was due to the fact that once the event number reaches 100000000, the run number is incremented. Because of the filtering we apply, this happens for the IDVolume tasks and causes problems. So we can use the ~1/3 of the jobs that have event number < 100000000 (which should be about 1/3 of a million events), but the rest will have to be re-simulated in 14.5.1.5 when it is available.
    • 14.5.1.5 was built yesterday, but it has a problem (to do with trigger menu XML files that have the release name in them), so we will need to wait for 14.5.1.6, which is being built now.
    • This task should have been aborted, but still appears as running in the Panda monitor?

  • MC8.108866.CosSimIDVolSolOffTorOff.py
    • tag: s534 http://www-f9.ijs.si/atlpy/atlprod/prodtag/1894/
    • task: 59223
    • http://panda.cern.ch:25880/server/pandamon/query?mode=showtask0&reqid=59223
    • ~1618/5600 jobs done as of morning 22 April 2009 (bug 49362 submitted for this)
    • This bug (which is understood, and will be fixed by CosmicGenerator-00-00-35, going into 14.5.1.5) was due to the fact that once the event number reaches 100000000, the run number is incremented. Because of the filtering we apply, this happens for the IDVolume tasks and causes problems. So we can use the ~1/3 of the jobs that have event number < 100000000 (which should be about 1/3 of a million events), but the rest will have to be re-simulated in 14.5.1.5 when it is available.
    • 14.5.1.5 was built yesterday, but it has a problem (to do with trigger menu XML files that have the release name in them), so we will need to wait for 14.5.1.6, which is being built now.
    • This task should have been aborted, but still appears as running in the Panda monitor?
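The "1/3 of the jobs" estimate for bug 49362 can be checked with a quick calculation (a sketch; the per-job numbers come from the IDVolume task description above):

```python
# Back-of-the-envelope check of the bug 49362 impact: with
# firstEvent = n * 50000, an IDVolume sim job stays entirely below
# the 100000000 event-number rollover only for n < 2000, and each
# job keeps 180 filtered events.
EVENTS_PER_JOB = 50000
ROLLOVER = 100_000_000
TOTAL_JOBS = 5556

usable_jobs = sum(1 for n in range(TOTAL_JOBS)
                  if (n + 1) * EVENTS_PER_JOB <= ROLLOVER)
usable_events = usable_jobs * 180

print(usable_jobs, usable_events)  # 2000 jobs -> 360000 events (~1/3 M)
```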

Newest simulation tasks for IDVolume only
  • These have been submitted as:
http://panda.cern.ch:25880/server/pandamon/query?mode=showtask0&reqid=65238
http://panda.cern.ch:25880/server/pandamon/query?mode=showtask0&reqid=65239
  • These go up to 1 million events each (using rel. 14.5.1.6, which has the fix for bug 49362 in it)

Digitization tasks

Failed digitization tasks
  • The tasks below were aborted due to the corrupt DBRelease used:

*Error details:* *exe:* TRF_UNKNOWN | resolveHVSTag | resolveHVSTag> | resolveHVSTag> tag: OFLCOND-SIMC-00-00-00 does NOT exist | Tag name GLOBAL cannot be resolved in folder /PIXEL/DCS/TEMPERATURE | Tag name GLOBAL cannot be resolved in folder /PIXEL/DCS/FSMSTATE | Tag nam
New Digitization tasks

Reconstruction tasks

  • Reconstruction has not started running yet
  • It will use rel. 14.5.2.11,AtlasProduction which is being built now (28 Apr 2009)
  • It will use a command like:
Reco_trf.py beamType=cosmics  outputTAGComm=myTAGCOMM.root maxEvents=-1 HIST=myMergedMonitoring.root  preInclude=RecExCommon/RecoUsefulFlags.py,RecExCommission/CosmicSimulationRecoSetup.py,RecExCommission/RecExCommissionRepro.py,RecJobTransforms/UseOracle.py outputAODFile=myAOD.pool.root geometryVersion=ATLAS-GEO**-07-00-00 outputESDFile=myESD.pool.root inputRDOFile=* DBRelease=%DB=ddo.000001.Atlas.Ideal.DBRelease.v060801:DBRelease-6.8.1.tar.gz --ignoreunknown
  • (Not sure if we should use --ignoreunknown or --ignoreall?)
  • Assorted details
where the output files:
myTAGCOMM.root
myMergedMonitoring.root
myAOD.pool.root 
myESD.pool.root
should of course have filenames that depend on the input data, and should all be kept and registered with DDM (dq2).

the reco jobs should be run on the output RDO files from the tasks:
pixel vol field on: 60879 (should have: ATLAS-GEO-07-00-00 ).
pixel vol field off: 60880 (should have: ATLAS-GEONF-07-00-00 ).
ID vol field on: 60881 (should have: ATLAS-GEO-07-00-00 ).
ID vol field off: 60882 (should have: ATLAS-GEONF-07-00-00 ).
Muon vol field on: 60877 (should have: ATLAS-GEO-07-00-00 ).
Muon vol field off: 60878 (should have: ATLAS-GEONF-07-00-00 ).

So I think we need 2 reco configs: one with ATLAS-GEO-07-00-00 (for tasks 60879, 60881, 60877) and one with ATLAS-GEONF-07-00-00 (for tasks 60880, 60882, 60878).

Everything else except for the input and output file names would be identical for the 2 configurations.

We should take 1 input RDO file per job.
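The task-to-geometry mapping above can be captured as a small lookup (a sketch; only the task IDs and geometry tags come from this page):

```python
# Which geometry tag each RDO-producing task's reco jobs must use;
# two reco configs (fields on vs. fields off) cover all six tasks.
GEO_ON = "ATLAS-GEO-07-00-00"
GEO_OFF = "ATLAS-GEONF-07-00-00"

task_geometry = {
    60879: GEO_ON,  60880: GEO_OFF,   # pixel volume
    60881: GEO_ON,  60882: GEO_OFF,   # ID volume
    60877: GEO_ON,  60878: GEO_OFF,   # muon volume
}

on_tasks = sorted(t for t, g in task_geometry.items() if g == GEO_ON)
print(on_tasks)  # the three fields-on tasks
```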

  • Iacopo has defined these 2 tasks with tags r666 and r667
Submitted reco tasks
Stack trace from the muon reconstruction crash (MuonCombiTrackMaker) seen with 14.5.2.11:
0xe5e31336 _ZNK4Muon12SortMCTBHitsclEPKNS_7MCTBHitES3_ + 0x36 [/data/esia/atlas/slc3/prod/releases/rel_14-13/AtlasProduction/14.5.2.11/InstallArea/i686-slc4-gcc34-opt/lib/libMuonCombiTrackMaker.so]
0xe5e256f3 _ZNK4Muon14MCTBHitHandler6insertERSt4listIPNS_7MCTBHitESaIS3_EERSt14_List_iteratorIS3_ES3_ + 0x43 [/data/esia/atlas/slc3/prod/releases/rel_14-13/AtlasProduction/14.5.2.11/InstallArea/i686-slc4-gcc34-opt/lib/libMuonCombiTrackMaker.so]
0xe5e2a1fd _ZNK4Muon14MCTBHitHandler5mergeERKSt4listIPNS_7MCTBHitESaIS3_EERS5_ + 0x30d [/data/esia/atlas/slc3/prod/releases/rel_14-13/AtlasProduction/14.5.2.11/InstallArea/i686-slc4-gcc34-opt/lib/libMuonCombiTrackMaker.so]
0xe5e2b5a1 _ZNK4Muon14MCTBHitHandler5mergeERKSt4listIPNS_7MCTBHitESaIS3_EES7_RS5_ + 0x31 [/data/esia/atlas/slc3/prod/releases/rel_14-13/AtlasProduction/14.5.2.11/InstallArea/i686-slc4-gcc34-opt/lib/libMuonCombiTrackMaker.so]
0xe5e59bd5 _ZNK4Muon15MCTBTrackFitter11extractDataERKNS_18MCTBCandidateEntryES3_RNS0_10FitterDataE + 0x5c5 [/data/esia/atlas/slc3/prod/releases/rel_14-13/AtlasProduction/14.5.2.11/InstallArea/i686-slc4-gcc34-opt/lib/libMuonCombiTrackMaker.so]
http://panda.cern.ch:25880/server/pandamon/query/?mode=taskquery&qTaskTRF=Reco_trf.py&qTaskName=CosSim&qStatus=LiveTasks&qTASEARCH=TASK&qsubmit=QuerySubmit
  • These have fixes for the number of jobs per task, and the output files include HIST. They still use release 14.5.2.11 (so they still have the muon crash); we will re-run reconstruction once that is fixed, so this is something of a test.
  • Andreas Korn has made the DQ monitoring plots web display for the 300K IDVolume events with field on (reference is an older smaller cosmic sim sample) - please see:
http://atlasdqm.web.cern.ch/atlasdqm/test/10/IDCosmic/run_108867/run/
  • Now these reco tasks are finished (with 14.5.2.12, so without the MuonCombiTrackMaker bug)
  • The datasets can be downloaded from dq2 like:
dq2-get valid2.108866.CosSimIDVolSolOffTorOff.recon.ESD.s534_d168_r677_tid065082
dq2-get valid2.108867.CosSimIDVolSolOnTorOn.recon.ESD.s533_d167_r676_tid065083

dq2-get valid2.108868.CosSimMuonVolSolOffTorOff.recon.ESD.s536_d168_r677_tid065080
dq2-get valid2.108869.CosSimMuonVolSolOnTorOn.recon.ESD.s535_d167_r676_tid065081

dq2-get valid2.108864.CosSimPixVolSolOffTorOff.recon.ESD.s532_d168_r677_tid065079
dq2-get valid2.108865.CosSimPixVolSolOnTorOn.recon.ESD.s531_d167_r676_tid065078

-- JamieBoyd - 16 Apr 2009

Topic revision: r18 - 2009-05-11 - JamieBoyd
 