-- SouravSen - 2018-09-27

CTIDE NN speedup work log

Introduction

Progress on the CTIDE NN speedup project is tracked here.

Pending action items

  • call sequence of posNN

Current status

  • call sequence for posNN

Done bucket
  • cross-check the clock method against the trigger timer tool for time measurement ( gitlab repo)
  • Test run with just SiClusterizationTool sparse checkout
    • try in /tmp (currently on lxplus072)
pwd
/tmp/LOCALNN
setupATLAS
lsetup git
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git

#Adding packages SiClusterizationTool 
cd athena
git atlas addpkg SiClusterizationTool

#start a dev branch from 21.0
git fetch upstream
git checkout -b CTIDENNtiming upstream/21.0 --no-track

#build (need to rebuild when this whole sparse checkout is moved to /eos)
cd ..
mkdir build && cd build
asetup Athena, 21.0.82 #latest

cmake ../athena/Projects/WorkDir/
make

source */setup.sh

#test run for now in /tmp (rerun when moved to /eos)
cd ..
mkdir run_test && cd run_test
Reco_tf.py --AMI q431




  • Created CTIDE NN complete ROOT file + local COOL db
https://github.com/SuperSourav/CTIDE_NN_local_insert
  • clock method of the ctime library implemented by putting the numNN call in a loop of 10000 and measuring the total time: ~14 µs per numNN call
    #inside NnPixelClusterSplitProbTool.cxx
    #include <time.h>
    ...
    clock_t t2;
    t2 = clock();
    //timerA->start();
    std::vector<double> vectorOfProbs;
    int Nrepeat = 10000;
    for (int repeat = 0; repeat < Nrepeat; repeat++){
      vectorOfProbs = m_NnClusterizationFactory->estimateNumberOfParticles(origCluster, trackParameters.associatedSurface(), trackParameters);
    }
    //timerA->stop();
    t2 = clock() - t2;
    std::cout << "~~~~~~~~~~~~~~~~~~~~~~~CLOCK~~~~~~~~~~~>" << Nrepeat << " numNN calls (w/ trk info): " << ((float)t2)/CLOCKS_PER_SEC << " s" << std::endl;
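The same measure-a-loop-and-divide technique can be sketched in Python; note that C's `clock()` measures CPU time, while `time.perf_counter()` below is a wall-clock timer. The timed function here is a hypothetical stand-in for the numNN call, not the actual NN evaluation:

```python
import time

def estimate_per_call_us(fn, n_repeat=10000):
    """Time n_repeat calls of fn and return the average cost in microseconds."""
    t0 = time.perf_counter()
    for _ in range(n_repeat):
        fn()
    total_s = time.perf_counter() - t0
    return total_s / n_repeat * 1e6

# hypothetical stand-in for the per-cluster NN call
per_call = estimate_per_call_us(lambda: sum(x * x for x in range(100)))
print("%.2f us per call" % per_call)
```

Looping many times and dividing amortizes the timer's own resolution and overhead, which matters when a single call is only tens of microseconds.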

  • Run the local CTIDE NN on athena Recotf chain
    cd /eos/user/s/sosen/LOCALTESTATHENA/localNN

    #currently going to use the dummy CTIDE NN (newNN.root) which was added to the newpixelNNdb.db in the Recotf chain

    #creating the jO for overriding the CTIDE NN COOL dB with the local CTIDE NN db (newpixelNNdb.db)

    cat CTIDENN_IDrecon_preInclude.py #the CTIDE NN overriding jO

    from IOVDbSvc.CondDB import conddb;
    conddb.setGlobalTag(globalflags.ConditionsTag());
    # Updated NN folder
    conddb.blockFolder("/PIXEL/PixelClustering/PixelClusNNCalib")
    conddb.addFolder("","<dbConnection>sqlite://;schema=newpixelNNdb.db;dbname=OFLP200</dbConnection> /PIXEL/PixelClustering/PixelClusNNCalib <tag>PixClusNNCalib-SuperS10122018</tag>", force=True);
    ..

    #Now creating the submission script which preIncludes the above jO- "CTIDENN_IDrecon_preInclude.py" in the Athena Recotf chain

    #Adding a preInclude to __command_localtestWithPerfMonSD.sh, the submission script (tag q221) I was running with the input file and the local InDetRecTools sparse build

    cd /afs/cern.ch/work/s/sosen/LOCALINDET_BUILD/

    mkdir runWithPerfMon100eventsRerunZprime_localNN

    cd runWithPerfMon100eventsRerunZprime_localNN

    #softlinking the local CTIDE NN ROOT file and the local COOL DB and the jO- CTIDENN_IDrecon_preInclude.py
    ln -s /eos/user/s/sosen/LOCALTESTATHENA/localNN/newNN.root .
    ln -s /eos/user/s/sosen/LOCALTESTATHENA/localNN/newpixelNNdb.db .
    ln -s /eos/user/s/sosen/LOCALTESTATHENA/localNN/PoolFileCatalog.xml .
    ln -s /eos/user/s/sosen/LOCALTESTATHENA/localNN/CTIDENN_IDrecon_preInclude.py .

    cp ../__command_localtestWithPerfMonSD.sh .

    #Add preInclude="CTIDENN_IDrecon_preInclude.py" \ in L4 in __command_localtestWithPerfMonSD.sh
    cat __command_localtestWithPerfMonSD.sh

    #!/bin/bash
    Reco_tf.py \
    --AMI q221 \
    --preInclude="CTIDENN_IDrecon_preInclude.py" \
    --inputRDOFile /eos/user/s/sosen/RDO_CTIDE_timing/user.rjansky.mc16_13TeV.427080.Pythia8EvtGen_A14NNPDF23LO_flatpT_Zprime.RDO_20180110_EXT0/user.rjansky.12944101.EXT0._005040.RDO.root \
    --maxEvents 10 \
    --preExec='from PerfMonComps.PerfMonFlags import jobproperties as pmjp;pmjp.PerfMonFlags.doPostProcessing=True;pmjp.PerfMonFlags.doSemiDetailedMonitoringFullPrint=True' \
    --outputAODFile myAOD.root 2>&1 | tee _log_localtestWithPerfMonSD_q221.txt;

    #Now run for test

  • Write a script to convert h5 trained network into actual root files in the COOL DB
pwd
/afs/cern.ch/work/s/sosen/public/Qualification_Task/ZONE/CTIDE_NN_framework/neuralNets/8bitTOT

python h52root.py #no setupATLAS or lsetup ROOT required

#copied the same script in /eos/user/s/sosen/LOCALTESTATHENA/localNN and uploaded to https://github.com/SuperSourav/CTIDE_NN_local_insert/blob/master/h52root.py
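A rough sketch of what such an h5-to-ROOT conversion has to do (this is illustrative, not the actual h52root.py): walk the trained-network weights and lay them out under the per-network directories seen in newNN.root (NumberParticles, ImpactPoints1P, ...). The weight values and the "weights_layer0" histogram name below are hypothetical; only the directory names come from the file listing:

```python
# hypothetical weights; in the real script they come from the .h5 file
weights = {
    "NumberParticles": {"layer0": [[0.1, 0.2], [0.3, 0.4]]},
    "ImpactPoints1P":  {"layer0": [[0.5]]},
}

def to_root_layout(nets):
    """Return {directory_name: {histogram_name: flat_values}}, mimicking the
    one-TDirectoryFile-per-network structure of the COOL payload file."""
    layout = {}
    for net_name, layers in nets.items():
        layout[net_name] = {
            "weights_" + layer: [v for row in matrix for v in row]
            for layer, matrix in layers.items()
        }
    return layout

layout = to_root_layout(weights)
print(sorted(layout))  # ['ImpactPoints1P', 'NumberParticles']
```

In the real script each flattened value list would be written as a histogram into the corresponding TDirectoryFile via PyROOT.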
  • Alex's twiki needs to be reproduced and updated to use the local NN ( twiki) (git repo in development)
cd /eos/user/s/sosen/LOCALTESTATHENA/localNN

#copy a legitimate conditions DB (COOL DB) trained CTIDE NN file
cp /afs/cern.ch/atlas/conditions/poolcond/vol0/cond09_mc.000087.gen.COND/cond09_mc.000087.gen.COND._0004.pool.root .

#initializing github repo
setupATLAS
lsetup git
git init
echo "# Creating the DB file with the local CTIDE neural network" >> README.md
git add README.md
git commit -m "first commit"
git remote add origin https://github.com/SuperSourav/CTIDE_NN_local_insert.git
git push -u origin master

#get and run the copier.py script to create a dummy local CTIDE NN from the cond DB CTIDE NN by simply leaving out (not copying) the DB id
cp ../newNetwork/copier.py .
python copier.py

#check if the local CTIDE NN has the correct branches
asetup Athena, 21.0.34
python
Python 2.7.13 (default, Apr 22 2017, 20:06:00) 
[GCC 6.2.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import ROOT
Warning in <TInterpreter::ReadRootmapFile>: class  UCharDbArray found in libRootCnvDict.so  is already in libStorageSvcDict.so 
>>> f = ROOT.TFile("newNN.root")
>>> f.ls()
TFile**        newNN.root    
 TFile*        newNN.root    
  KEY: TDirectoryFile    NumberParticles;1    NumberParticles
  KEY: TDirectoryFile    ImpactPoints1P;1    ImpactPoints1P
  KEY: TDirectoryFile    ImpactPoints2P;1    ImpactPoints2P
  KEY: TDirectoryFile    ImpactPoints3P;1    ImpactPoints3P
  KEY: TDirectoryFile    ImpactPointErrorsX1;1    ImpactPointErrorsX1
  KEY: TDirectoryFile    ImpactPointErrorsX2;1    ImpactPointErrorsX2
  KEY: TDirectoryFile    ImpactPointErrorsX3;1    ImpactPointErrorsX3
  KEY: TDirectoryFile    ImpactPointErrorsY1;1    ImpactPointErrorsY1
  KEY: TDirectoryFile    ImpactPointErrorsY2;1    ImpactPointErrorsY2
  KEY: TDirectoryFile    ImpactPointErrorsY3;1    ImpactPointErrorsY3
>>> 

# add the copier script to git
git add copier.py
git commit -m "script to copy CTIDE NN branch minus the GUID to create dummy local CTIDE NN"
git push -u origin master

#adding a GUID (cond db id) to the dummy CTIDE NN (GUID as a 'ROOT.TObjString')
coolHist_setFileIdentifier.sh newNN.root
Generated GUID is 5B2B52B4-8392-4B79-99C2-337C8E2CC51D
Warning in <TInterpreter::ReadRootmapFile>: class  UCharDbArray found in libRootCnvDict.so  is already in libStorageSvcDict.so 
   ------------------------------------------------------------
  | Welcome to ROOT 6.08/06                http://root.cern.ch |
  |                               (c) 1995-2016, The ROOT Team |
  | Built for linuxx8664gcc                                    |
  | From tag v6-08-06, 2 March 2017                            |
  | Try '.help', '.demo', '.license', '.credits', '.quit'/'.q' |
   ------------------------------------------------------------

root [0] 
Processing /tmp/coolHist_setFileIdentifier_4598.C("newNN.root","5B2B52B4-8392-4B79-99C2-337C8E2CC51D")...
Record GUID 5B2B52B4-8392-4B79-99C2-337C8E2CC51D in file newNN.root
TFile**        newNN.root    
 TFile*        newNN.root    
  KEY: TDirectoryFile    NumberParticles;1    NumberParticles
  KEY: TDirectoryFile    ImpactPoints1P;1    ImpactPoints1P
  KEY: TDirectoryFile    ImpactPoints2P;1    ImpactPoints2P
  KEY: TDirectoryFile    ImpactPoints3P;1    ImpactPoints3P
  KEY: TDirectoryFile    ImpactPointErrorsX1;1    ImpactPointErrorsX1
  KEY: TDirectoryFile    ImpactPointErrorsX2;1    ImpactPointErrorsX2
  KEY: TDirectoryFile    ImpactPointErrorsX3;1    ImpactPointErrorsX3
  KEY: TDirectoryFile    ImpactPointErrorsY1;1    ImpactPointErrorsY1
  KEY: TDirectoryFile    ImpactPointErrorsY2;1    ImpactPointErrorsY2
  KEY: TDirectoryFile    ImpactPointErrorsY3;1    ImpactPointErrorsY3
  KEY: TObjString    fileGUID;1    object title
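The GUID that coolHist_setFileIdentifier.sh records as the fileGUID TObjString is a standard UUID in uppercase 8-4-4-4-12 hex form; a minimal Python sketch of producing one in the same format:

```python
import re
import uuid

def make_file_guid():
    """Generate an uppercase UUID string like the one recorded in newNN.root."""
    return str(uuid.uuid4()).upper()

guid = make_file_guid()
# e.g. '5B2B52B4-8392-4B79-99C2-337C8E2CC51D' has this 8-4-4-4-12 hex shape
assert re.fullmatch(r"[0-9A-F]{8}(-[0-9A-F]{4}){3}-[0-9A-F]{12}", guid)
```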

#make a local copy of the POOL catalog
cp /cvmfs/atlas-condb.cern.ch/repo/conditions/poolcond/PoolFileCatalog.xml .

#the original CTIDE NN file is in the POOL catalog (just sanity check that this is a correct POOL catalog)
grep cond09_mc.000087.gen.COND._0004.pool.root PoolFileCatalog.xml
    <pfn filetype="ROOT_All" name="/cvmfs/atlas-condb.cern.ch/repo/conditions/cond09/cond09_mc.000087.gen.COND/cond09_mc.000087.gen.COND._0004.pool.root"/>
    <lfn name="cond09_mc.000087.gen.COND._0004.pool.root"/>


# Insert the file into the POOL catalogue
coolHist_insertFileToCatalog.py newNN.root
#check if inserted
grep newNN.root PoolFileCatalog.xml
      <pfn filetype="ROOT_All" name="newNN.root"/>
#physical file name (pfn) added to the catalog, but logical file name (lfn) not there yet! moving on
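The pfn-present / lfn-missing state of the catalog can also be checked programmatically. A sketch with xml.etree, using a trimmed-down stand-in for the catalog (the real PoolFileCatalog.xml has a DTD header and many File entries):

```python
import xml.etree.ElementTree as ET

# trimmed-down stand-in for PoolFileCatalog.xml after the insert
catalog_xml = """
<POOLFILECATALOG>
  <File ID="5B2B52B4-8392-4B79-99C2-337C8E2CC51D">
    <physical><pfn filetype="ROOT_All" name="newNN.root"/></physical>
    <logical/>
  </File>
</POOLFILECATALOG>
"""

def catalog_names(xml_text):
    """Return (pfn names, lfn names) found in a POOL file catalog."""
    root = ET.fromstring(xml_text)
    pfns = [e.get("name") for e in root.iter("pfn")]
    lfns = [e.get("name") for e in root.iter("lfn")]
    return pfns, lfns

pfns, lfns = catalog_names(catalog_xml)
print(pfns, lfns)  # ['newNN.root'] []
```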

#this path variable is set by Athena to the central CERN POOL conditions area, from where the POOL catalog was copied above
echo $ATLAS_POOLCOND_PATH
/cvmfs/atlas-condb.cern.ch/repo/conditions

#now, since we are going to work with a local POOL catalog, point this variable at $PWD instead
export ATLAS_POOLCOND_PATH=$PWD
#check if the path set correctly
echo $ATLAS_POOLCOND_PATH
/eos/user/s/sosen/LOCALTESTATHENA/localNN

#create new local conditions DB (use AtlCoolCopy instead of AtlCoolCopy.exe (obsolete))
$ AtlCoolCopy "COOLOFL_PIXEL/OFLP200" "sqlite://X;schema=newpixelNNdb.db;dbname=OFLP200" -f /PIXEL/PixelClustering/PixelClusNNCalib -nd -rdo -c
Using machine hostname lxplus062.cern.ch for DB replica resolution
Frontier server at (serverurl=http://atlasfrontier-local.cern.ch:8000/atlr)(serverurl=http://atlasfrontier-ai.cern.ch:8000/atlr)(serverurl=http://lcgft-atlas.gridpp.rl.ac.uk:3128/frontierATLAS)(serverurl=http://ccfrontier.in2p3.fr:23128/ccin2p3-AtlasFrontier)(proxyurl=http://ca-proxy.cern.ch:3128)(proxyurl=http://ca-proxy-meyrin.cern.ch:3128)(proxyurl=http://ca-proxy-wigner.cern.ch:3128)(proxyurl=http://atlasbpfrontier.cern.ch:3127)(proxyurl=http://atlasbpfrontier.fnal.gov:3127) will be considered
Total of 10 servers found for host lxplus062.cern.ch
Open source database: COOLOFL_PIXEL/OFLP200
Allowed replica to try (priority -700) : frontier://ATLF/()/ATLAS_COOLOFL_PIXEL
Allowed replica to try (priority -699) : oracle://ATLAS_COOLPROD/ATLAS_COOLOFL_PIXEL
Allowed replica to try (priority -200) : frontier://ATLF/()/ATLAS_COOLOFL_PIXEL
Open destination database: sqlite://X;schema=newpixelNNdb.db;dbname=OFLP200
COOL exception caught: The database does not exist
Try to create new conditions DB
Creation succeeded
Add folders in path:/PIXEL/PixelClustering/PixelClusNNCalib [ /PIXEL/PixelClustering/PixelClusNNCalib ]
Creating folder /PIXEL/PixelClustering/PixelClusNNCalib payload-type 0 on destination
Created 1 new channels for /PIXEL/PixelClustering/PixelClusNNCalib

# Write data to the local conditions DB
$ coolHist_setReference.py 'sqlite://X;schema=newpixelNNdb.db;dbname=OFLP200' /PIXEL/PixelClustering/PixelClusNNCalib 1 PixClusNNCalib-XX-YY-ZZ newNN.root 
Warning in <TInterpreter::ReadRootmapFile>: class  UCharDbArray found in libRootCnvDict.so  is already in libStorageSvcDict.so 
>== Data valid for run,LB [ 0 , 0 ] to [ 2147483647 , 4294967294 ]
>== Inserting reference to file: newNN.root  - find GUID
Warning in <TInterpreter::ReadRootmapFile>: class  UCharDbArray found in libRootCnvDict.so  is already in libStorageSvcDict.so 
   ------------------------------------------------------------
  | Welcome to ROOT 6.08/06                http://root.cern.ch |
  |                               (c) 1995-2016, The ROOT Team |
  | Built for linuxx8664gcc                                    |
  | From tag v6-08-06, 2 March 2017                            |
  | Try '.help', '.demo', '.license', '.credits', '.quit'/'.q' |
   ------------------------------------------------------------

root [0] 
Processing /tmp/coolHist_extractFileIdentifier_26165.C("newNN.root")...
Get GUID from file newNN.root
GUID is 5B2B52B4-8392-4B79-99C2-337C8E2CC51D

>== Write data on COOL connection: sqlite://X;schema=newpixelNNdb.db;dbname=OFLP200
>== To folder: /PIXEL/PixelClustering/PixelClusNNCalib channel: 1
>== COOL tag: PixClusNNCalib-XX-YY-ZZ
>== Store object with IOV [ 0 , 9223372036854775807 ] channel 1 and tag PixClusNNCalib-XX-YY-ZZ
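The huge IOV upper bound in the last line is COOL's 63-bit ValidityKey, which packs (run, lumiblock) into one integer as run << 32 | LB; a quick arithmetic check in Python:

```python
# COOL ValidityKey: run number in the high 32 bits, lumiblock in the low 32 bits
def to_validity_key(run, lb):
    return (run << 32) | lb

def from_validity_key(key):
    return key >> 32, key & 0xFFFFFFFF

# the open-ended upper bound 9223372036854775807 (2**63 - 1)
# corresponds to run 2147483647, LB 4294967295
assert to_validity_key(2147483647, 4294967295) == 9223372036854775807
assert from_validity_key(0) == (0, 0)
```

So the "[ 0 , 9223372036854775807 ]" interval simply means "valid for all runs and lumiblocks".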




Completed tasks

  • Finished only the standalone task for the CTIDE NN timing study with smaller cluster sizes ( slides)
  • Used PerfMonSD to get the total time of the ambiguity solver algorithm ( slides)
  • local modifications to CTIDE NN in offline tracking chain ( slides)
  • My latest CTIDE presentation ( slides)

Useful links

Misc links

  • stanford ATLAS computing tricks ( link)

Junk

#the athena sparse checkout in the /eos/user/s/sosen/LOCALINDET/ has been replaced with the athena sparse checkout with only SiClusterizationTool
$ pwd
/eos/user/s/sosen/LOCALINDET/

#moved athena sparse checkout of InDetExample and InDetRecTools
pwd
/eos/user/s/sosen/LOCALINDET

#Adding TrigTimer in NnPixelClusterSplitProbTool
cd athena/InnerDetector/InDetRecTools/SiClusterizationTool/

cd SiClusterizationTool/


cd ../src
cp NnPixelClusterSplitProbTool.cxx NnPixelClusterSplitProbTool.cxx.bak
vim NnPixelClusterSplitProbTool.cxx
#include the TrigTimeAlgs header
23 //trigtimer
24 #include "TrigTimeAlgs/TrigTimer.h"
25 //

#initialize trigtimer tool to nullptr (L35)
30 NnPixelClusterSplitProbTool::NnPixelClusterSplitProbTool(const std::string& t, const std::string& n, const IInterface* p)
31 :AthAlgTool(t,n,p),
32 m_NnClusterizationFactory("InDet::NnClusterizationFactory/NnClusterizationFactory"),
33 m_iBeamCondSvc("BeamCondSvc",n),
34 m_useBeamSpotInfo(true),
35 m_numNNTimer(nullptr)


#initialize
53 StatusCode NnPixelClusterSplitProbTool::initialize()
54 {
55
56
57 // trigtimer
58 m_numNNTimer = addTimer("numNNTimer");
59 //

#inside InDet::PixelClusterSplitProb NnPixelClusterSplitProbTool::splitProbability(const InDet::PixelCluster& origCluster ) start and stop the trigtimer
#before and after the CTIDE NN call resp. and print out
90 //starting the trigtimer
91 m_numNNTimer->start();
92 std::vector<double> vectorOfProbs=m_NnClusterizationFactory->estimateNumberOfParticles(origCluster,beamSpotPosition);
93 //stopping the trigtimer
94 m_numNNTimer->stop();
95 //printing out the time
96 std::cout << m_numNNTimer->elapsed() << " ms -> NumNN call (w/o trackinfo) *******************TRIGTIMER" << std::endl;

#similarly inside InDet::PixelClusterSplitProb NnPixelClusterSplitProbTool::splitProbability(const InDet::PixelCluster& origCluster, const Trk::TrackParameters& trackParameters )
129 //starting the trigtimer
130 m_numNNTimer->start();
131 std::vector<double> vectorOfProbs=m_NnClusterizationFactory->estimateNumberOfParticles(origCluster, trackParameters.associatedSurface(), trackParameters);
132 //stopping the trigtimer
133 m_numNNTimer->stop();
134 //printing out the time
135 std::cout << m_numNNTimer->elapsed() << " ms -> NumNN call (w/ trackinfo) *******************TRIGTIMER" << std::endl;

  • Also, LOCALATHENA2 does not have the InDetExample config package (need to add it). Redoing from scratch, following the athena sparse checkout to add TrigTimer ( link) (latest successful slides, see p. 4)
FAILED ATTEMPT:

# for some reason the athena sparse checkout works well on local disk, not on AFS or EOS, so doing it in /tmp on lxplus027
[sosen@lxplus027 LOCALINDET]$ pwd
/tmp/LOCALINDET
setupATLAS
lsetup git
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git

#Adding packages InDetExample and InDetRecTools
cd athena
git atlas addpkg InDetExample
git atlas addpkg InDetRecTools
[sosen@lxplus027 athena]$ ls
InnerDetector Projects
[sosen@lxplus027 athena]$ ls InnerDetector/
InDetExample InDetRecTools


#start a dev branch from 21.0
git fetch upstream
git checkout -b CTIDENNtiming upstream/21.0 --no-track


#build (need to rebuild when this whole sparse checkout is moved to /eos)
cd ..
mkdir build && cd build
asetup Athena, 21.0.34 #21.0.82 (latest)
cmake ../athena/Projects/WorkDir/
make -j9
source */setup.sh

#test run for now in /tmp (rerun when moved to /eos)
cd ..
mkdir run_test && cd run_test
Reco_tf.py --AMI q431
#######RUNTIME ERROR MESSAGE:
04:46:30 ************************************************************************************
04:46:30 Py:Athena INFO including file "InDetRecExample/InDetRecCaloSeededROISelection.py"
04:46:32 Shortened traceback (most recent user call last):
04:46:32 File "/cvmfs/atlas.cern.ch/repo/sw/software/21.0/Athena/21.0.34/InstallArea/x86_64-slc6-gcc62-opt/jobOptions/RecJobTransforms/skeleton.RAWtoESD_tf.py", line 199, in <module>
04:46:32 else: include( "RecExCommon/RecExCommon_topOptions.py" )
04:46:32 File "/cvmfs/atlas.cern.ch/repo/sw/software/21.0/Athena/21.0.34/InstallArea/x86_64-slc6-gcc62-opt/jobOptions/RecExCommon/RecExCommon_topOptions.py", line 695, in <module>
04:46:32 include ("RecExCommon/SystemRec_config.py")
04:46:32 File "/cvmfs/atlas.cern.ch/repo/sw/software/21.0/Athena/21.0.34/InstallArea/x86_64-slc6-gcc62-opt/jobOptions/RecExCommon/SystemRec_config.py", line 25, in <module>
04:46:32 protectedInclude( "InDetRecExample/InDetRec_jobOptions.py" )
04:46:32 File "/tmp/LOCALINDET/build/x86_64-slc6-gcc62-opt/jobOptions/InDetRecExample/InDetRec_jobOptions.py", line 153, in <module>
04:46:32 if not rec.doAODMerging():
04:46:32 AttributeError: JobPropertyContainer:: JobProperties.Rec does not have property doAODMerging
04:46:32 Py:Athena INFO leaving with code 8: "an unknown exception occurred"

To fix the above problem, trying the latest athena, since there are on the order of 1000 changes between 21.0.34 and 21.0.82 (latest at the moment).

FAILED ATTEMPT:

[sosen@lxplus027 athena]$ ls InnerDetector/
InDetExample InDetRecTools
[sosen@lxplus027 athena]$ git pull origin 21.0
From https://gitlab.cern.ch:8443/sosen/athena
* branch 21.0 -> FETCH_HEAD
Already up-to-date.

#build (need to rebuild when this whole sparse checkout is moved to /eos)
cd ..
mkdir build && cd build

[sosen@lxplus027 build]$ asetup Athena, 21.0.82 #(latest), perhaps can use Athena, 21.0, latest in future
AtlasSetup(ERROR): No release candidates found in:
/cvmfs/atlas.cern.ch/repo/sw/software/21.0/Athena
/cvmfs/atlas.cern.ch/repo/sw/software/21.0/AtlasOffline
/cvmfs/atlas.cern.ch/repo/sw/software/21.0/AtlasProduction
#perhaps 21.0.82 not yet out, so using latest

asetup Athena, 21.0, latest #it is .82, but the above command doesn't work for some reason, moving on


cmake ../athena/Projects/WorkDir/
make -j9
######COMPILATION ERROR:
...
[ 50%] Generating ../../../x86_64-slc6-gcc62-opt/python/InDetSLHC_Example/SLHC_Setup_InclBrl.pyc
/bin/sh: line 1: 27640 Segmentation fault /cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase/x86_64/Cmake/3.11.0/Linux-x86_64/bin/cmake -E create_symlink ../../../athena/InnerDetector/InDetExample/InDetSLHC_Example/data/slhcsct_local_database_SLHC-06-03.txt /tmp/LOCALINDET/BAK/build/x86_64-slc6-gcc62-opt/share/slhcsct_local_database_SLHC-06-03.txt
make[2]: *** [x86_64-slc6-gcc62-opt/share/slhcsct_local_database_SLHC-06-03.txt] Error 139
make[2]: *** Waiting for unfinished jobs....
[ 50%] Generating ../../../x86_64-slc6-gcc62-opt/jobOptions/InDetRecExample/InDetRecLoadTools.py
[ 50%] Generating ../../../x86_64-slc6-gcc62-opt/jobOptions/InDetRecExample/InDetRecNtupleCreation.py
[ 50%] Generating ../../../x86_64-slc6-gcc62-opt/jobOptions/InDetAlignExample/inputData_minbias.txt
make[1]: *** [InnerDetector/InDetExample/InDetSLHC_Example/CMakeFiles/InDetSLHC_ExampleRuntimeInstall.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
[ 50%] Generating ../../../x86_64-slc6-gcc62-opt/python/InDetSLHC_Example/SLHC_Setup_InnerInclined.pyc
...
[ 63%] Generating ../../../x86_64-slc6-gcc62-opt/jobOptions/InDetSLHC_Example/test_G4AtlasGeo_SLHC_test.py
[ 63%] Built target InDetSLHC_ExampleJobOptInstall
[ 63%] Built target InDetSLHC_ExampleXmlInstall
make: *** [all] Error 2




##with a fresh shell get the following compilation error:
[ 48%] Generating ../../../x86_64-slc6-gcc62-opt/python/InDetTrigRecExample/InDetTrigCommonTools.py
make[2]: *** [x86_64-slc6-gcc62-opt/share/jobOptions_RTT_InDetTrigRecExample_backTracking.py] Segmentation fault
make[1]: *** [InnerDetector/InDetExample/InDetTrigRecExample/CMakeFiles/InDetTrigRecExampleRuntimeInstall.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
[ 48%] Generating ../../../x86_64-slc6-gcc62-opt/python/InDetSLHC_Example/SLHC_Setup_InnerInclined.pyc
...
[ 77%] Built target InDetSLHC_ExampleRuntimeInstall
make: *** [all] Error 2

##with a fresh shell 
make #without -j9 it compiles => some race condition during parallel compilation; maybe -j4 would work fine, check later
source */setup.sh

Reco_tf.py --AMI q431

Trying without InDetExample:


[sosen@lxplus027 LOCALINDET]$ pwd
/tmp/LOCALINDET

mv athena/ athena_bak/
mkdir athena && cd athena
lsetup git
git atlas init-workdir https://:@gitlab.cern.ch:8443/atlas/athena.git
 
#Adding packages InDetRecTools
cd athena
git atlas addpkg InDetRecTools
[sosen@lxplus027 athena]$ ls InnerDetector/
InDetRecTools

#start a dev branch from 21.0
git fetch upstream
git checkout -b CTIDENNtiming upstream/21.0 --no-track
#build (need to rebuild when this whole sparse checkout is moved to /eos)
cd ..
mkdir build && cd build
asetup Athena, 21.0.34
cmake ../athena/Projects/WorkDir/
make -j9
source */setup.sh
#test run for now in /tmp (rerun when moved to /eos)
cd ..
mkdir run_test2 && cd run_test2
Reco_tf.py --AMI q431
Topic revision: r19 - 2019-01-07 - SouravSen
 