CRABPrepareLocal

Revision 5 2019-09-08 - StefanoBelforte

META TOPICPARENT name="https://twiki.cern.ch/twiki/bin/view/CMSPublic/SWGuideCrab"
 

Submitting jobs to the CERN HTCondor pool

 This twiki explains how to use the crab preparelocal command to send jobs to the CERN HTCondor pool (http://batchdocs.web.cern.ch/batchdocs/local/submit.html).

Preliminary setup

Once you have set up the environment, you need to create the CRAB project directory for your task. If you

  • have already submitted the task, you can
    • simply cd to the project directory created at submission time,
    • or recreate it with the crab remake command;
  • have not yet submitted the task, submit it with the --dryrun option.

Once the CRAB project directory is created, execute

crab preparelocal --dir=<PROJECTDIR>
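
For reference, a complete session might look like the following (taken from an earlier revision of this page; the CMSSW release and task name are examples, substitute your own):

```shell
# Example session; the CMSSW release and task name below are placeholders
# from an earlier revision of this page -- substitute your own.
cmsrel CMSSW_8_0_29   # CMSSW version taken from https://cmsweb.cern.ch/crabserver/ui
cd CMSSW_8_0_29
cmsenv
source /cvmfs/cms.cern.ch/crab3/crab.sh
mkdir -p ~/wf/test
cd ~/wf/test
crab remake --task=180522_171053:atittert_crab_nmssmP1SignalCascadeV01_13TeV2016_processMc01_ed80X_P1_v1_1600sq_1610go_180X2
crab preparelocal --dir=crab_nmssmP1SignalCascadeV01_13TeV2016_processMc01_ed80X_P1_v1_1600sq_1610go_180X2
cd crab_nmssmP1SignalCascadeV01_13TeV2016_processMc01_ed80X_P1_v1_1600sq_1610go_180X2/local
```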
 

HTCondor submission

Add #!/bin/bash as the first line of the run_job.sh file in the <PROJECTDIR>/local directory and make it executable.
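
This step can be done with a one-liner (a sketch assuming GNU sed, as available on lxplus; the run_job.sh created here is a stand-in for the real file in <PROJECTDIR>/local):

```shell
# Demonstration on a stand-in run_job.sh; in practice run this inside
# <PROJECTDIR>/local on the file that crab preparelocal created.
printf 'echo "job payload"\n' > run_job.sh
sed -i '1i #!/bin/bash' run_job.sh   # prepend the shebang (GNU sed syntax)
chmod +x run_job.sh                  # make it executable
head -1 run_job.sh                   # prints: #!/bin/bash
```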

Create a "batch" subdirectory in order to keep HTCondor files separated:

mkdir <PROJECTDIR>/local/batch
cd <PROJECTDIR>/local/batch
 
and place the following task.jdl file therein (maybe this can be automated by preparelocal?):
Universe = vanilla
Executable = ../run_job.sh
Arguments = $(I)
Log = log/job.log.$(Cluster).$(Process)
Output = out/job.out.$(Cluster).$(Process)
Error = err/job.err.$(Cluster).$(Process)
transfer_input_files = ../CMSRunAnalysis.sh, ../CMSRunAnalysis.tar.gz, ../InputArgs.txt, ../Job.submit, ../cmscp.py, ../gWMS-CMSRunAnalysis.sh, ../input_files.tar.gz, ../run_and_lumis.tar.gz, ../sandbox.tar.gz
should_transfer_files = YES
RequestCpus = 1
RequestMemory = 2000
when_to_transfer_output = ON_EXIT
+JobFlavour = "workday"
Queue I from (
1
2
3
4
)
This configuration will submit only the first 4 jobs of the task.
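
To submit every job in the task instead, one option is to build the job-id list from InputArgs.txt, which (at the time of writing) contains one line of arguments per job. A sketch (the printf just fabricates a 6-job InputArgs.txt for illustration; the real file is in <PROJECTDIR>/local):

```shell
# Sketch: one job id per line of InputArgs.txt, for "Queue I from joblist.txt".
# The printf below fabricates a 6-job InputArgs.txt purely for illustration.
printf 'args1\nargs2\nargs3\nargs4\nargs5\nargs6\n' > InputArgs.txt
N=$(wc -l < InputArgs.txt)   # number of jobs in the task
seq 1 "$N" > joblist.txt     # job ids 1..N, one per line
cat joblist.txt
```

The last stanza of task.jdl would then read `Queue I from joblist.txt` instead of the inline list.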
 
Create the auxiliary directories and submit the task with:

mkdir out err log
condor_submit task.jdl
You can check the status with:

condor_q -nobatch
-- MarcoMascheroni - 2018-06-06

Revision 2 2018-06-11 - LeonardoCristella
Revision 1 2018-06-06 - MarcoMascheroni

 