Using CRAB to Submit Jobs on a PBS Cluster

Prerequisites

  • CRAB version 2.7.0 or higher;
  • The pbs_python module installed (see the separate instructions on how to install this module);

Running

  • In the crab.cfg configuration file, you only need to set, under the [CRAB] section:
    scheduler = pbs
    use_server = 0

  • In the [PBS] section of crab.cfg, the following optional parameters are available:
    • queue = pbs_queue_to_use
      (setting this to default uses the default queue of your local PBS configuration)
    • resources = resource_1=value,resource_2=value,...
    • A concrete example (for a particular local batch queue) is:

[PBS]
queue=prod
resources=cput=12:00:00,walltime=12:00:00,vmem=4gb,mem=1gb

By default, both the wrapper stdout/stderr and the job script output files are placed in your crab_*/res directory.
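Putting the pieces together, a minimal crab.cfg for PBS submission might look like the sketch below. The dataset path, parameter-set file, and output file name are placeholders for illustration, not values from this page:

```
[CRAB]
jobtype = cmssw
scheduler = pbs
use_server = 0

[CMSSW]
# Placeholder values -- replace with your own dataset and CMSSW configuration.
datasetpath = /MyDataset/MyProcessing/RECO
pset = my_analysis_cfg.py
output_file = output.root

[USER]
return_data = 1

[PBS]
queue = prod
resources = cput=12:00:00,walltime=12:00:00,vmem=4gb,mem=1gb
```

With such a configuration, jobs are then created and submitted with the usual CRAB commands (crab -create, crab -submit), and the output is retrieved with crab -getoutput.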

More info

If the job output (stdout etc.) does not go to the directory specified by the ui_working_dir parameter in your crab.cfg, adding the following lines to your .login file (or the equivalent if you use bash) may help. This works at some sites, but may not work at all of them:

# If this is a PBS job, the PBS_JOBCOOKIE variable determines where the output goes, while TMPDIR is set from ui_working_dir.
if ( $?PBS_JOBID && $?TMPDIR ) then
  setenv PBS_JOBCOOKIE $TMPDIR
endif
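For bash users, a sketch of the equivalent workaround (assuming the same PBS_JOBCOOKIE/TMPDIR behaviour as in the csh version above) could go in ~/.bash_profile:

```shell
# If this is a PBS job, point PBS_JOBCOOKIE at TMPDIR, which is set
# from ui_working_dir (same workaround as the csh version above).
if [ -n "$PBS_JOBID" ] && [ -n "$TMPDIR" ]; then
  export PBS_JOBCOOKIE=$TMPDIR
fi
```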

Additional information can be found in these HN threads.

-- MarcoCalloni - 13-Oct-2009

Topic revision: r8 - 2012-08-29 - IsidroGonzalezCaballero
 