

This is the LHCb CERN LBD group page describing the small cluster lhcbt3

Local, batch, grid

What is lhcbt3?

lhcbt3 is a cluster of lxbatch machines dedicated to the CERN LBD group. With only 80 cores it is very small, much smaller than the T3 resources available at most other institutes.

What is lhcbt3 for?

lhcbt3 is for Lbd users to exploit as a middle ground between testing jobs locally and sending them out onto the grid.

It is not the grid (80 vs 80 thousand nodes), and in almost all use cases the grid is much better for whatever you are doing.

lhcbt3 is clearly complementary to the grid in the following cases:

  • Testing jobs on a batch system before Grid submission
  • Running executables which do not function on the Grid
  • Running code against the LHCb nightly builds, which are not available on the Grid
  • Running a small number of jobs which need an instantaneous turn-around
  • Running a small number of jobs which need oodles of CPU and cannot be broken down for the Grid
  • You're a short-term student or visitor without a grid certificate
  • You have no access to a similar institute resource

How do I get access?

All LBD group members should have access to lhcbt3; see the list here.

If not, get in touch with ThomasRuf, RogerForty or RobLambert.

If you are not in LBD but are collaborating with somebody who is, and/or you fulfil several of the use cases listed under "What is lhcbt3 for?", then you will probably be allowed temporary access to complete your tasks.

How do I use it?

- Submit to batch

Specify the lhcbt3 queue, either in ganga with j.backend=LSF(queue='lhcbt3') or on the command line with bsub -q lhcbt3.
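As a quick sketch, a trivial job script can be used to check the queue before submitting real work (the file name test_job.sh is hypothetical):

```shell
#!/bin/sh
# test_job.sh -- hypothetical minimal job script for checking the queue.
# Submit it from lxplus with:
#   bsub -q lhcbt3 -o test_job.out ./test_job.sh
# then inspect test_job.out once the job has run.
echo "lhcbt3 test job running on: $(hostname)"
```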

- Get an interactive session

You can request an interactive session, which has no CPU or time limit. To log on to an lhcbt3 node interactively, one has to submit a special kind of job from lxplus:

  • bsub -Is -q lhcbt3inter /bin/tcsh -l (note the first flag letter is a capital "i", while the one at the end is a lower-case "L")
  • bsub -Is -q lhcbt3inter -m MyPreferredNode /bin/tcsh -l to return to a specific node, MyPreferredNode

A convenient way to use this is by making an alias which then takes the node name as an argument, e.g.

  • alias clnode 'bsub -Is -q lhcbt3inter -m \!:1 /bin/tcsh -l'

Please limit the number of interactive sessions you open at any given time: the total number is limited, so you might otherwise prevent other users from accessing the cluster.

- Associated Castor space

30TB of disk is associated with lhcbt3. To get your hands on it, see CernLbdDiskSpace.

- Monitoring:


- Getpack doesn't work, or asks me for my password 11 times

You don't have a valid Kerberos ticket; run kinit to get a new one.
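A small sketch of the check (assuming the MIT Kerberos client tools, where klist -s is silent and exits non-zero when no valid ticket is cached):

```shell
#!/bin/sh
# Sketch: check for a valid Kerberos ticket before running getpack.
# Assumes the MIT Kerberos client tools (klist, kinit) are installed.
have_krb_ticket() {
    command -v klist >/dev/null 2>&1 && klist -s 2>/dev/null
}

if have_krb_ticket; then
    echo "kerberos: ticket OK"
else
    echo "kerberos: no valid ticket - run kinit"
fi
```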

- I can't access Castor files, I see an error about /tmp/x509up_u####

If you see an error like:

EventSelector                              INFO Stream:EventSelector.DataStreamTool_287 Def:   DATAFILE='root://' TYP='POOL_ROOTTREE' OPT='READ'
1148586304:error:02001002:system library:fopen:No such file or directory:bss_file.c:352:fopen('/tmp/x509up_u7255','r')
1148586304:error:20074002:BIO routines:FILE_CTRL:system lib:bss_file.c:354:
1148586304:error:140DC002:SSL routines:SSL_CTX_use_certificate_chain_file:system lib:ssl_rsa.c:720:
XrdSec: No authentication protocols are available.
Error in <TXNetFile::CreateXClient>: open attempt failed on root://

You need a grid proxy. Create one and all should be fine.
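As a sketch, you can check for the proxy file directly; the path /tmp/x509up_u<uid> matches the error above, while the proxy-creation commands in the comment depend on your environment and are assumptions:

```shell
#!/bin/sh
# Sketch: check whether a grid proxy file exists before opening Castor
# files over xrootd. To create one (depending on your setup), use e.g.
#   lhcb-proxy-init              (LHCbDirac)
#   voms-proxy-init -voms lhcb   (generic VOMS client)
proxy="/tmp/x509up_u$(id -u)"
if [ -r "$proxy" ]; then
    echo "grid proxy present: $proxy"
else
    echo "no grid proxy - create one first"
fi
```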

-- RobLambert - 11-Nov-2010

Topic revision: r7 - 2010-12-01 - RobLambert