A.1 Specific Information for Remote Sites and Institutions


Goals of this page:

This page is intended to provide links to CMS-affiliated, non-CERN, site-specific and institution-specific information that is essential for users working from these institutions to complete the tutorials and exercises in the workbook.

Introduction

Site administrators and/or system administrators are encouraged to provide any necessary information that deviates from the workbook instructions such that a CMS user working from your site/institution can accomplish all the tasks in the workbook. If the information is brief, you may include it directly in the current page. Otherwise, please provide links on this page pointing to the information. The information may reside on web pages that you maintain elsewhere, or you may create new workbook pages from this page. For the latter, see the twiki formatting help (available from any editing screen) and WorkBookContributors.

Examples include but are not limited to:

  • how to get computing privileges and accounts at the site/institution,
  • login information to local clusters that maintain the CMS environment,
  • information about grid resources (besides LCG) that are available, and so on.
We organize the sites/institutions by affiliation with Tier-1 sites.


CERN

Various facilities are provided at CERN for CMS use. Please see the facility page to find out which are available to you. A set of tools is available for interacting with the CERN storage systems.

U.S. Tier-1 site affiliates

U.S. CMS is a collaboration of US scientists participating in the CMS experiment. The CMS T1 site in the U.S. is at Fermilab (also known as FNAL), which houses the LHC Physics Center, a location for CMS physicists to find experts on all aspects of data analysis, particle ID, software, and event processing within the US, during hours convenient for U.S.-based physicists.

Find USCMS-specific software and computing information at User Computing. We collect some of the information from the USCMS web site here for convenience.


Fermilab

Getting a Fermilab Account

You must be registered with Fermilab before you can get any accounts. NOTE: registration can take 4-6 weeks to process, for both new registrations and yearly renewals, so be sure to follow the instructions emailed to you. The web page How to get a CMS Computing Account at Fermilab outlines all the steps needed to obtain a CMS-specific account at Fermilab, including Fermilab registration, and points you to the online forms that must be filled out. In the comments of your request, ask for the same account name at both CERN and Fermilab.

Login to CMSLPC

The most up-to-date instructions on how to get access to the cmslpc-sl7 cluster and how to configure your /etc/krb5.conf and ~/.ssh/config are given at How to get access to the (CMSLPC) cluster.
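The linked page gives the exact settings to use; as a rough sketch, the kind of ~/.ssh/config stanza such instructions describe looks something like the following (the host pattern and options here are illustrative assumptions, not copied from the linked page, so always defer to it):

```
# Illustrative sketch only -- check the linked access page for the
# site's actual recommended settings.
Host cmslpc*.fnal.gov
    GSSAPIAuthentication yes
    GSSAPIDelegateCredentials yes
```

With valid Kerberos credentials (obtained via kinit), options like these let ssh authenticate to the cluster without prompting for a password.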

You can set up the CMSSW environment with the following command (tcsh shell) every time you log in (or add it to your ~/.tcshrc or ~/.bash_profile so it is done automatically upon login):

source /cvmfs/cms.cern.ch/cmsset_default.csh

or, if you have changed your default shell to bash:

source /cvmfs/cms.cern.ch/cmsset_default.sh
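Making this automatic amounts to appending the source line to your profile once; a minimal sketch, assuming bash is your login shell (the profile path is the conventional one, adjust if yours differs):

```shell
# Minimal sketch (assumes bash): append the CMS setup line to
# ~/.bash_profile only if it is not already there, so repeated
# runs do not create duplicate lines.
PROFILE="${HOME}/.bash_profile"
LINE='source /cvmfs/cms.cern.ch/cmsset_default.sh'
grep -qxF "$LINE" "$PROFILE" 2>/dev/null || echo "$LINE" >> "$PROFILE"
```

After this, the CMS environment is set up at every login without typing the source command by hand.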

Then you can set up your CMSSW environment as you would elsewhere.

Only if not set up previously, create a new CMSSW release area:

cmsrel CMSSW_11_0_1

To set the CMSSW environment:

cd CMSSW_11_0_1/src
cmsenv

Mass Storage

Users generally have an LPC FNAL EOS area created automatically with their account (eosls /store/user/YourUsername), but it is not automatically linked to your personal grid certificate, and is therefore not usable with CRAB by default. You will need to follow the instructions at Get your EOS area linked to your CERN username (https://uscms.org/uscms_at_work/computing/LPC/usingEOSAtLPC.shtml#createEOSArea); note that this step can take up to 1 business day to take effect.
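As a small illustration of the path convention (the username is a placeholder; eosls and xrdcp only work on cmslpc, so they appear here as comments):

```shell
# Minimal sketch: build the LPC EOS path for a (placeholder) username.
USERNAME="yourname"
EOS_PATH="/store/user/${USERNAME}"
echo "$EOS_PATH"
# On cmslpc you would then list the area with:
#   eosls "$EOS_PATH"
# and copy a file in over xrootd with:
#   xrdcp local.root "root://cmseos.fnal.gov/${EOS_PATH}/local.root"
```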

Grid computing

The US CMS grid is part of the Worldwide LHC Computing Grid used for LHC data analysis. The US CMS grid environment is part of the Open Science Grid (OSG) infrastructure. The instructions on the page Starting on the GRID are valid for Fermilab users.

For information on grid certificates, Fermilab users should obtain CERN CA personal certificates.

CRAB

As at other sites, you have to set up the Grid UI, your CMSSW release, and then CRAB, but the commands are slightly different; they are listed here.

CRAB, as set up at FNAL, has a couple of extra features that you may find useful. FNAL users can have their jobs run exclusively on the LPC Tier-3 (T3_US_FNALLPC), which may in some cases allow jobs to start faster: the pool of users requesting time on the Tier-3 is typically smaller than the global pool, and LPC users are the only ones allowed to run CRAB jobs on T3_US_FNALLPC. Only a few modifications need to be made to the user's CRAB configuration file for jobs to run there:

config.Data.ignoreLocality = True
config.Site.whitelist = ['T3_US_FNALLPC']
config.Site.ignoreGlobalBlacklist = True
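For context, here is a rough sketch of where these lines sit in a complete CRAB configuration file; everything except the three LPC-specific lines is a placeholder assumption, not something prescribed by this page:

```
from CRABClient.UserUtilities import config
config = config()

config.General.requestName = 'myTask'        # placeholder task name
config.JobType.pluginName  = 'Analysis'
config.JobType.psetName    = 'myConfig.py'   # placeholder cmsRun configuration

config.Data.inputDataset   = '/MyDataset/MyEra/MINIAOD'  # placeholder dataset
config.Data.ignoreLocality = True            # allow running away from the data

config.Site.storageSite           = 'T3_US_FNALLPC'
config.Site.whitelist             = ['T3_US_FNALLPC']
config.Site.ignoreGlobalBlacklist = True
```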

More information about this can be found on the CRAB3HATS2016 twiki page and the Indico page for the same event.

Batch system

The batch system available to users of the CMSLPC cluster is HTCondor, which allows users to submit jobs into the production farm. Its use is described on the following page: Batch System. Note that you will need to have Fermilab link your personal grid certificate in the CMS VO to your Fermilab account in order to use the batch system; to do so, follow the instructions at link your grid certificate to Fermilab account (needed for both FNAL EOS and condor batch). If you are submitting cmsRun jobs, it is recommended that you use the CRAB mechanism described above, as it already does everything correctly for you and is much easier, both for users and for the support group.
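As an illustration of what a minimal HTCondor submit file might look like on such a system (the script name and memory request are placeholder assumptions; see the Batch System page for the site's actual recommendations):

```
universe       = vanilla
executable     = myjob.sh            # placeholder: your job script
output         = job_$(Cluster).out
error          = job_$(Cluster).err
log            = job_$(Cluster).log
request_memory = 2100
queue 1
```

You would submit this with condor_submit and monitor it with condor_q.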

Getting help at Fermilab

Go to http://lpc.fnal.gov/computing/gethelp.shtml to find out how to get help with computing at the LPC CAF at Fermilab, in particular you can open a LPC Service Portal ticket.

Germany Tier-1 site affiliates

This information is included from the original FSP-CMS Analysis Support page, CMS.GermanUsersComputingSupport (note that access to that page is restricted).

DESY Hamburg

Information about local computing resources at DESY Hamburg can be found in LocalComputingIssuesHamburg.

Italy Tier-1 site affiliates

UK Tier-1 site affiliates

Imperial College London

Complete information for users wishing to work from Imperial College London can be found in ImperialCollegeLondonWorkBook.

Login platforms at OSG Tier-2s

Some of the OSG Tier-2s provide a login platform for their "local users". To get access to this login platform you should email the site contact listed in CMS.SiteDB.

CVMFS should be available at all OSG sites. To get started, you generally need to set up your environment first by running one of the following, depending on the shell you are working in:

source $OSG_APP/cmssoft/cms/cmsset_default.csh
source $OSG_APP/cmssoft/cms/cmsset_default.sh

Review status

Reviewer/Editor and Date: Comments
MargueriteTonjes - 02-Jul-2021: Major changes for Fermilab; point to uscms.org documents, which are actively kept up to date, and fix links
JohnStupak - 15-Sep-2013: Review with minor changes
FedorRatnikov - 10-Feb-2010: Added DCMS details
Main.fkw - 22-Jul-2007: Added info on OSG Tier-2 login platforms
AnneHeavey - 03-Aug-2006: More additions to FNAL info
JennyWilliams - 05-Dec-2006: Tidied up a bit
Main.gartung - 23-May-2007: Updated instructions for running at Fermilab
AlanStone - 14-Oct-2008: Updated links to new USCMS & FNAL computing docs

Responsible: SudhirMalik
Last reviewed by: SudhirMalik - 26 Nov 2008 (FNAL)

Topic revision: r54 - 2021-07-02 - MargueriteTonjes

