Week of 190204

WLCG Operations Call details

  • For remote participation we use the Vidyo system. Instructions can be found here.

General Information

  • The purpose of the meeting is:
    • to report significant operational issues (i.e. issues which can or did degrade experiment or site operations) which are ongoing or were resolved after the previous meeting;
    • to announce or schedule interventions at Tier-1 sites;
    • to inform about recent or upcoming changes in the experiment activities or systems having a visible impact on sites;
    • to provide important news about the middleware;
    • to communicate any other information considered interesting for WLCG operations.
  • The meeting should run from 15:00 Geneva time until 15:20, exceptionally until 15:30.
  • The SCOD rota for the next few weeks is at ScodRota
  • General information about the WLCG Service can be accessed from the Operations Portal
  • Whenever a particular topic that requires information from sites or experiments needs to be discussed at the operations meeting, it is highly recommended to announce it by email to wlcg-scod@cern.ch, so that the SCOD can make sure the relevant parties have time to collect the required information, or can invite the right people to the meeting.

Best practices for scheduled downtimes


Attendance

  • local: Borja (Chair, Monitoring), Gavin (Computing), Julia (WLCG), Maarten (ALICE), Priscilla (ATLAS), Vladimir (LHCb)
  • remote: Di (TRIUMF), Dave M (FNAL), Dmytro (NDGF), John (RAL), Jose (PIC), Marcelo (CNAF), Xin (BNL)

Experiments round table:

  • ATLAS reports -
    • Smooth running at 300-325k job slots. T0 slots steady between 22.5k and 23k.
    • nothing else major to report

  • CMS reports -
    • This week is CMS week; likely no one from CMS will join the call.
    • Very good CPU utilization: ~200k cores production, ~50k cores analysis
    • No major problems

  • ALICE -
    • NTR

LHCb pointed out the site issues and mentioned that some tickets are still open even though the underlying problems look solved. Maarten explained that the first priority is always to address the problem, so closing tickets may be delayed by other tasks. LHCb will ping the tickets as a reminder that the issues are solved.

There were some updates on the RAL failures during the meeting; they are all reflected in the above-mentioned GGUS ticket.

Sites / Services round table:

  • ASGC: NC
  • BNL:
    • After dCache was upgraded to version 4.2, data deletion runs much faster; this long-standing issue is solved.
    • we will reduce our SCRATCHDISK space by half (~800TB), per ADC request.
  • EGI: NC
  • IN2P3: NTR
  • KIT: NTR
  • NDGF: Reduced bandwidth to the site tomorrow due to OTG:0048051
  • NL-T1:
  • NRC-KI: NC
  • OSG: NC
  • PIC: Nothing to add to LHCb site issue report
  • RAL: NTR

  • CERN computing services:
    • OTG:0046088: LSF public decommissioning last Wednesday - now closed.
    • OTG:0047300: HTCondor is now at ~30% CC7, proceeding with draining to get to 50% as per:
      • end March 2019: 50% public/grid will be CC7
      • 2nd April 2019: lxplus.cern.ch alias changes to CC7 (the lxplus6 service will remain accessible at lxplus6.cern.ch); the default HTCondor target changes to CC7 for local submission.
      • early June 2019: remainder of capacity will have been migrated.
      • (Dedicated shares handled separately.)
    • OTG:0048002: ce511, ce512, ce513 and ce514 now default to CC7.

  • CERN storage services: NC
  • CERN databases: NC
  • Monitoring:
    • Final availability reports for December 2018 sent
    • Draft availability reports for January 2019 sent
  • MW Officer: NC
  • Networks: NTR
  • Security: NTR


Topic revision: r17 - 2019-02-04 - MaartenLitmaath