Week of 180402

WLCG Operations Call details

  • For remote participation we use the Vidyo system. Instructions can be found here.

General Information

  • The purpose of the meeting is:
    • to report significant operational issues (i.e. issues which can or did degrade experiment or site operations) which are ongoing or were resolved after the previous meeting;
    • to announce or schedule interventions at Tier-1 sites;
    • to inform about recent or upcoming changes in the experiment activities or systems having a visible impact on sites;
    • to provide important news about the middleware;
    • to communicate any other information considered interesting for WLCG operations.
  • The meeting should run from 15:00 Geneva time until 15:20, exceptionally to 15:30.
  • The SCOD rota for the next few weeks is at ScodRota
  • General information about the WLCG Service can be accessed from the Operations Portal
  • Whenever a topic to be discussed at the operations meeting requires information from sites or experiments, it is highly recommended to announce it by email to wlcg-scod@cern.ch, so that the SCOD can make sure the relevant parties have time to collect the required information, or can invite the right people to the meeting.

Best practices for scheduled downtimes

Monday: Easter Monday holiday

  • The meeting will be held on Tuesday instead.



Attendance:
  • local: Herve (storage), Maarten (SCOD + ALICE), Vincent (security)
  • remote: Christoph (CMS), Di (TRIUMF), Jens (NDGF), Xin (BNL)

Experiments round table:

  • CMS reports -
    • Good CPU utilization
      • ~180k cores for production
      • ~50k cores for analysis
    • No major issues
    • Started first round of transfer tests from CERN to T1 tape endpoints
      • Progress is being tracked in various GGUS tickets

  • ALICE -
    • Normal activity levels on average again for about the past week

Sites / Services round table:

  • ASGC:
  • BNL: NTR
  • CNAF:
  • EGI:
  • FNAL: Downtime for disk and tape storage tomorrow from 8 AM to 5 PM FNAL time, to catch up on kernel and system updates.
  • IN2P3: NTR
  • KISTI:
  • KIT:
    • Deploying storage milestone resources:
      • ALICE: the file system extension will most likely happen sometime this week.
      • ATLAS: the ATLASDATADISK space reservation will be increased by 400 TB (to ≈ 8.2 PB) once the internal data rebalancing has concluded.
      • CMS: 400 TB were added to the disk buffer for the T1, 1 PB for the T3.
      • LHCb: a couple of new servers need to be prepared first, since their current storage system can no longer be extended.
    • We noticed last week that CMS tried to access files on our tape archive that had been declared lost in Nov 2017 (GGUS:132078). It seems that the recovery procedure on the CMS side (re-import from another site) failed in this instance.
  • NL-T1:
  • NRC-KI:
  • OSG:
  • PIC:
  • RAL:

  • CERN computing services:
  • CERN storage services:
    • EOS (and CASTOR) were affected by a router incident that corrupted packets (OTG:0043178). A list of files written during the incident will be sent to the experiments so that they can check whether those files are corrupted.
  • CERN databases:
  • GGUS:
    • Over the Easter holidays the Tomcat server of the GGUS Remedy system went down twice (29.03 at 22:40 UTC and 31.03 at 20:30; it is not yet clear why).
      • The GGUS team was informed about this by the on-call service early the next morning and took care of it.
    • Today from 06:40 to 08:15 GGUS was not reachable: a network switch was broken and had to be replaced.
  • Monitoring:
  • MW Officer:
  • Networks:
  • Security:
    • EGI and OSG advisories were sent regarding the configuration of Singularity
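The corruption check mentioned under CERN storage services boils down to recomputing each suspect file's checksum and comparing it against the catalogue value. A minimal sketch, assuming Adler-32 checksums (as commonly used by EOS) and a hypothetical list of (path, expected checksum) pairs parsed from the file list sent to the experiments; the function names are illustrative, not part of any experiment's actual tooling:

```python
import zlib


def adler32_of(path, chunk_size=1 << 20):
    """Compute the Adler-32 checksum of a file, reading it in chunks."""
    value = 1  # Adler-32 initial value
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            value = zlib.adler32(chunk, value)
    return format(value & 0xFFFFFFFF, "08x")


def find_corrupted(suspect_files):
    """Return the paths whose recomputed checksum differs from the expected one.

    suspect_files: iterable of (path, expected_checksum) pairs, e.g. parsed
    from the incident file list (format here is purely hypothetical).
    """
    return [path for path, expected in suspect_files
            if adler32_of(path) != expected.lower()]
```

Files flagged by such a check would then need to be re-replicated from another copy or declared lost, as in the KIT/CMS tape case above.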


Topic revision: r10 - 2018-04-03 - MaartenLitmaath