• Title
CRAB3: Establishing a new generation of services for distributed analysis at CMS
• Abstract
In CMS Computing, the highest priorities for analysis tools are improving the end users' ability to produce and publish reliable samples and analysis results, and moving to a sustainable development and operations model.
To achieve these goals, CMS decided to incorporate analysis processing into the same framework as data and simulation processing.
This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core, which allows long-term maintainability as well as the standardization of operator interfaces.
The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system.
As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services and serves the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities.
In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves workflow automation and simplifies maintainability. In particular, we highlight the impact of the new design on daily operations.
• Author
Daniele Spiga (CERN)
• Co-authors
Mattia Cinquilli (CERN)
CMS DMWM Group
• Presentation type
parallel
--
SpigaDaniele - 20-Sep-2011