Open computing and software tasks

These tasks are part of the LS1 tasks list. If you are interested in any of these tasks, please contact Marco Cattaneo, Tim Gershon and Patrick Koppenburg.

Implementation of MicroDST formats for MC data

Task name: Implementation of MicroDST formats for MC data
Lead (tools, or other) group: Software/Computing
Other relevant groups (if any): Simulation, all physics WGs interested in large-statistics MC simulations
Task description: Currently the full DST is written for MC events (~400 kB/event). A factor 10 reduction should be achievable by removing information that is not needed (e.g. the HepMC event) and adopting the tricks used for the real-data microDST to save only the relevant MC truth information. Since the relevant information may be analysis dependent, rather than a single format one should provide tools that allow the format to be customised for each production.
Estimated total effort required (FTE): ~1 FTE for a few months of programming. This part is done. Dedicated contacts are now needed in each WG to provide requirements and feedback.
Deadline: ASAP
People/groups currently involved: B2OC WG, Chris Jones
New effort required? Yes
Other comments: This is an extremely useful tool for simulation, where usable statistics are currently limited by storage. TG/PK: Need to understand use cases; interaction with PAWGs.
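The per-production customisation described above can be sketched generically. This is an illustrative sketch only: the function name, branch keys and configuration set below are hypothetical, not part of the LHCb software stack.

```python
# Hypothetical sketch of per-production MC-truth slimming: keep only the
# branches a working group has requested, drop everything else (e.g. the
# large generator-level HepMC record). All names here are illustrative.

def slim_event(event: dict, keep: set) -> dict:
    """Return a copy of the event containing only the requested branches."""
    return {branch: data for branch, data in event.items() if branch in keep}

# A per-production configuration: each WG lists the MC-truth branches it
# needs; the slimming tool drops the rest.
B2OC_CONFIG = {"MCParticles", "MCVertices", "RecSummary"}

full_event = {
    "HepMC": "...",        # generator-level record, large
    "MCParticles": "...",
    "MCVertices": "...",
    "RecSummary": "...",
    "RawBanks": "...",
}

micro_event = slim_event(full_event, keep=B2OC_CONFIG)
print(sorted(micro_event))  # ['MCParticles', 'MCVertices', 'RecSummary']
```

Making the kept-branch set a per-production configuration, rather than hard-coding one format, is what allows each WG to trade event size against the MC truth detail it actually needs.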

Software validation

Task name: Software validation
Lead (tools, or other) group: Software/Computing
Other relevant groups (if any): All physics analysis WGs, data quality
Task description: For each release to be used in production, run an extensive set of physics-related checks to ensure that there are no regressions in the physics performance.
Estimated total effort required (FTE): 0.2 FTE
Deadline: Continuous. Significant effort required in the build-up to the 2015 data taking.
People/groups currently involved: ???
New effort required? Yes
Other comments: Requires good knowledge of the LHCb software and close interaction with the core software team as well as with the physics WGs (via liaisons). TG/PK: Not clear how to organise this; further discussion needed. Validation must cover both Brunel and DaVinci (cf. the incremental restripping discussion). Additional information from Marco: the CMS job advert below gives a good description of what I had in mind when proposing an equivalent activity in LHCb, and could be cut and pasted almost as is for one of your common tasks. Some parts of what is described could be provided by the core software team (e.g. tools to measure CPU and memory usage), but the main work of coordinating the physics performance validation requires new blood.
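The kind of release-to-release regression check described above can be sketched generically. All names, values and the tolerance below are hypothetical, not taken from any LHCb validation tool.

```python
# Hypothetical sketch of a release-validation check: compare summary physics
# quantities from a candidate release against a reference release and flag
# anything that moved beyond a relative tolerance.

import math

def check_regressions(reference: dict, candidate: dict, rel_tol: float = 0.01):
    """Return the names of quantities that moved by more than rel_tol."""
    failures = []
    for name, ref_value in reference.items():
        new_value = candidate.get(name)
        if new_value is None or not math.isclose(new_value, ref_value, rel_tol=rel_tol):
            failures.append(name)
    return failures

# Illustrative numbers only.
reference = {"track_efficiency": 0.962, "pid_kaon_eff": 0.881, "mass_resolution_mev": 7.9}
candidate = {"track_efficiency": 0.961, "pid_kaon_eff": 0.850, "mass_resolution_mev": 7.9}

print(check_regressions(reference, candidate))  # ['pid_kaon_eff']
```

In practice such checks would run over reference samples for each production release, with the per-quantity tolerances set in consultation with the physics WGs.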

CMS advert:

You will be responsible for the validation of the reconstruction and physics analysis code, playing a leading role in the Physics data and Monte-Carlo Validation (PdmV) group within the new Physics Performance and Datasets (PPD) level-1 project of the Compact Muon Solenoid (CMS) experiment.

The PdmV group was created at the beginning of 2012 to manage the complex validation of new CMS software releases for data reprocessing as well as for the Monte Carlo production campaigns. It is instrumental in coordinating the evaluation of physics performance and the validation of Monte Carlo and data samples coming from (pre-)production campaigns (RelVals), prompt reconstruction, re-reconstructions and skims. The work is done in close collaboration with the corresponding experts for the detectors (DPGs), physics objects (POGs), and the physics analysis groups (PAGs).

You will:

  • Coordinate the physics validation campaigns with the developers, mainly the physicists in the DPGs, POGs and PAGs.
  • Provide the developers with tools and methods to check the physics performance of the algorithms used, and help to improve the design of the analysis objects towards better maintainability of their code.
  • Extensively test and validate newly developed algorithms and evaluate their impact on CPU and memory resource usage, as well as their physics output, aiming at improving the overall performance of the physics code.
  • Evaluate the physics content and output of the DQM modules from the physics groups to optimize their usage as a validation tool.

This work will be done in close collaboration with the corresponding developers and physics groups.

Your main activities will consist of:

  • Co-coordinating the Physics Data and Monte-Carlo Validation (PdmV) project in the CMS Physics Performance and Dataset (PPD) level-1 project.
  • Coordinating with the various teams to ensure that the validation of the various workflows for data and Monte Carlo is done in a timely manner, and that the results are evaluated for their physics performance.

More tasks

All computing project activities fall under this hat. Pete has agreed to make a comprehensive review of manpower in computing during the coming months.
Topic revision: r2 - 2013-09-05 - PatrickSKoppenburg