Run 2 Production Management

This page provides specific information for the production management of data processing during LHC Run 2. For general information on production management see the Production Management Guide.

Before you start, upload your "lhcb_prmgr" proxy with lhcb-proxy-init -g lhcb_prmgr --upload

Which productions to set up

  • Per magnet polarity there are 6 requests to handle, which generate 8 transformations. If a new request needs to be made, the easiest approach is to "duplicate" (as lhcb_prmgr) an existing request that is close to the new one and make the necessary changes there.

  • Streams to be processed (2017)
      • 90000000 - FULL stream
        • RDST files uploaded to Tier1-Buffer
        • Stripping production to be launched; .DST and .MDST files uploaded to Tier1-DST
      • 96000000 - NOBIAS stream
        • FULL.DST files uploaded to Tier1-DST
      • 90600000 - LOWMULT stream
        • FULL.DST files uploaded to Tier1-DST
      • 94000000 - TURBO stream
        • set of *.MDST streams, merged and uploaded to Tier1-DST
      • 95100000 - TURCAL stream
        • FULLTURBO.DST files uploaded to Tier1-Buffer
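The stream list above can be summarized as a small lookup table. The following sketch is documentation only, not an LHCbDIRAC API; the names `STREAMS` and `destination` are illustrative:

```python
# Illustrative mapping of the 2017 event-type codes to stream names and
# output destinations, exactly as listed above. Not an LHCbDIRAC API.
STREAMS = {
    90000000: ("FULL",    {"RDST": "Tier1-Buffer", ".DST/.MDST (after stripping)": "Tier1-DST"}),
    96000000: ("NOBIAS",  {"FULL.DST": "Tier1-DST"}),
    90600000: ("LOWMULT", {"FULL.DST": "Tier1-DST"}),
    94000000: ("TURBO",   {"*.MDST (merged)": "Tier1-DST"}),
    95100000: ("TURCAL",  {"FULLTURBO.DST": "Tier1-Buffer"}),
}

def destination(event_type, file_type):
    """Return the upload destination for a given stream and file type."""
    _, outputs = STREAMS[event_type]
    return outputs[file_type]
```

For example, `destination(95100000, "FULLTURBO.DST")` gives "Tier1-Buffer", matching the TURCAL entry above.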

Additional info

  • In the "everytingElse" template, when specifying the production types, do not leave a space between them (correct: "Turbo,Merge"); otherwise the production type is not found
  • The histograms are always uploaded to "CERN-HIST-EOS" storage element without the need to specify anything special.

Some notes about the screenshots

  • The screenshots of the production generation templates below show the validation production generations for MagDown data from one of the last tests. Everything marked with a "red triangle" is an option that was changed from the default. These changes can be taken over for the early measurements productions, except:
    • the run ranges, if specified
    • the line "GENERAL: Set true for validation prod", which needs to be left at the default "False" value so that output goes into the "LHCb" configuration in the Bookkeeping instead of "/validation"

For each of the workflows there is also a diagram with explanations of the requests, transformations, steps and output file formats attached.

When you create new productions for data processing in "real production", please also mark them as "hot": as lhcb_prmgr, right-click on the production, then "Hot" -> "Add flag".


90000000 - Full


  • This is the "normal" processing also done during Run 2, i.e. Reconstruction & Stripping/Merging
  • If the Reconstruction outputs into a new processing pass, wait for at least the first Reco job to finish so that the complete BK path is generated. This path is then needed for the input data selection in the Stripping request
  • The template for both production generations is the "" one
  • For Stripping we run with 1 input file per job; see also the screenshot, where group size = 1

Workflow Diagram


Production Generation Screenshots





94000000 - Turbo

Workflow Diagram

Note: the left "stream" is the "Turbo Processing", the right one is "Turbo Validation"


"Turbo Processing"


  • In the generation of the Turbo production make sure there is NO space in "Turbo,Merge" in the generation template; otherwise the production type is not found by the DIRAC agents
  • Production generation template is ""

Production Generation Screenshot


95100000 - Turcal


  • Production generation template is ""
  • Output destination of FULLTURBO.DST files should be set to Tier1-Buffer and not Tier1-DST

Workflow Diagram


Production Generation Screenshot


96000000 - Nobias and 90600000 - LowMult


  • Production generation template is ""

Workflow Diagram



Production Generation Screenshot


Run II "post TS2" workflows

Checklist (Wishlist)

1. Get the current magnet polarity from "LHCb page 1".

2. Get the right runs from the Online team. Good runs should be marked as "OFFLINE IN BKK" in the Run DB.

  • It would be nice to extract runs from the Run DB following the LHC schedule, e.g. post TS2 or post MD1.
3. Collect the steps for the several workflows. Currently we create transformations for:
  • Reconstruction of FULL stream (data type 90000000)
    • This transformation requires two steps, one for Reco and one for DQ
  • Reconstruction of NoBias and LowMult streams (data types 96000000, 90600000)
    • This transformation requires one step for Reco
  • Turbo Processing (data type 94000000)
    • This transformation requires two steps, one for Turbo and one for Merging
  • Turbo Calibration (data type 95100000)
    • This transformation requires two steps, one for Reco and one for Turbo Calibration
  • Stripping on FULL stream
    • This transformation requires two steps, one for Stripping data and one for Merging
  • Steps are available from the Step Manager. It would be nice if, once an incoming data processing campaign is declared, the people in charge of preparing these steps were promptly informed, so that the steps are ready within a reasonable time interval (e.g. 7 days or so). This way people could check that all the relevant applications involved in the productions (Brunel, DaVinci, Tesla, ...) are available and in the right version.
4. Duplicate a given transformation/request: find the latest transformation for the desired data type (e.g. 90000000), duplicate it, and clean the processing steps, which are likely to be outdated. Insert the right steps for the desired production.
  • The flag of DataQuality must be set to OK+UNCHECKED
  • Note that this holds only for first-pass processing, that is, for the very first full processing chain. In case of reprocessing the DQ flag must be set to OK. The reason is that the DQ team is not always "fast enough" to flag bad data, so, as a first approximation, we process all data. If they later flag something BAD, the second pass can then avoid picking up those bad runs.
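The DQ-flag rule above can be stated compactly. `dq_flags` is a hypothetical helper, not part of LHCbDIRAC; it only documents which flags to select in the input query:

```python
def dq_flags(first_pass):
    """Which DataQuality flags to select when building the input data query.

    First-pass processing takes OK and UNCHECKED runs, because DQ flagging
    may lag behind data taking; a reprocessing takes only OK runs, so runs
    flagged BAD in the meantime are skipped. Hypothetical helper for
    illustration only.
    """
    return ["OK", "UNCHECKED"] if first_pass else ["OK"]
```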

5. Sign the request.

  • It would be nice if the prmgr role could skip the signing procedure and go directly to the Generation part, especially since the new portal clears the selections when the role is changed and forces the user to select the right transformation request again.
6. The Generation steps are almost the same as for the EM workflows, except for the Stripping one, where you should put 1 in the field "PROD-2: Ancestor depth" to allow the production to navigate back to the RAW data.
  • For pXX (where XX is Ar, He, Ne) reconstructed data should go directly into Tier1-DST in case a further stripping processing is needed; otherwise they go to the usual Tier1-BUFFER destination
  • For the next round of productions we'll need to add "MDST.DST":"Tier1-RDST" as an optional SE, while the other streams will go to Tier1-DST.
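The optional SE entry mentioned above could be captured as a small mapping. This is a sketch; the dictionary shape is illustrative, and the exact attribute it belongs to should be checked in the generated production script:

```python
# Sketch of the per-file-type output SE mapping described above.
# Only the "MDST.DST" entry is special; every other stream falls back
# to the default destination. Names here are illustrative.
OPTIONAL_OUTPUT_SES = {
    "MDST.DST": "Tier1-RDST",  # optional SE for the next round of productions
}
DEFAULT_OUTPUT_SE = "Tier1-DST"  # all other streams

def output_se(file_type):
    """Return the output SE for a file type, falling back to the default."""
    return OPTIONAL_OUTPUT_SES.get(file_type, DEFAULT_OUTPUT_SE)
```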

How to recover lost files

To reproduce some lost files, perform the following steps:

1. Duplicate the request from the original production (48237 for the lost RDSTs) keeping the processing steps

2. Submit and sign as usual (tech, ppg)

3. Edit the production, filling in the usual parameters and avoiding the run range in case of reco prods

4. DO NOT push the "generate" button; instead save the Python script (from ScriptPreview) somewhere on your machine, giving it a name of your choice

5. Open the saved script and comment out the line

# pr.bkQueries = ['Full']
6. and adding
  pr.bkQueries = ['']

WARNING: the lines need to be modified in the right place, e.g. under w1 for DataReconstruction or under w2 for DataStripping
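Steps 5 and 6 amount to a one-line text edit on the saved script. The following is a sketch with a hypothetical helper (`patch_bk_queries` is not an LHCbDIRAC function); note that it patches every `pr.bkQueries` line it finds, whereas, per the warning above, in a real script you must edit only the line under the right workflow (w1 or w2):

```python
def patch_bk_queries(script_text, new_query):
    """Comment out each pr.bkQueries line and insert a replacement after it.

    Hypothetical helper for illustration: in practice, edit only the line
    under the correct workflow (w1 for DataReconstruction, w2 for
    DataStripping). The BK query string must be supplied by you.
    """
    out = []
    for line in script_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("pr.bkQueries"):
            indent = line[: len(line) - len(line.lstrip())]
            out.append(indent + "# " + stripped)  # comment out the original
            out.append(indent + "pr.bkQueries = [%r]" % new_query)
        else:
            out.append(line)
    return "\n".join(out)
```

For example, applied to a script containing `pr.bkQueries = ['Full']`, it emits the commented-out original followed by the new query line.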

7. Open a shell on lxplus and issue the usual command to work with LHCbDirac in a Grid environment:

lb-run LHCbDirac/latest lhcb-proxy-init

8. Launch the script via:

lb-run LHCbDirac/latest python <saved script>

This will output a production number.

9. Add input files with the DIRAC command:

lb-run LHCbDirac/latest dirac-production-add-files <prod number> <-l LFN> or <--File file-with-lfns.txt>

MarcoCorvo - 2016-02-29

Topic attachments

  • 20150625-Full-Reco15b-MagDown-Validation.png (582.9 K, 2015-06-30, StefanRoiser)
  • 20150625-Full-Stripping22onReco15b-MagDown-Valdiation.png (605.6 K, 2015-06-30, StefanRoiser)
  • 20150625-NoBias-Reco15b-Validation.png (521.2 K, 2015-06-30, StefanRoiser)
  • 20150625-Turbo-Turbo01b-MagDown-Validation.png (613.8 K, 2015-06-30, StefanRoiser)
  • 20150625-TurboCalibration-Reco15b_Turbo01b-MagDown-Validation.png (578.4 K, 2015-06-30, StefanRoiser)
  • 20150625-TurboVal-Reco15b_Turbo01b-MagDown-Validation.png (605.3 K, 2015-06-30, StefanRoiser)
  • 20150625-TurboVal-Stripping22onReco15b_Turbo01b-MagDown-Valdiation.png (613.4 K, 2015-06-30, StefanRoiser)
  • Run2_-_Full_Stream_Processing.jpg (60.3 K, 2015-06-30, StefanRoiser)
  • Run2_-_Nobias_Stream_Processing.jpg (30.2 K, 2015-06-30, StefanRoiser)
  • Run2_-_Turbo_Stream_Processing.jpg (78.7 K, 2015-06-30, StefanRoiser)
  • Run2_-_Turcal_Stream_Processing.jpg (38.3 K, 2015-06-30, StefanRoiser)
  • dataprocessing-fullstream2017.png (86.9 K, 2017-07-31, StefanRoiser)
  • dataprocessing-lowmultstream2017.png (31.9 K, 2017-07-31, StefanRoiser)
  • dataprocessing-minibiasstream2017.png (30.4 K, 2017-07-31, StefanRoiser)
  • dataprocessing-nobiasstream2017.png (77.9 K, 2017-07-31, StefanRoiser)
  • dataprocessing-turbostream2017.png (59.3 K, 2017-07-31, StefanRoiser)
Topic revision: r15 - 2018-09-23 - MarcoCattaneo