Data sets for trigger studies

Run II

Standard L0 filtered sample for 2016 HLT commissioning

This is the standard dataset that should be used for development of HLT lines by physics analysts. It was produced from the NoBias (NB) data in the bw_division location on the farm node local disk, corresponding to fill 4440 (25 ns, 2015). After filtering with L0 TCK 0x0050, around 40k events remain in the sample. Either pick up the files directly:
from GaudiConf import IOHelper
IOHelper("MDF").inputFiles(['mdf:root://' %i for i in range(0,9)])
or use the TestFileDB:
from Configurables import Moore
from PRConfig import TestFileDB
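The two imports above are used together via a TestFileDB entry. A minimal sketch follows; note that the key name below is a guess (this page does not state the entry name for this sample), so check PRConfig's TestFileDB for the exact key before using it:

```python
from Configurables import Moore
from PRConfig import TestFileDB

# NOTE: the key below is an assumption -- look up the exact entry in PRConfig.
TestFileDB.test_file_db["2015NB_25ns_L0Filt0x0050"].run(configurable=Moore())
```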

The database tags of these data are:

from Configurables import Moore
Moore().DataType = "2015"
Moore().DDDBtag = "dddb-20150724"
Moore().CondDBtag = "cond-20150828"

Hlt1 Accepted sample for 2016 HLT commissioning

50k events accepted by Hlt1 TCK 0x11291605, for use in 2016 trigger commissioning. The files are located here: /eos/lhcb/wg/HLT/2016CommissioningDatasets/2016_Hlt1_0x11291605

The TestFileDB line is:

TestFileDB.test_file_db["2016_Hlt1_0x11291605"].run(configurable=Moore())
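Putting the pieces from this section together, a minimal options file for this sample might look as follows. All names are taken from the snippets on this page; whether the tags must be set before the run call depends on Moore's configuration, so treat this as an illustrative sketch rather than a canonical options file:

```python
from Configurables import Moore
from PRConfig import TestFileDB

# Database tags for this sample, as listed on this page.
Moore().DataType = "2016"
Moore().DDDBtag = "dddb-20150724"
Moore().CondDBtag = "cond-20160522"

# Pick up the Hlt1-accepted sample from the TestFileDB.
TestFileDB.test_file_db["2016_Hlt1_0x11291605"].run(configurable=Moore())
```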

The database tags of these data are:

from Configurables import Moore
Moore().DataType = "2016"
Moore().DDDBtag = "dddb-20150724"
Moore().CondDBtag = "cond-20160522"

L0 filtered with 0x0050

This is the recommended sample for HLT commissioning studies for the August 2015 25 ns ramp. Thanks to Olli Lupton for producing these filtered samples. A subset of these files is listed below.

from Gaudi.Configuration import *
from GaudiConf import IOHelper

files = [

  • A longer list of EOS paths is here LHCbTriggerDatasets0x0050Olli, where files from 25ns NoBias, 50ns NoBias and 50ns leading-crossing NoBias are listed.
  • Note: This is the third "2015 NoBias processed with 0x0050" dataset that has been advertised. If you are using the older files please update.
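The snippet above can be assembled into a complete options fragment. A minimal sketch follows; the EOS paths themselves are listed on LHCbTriggerDatasets0x0050Olli, and the "MDF" persistency mirrors the earlier example on this page rather than anything stated for this sample:

```python
from Gaudi.Configuration import *
from GaudiConf import IOHelper

# Fill this list with EOS paths from LHCbTriggerDatasets0x0050Olli.
files = [
    # 'mdf:root://...',  # placeholder; see the full list on the linked page
]
IOHelper("MDF").inputFiles(files)
```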

L0 filtered with 0x0051

A smaller fraction of the leading-crossing NoBias has also been processed with a draft of the looser L0 TCK 0x0051 that will be used early in the 25ns ramp. The EOS paths for this sample are listed at LHCbTriggerDatasets0x0051Olli.

-- EricvanHerwijnen - 07-Dec-2010

Instructions for copying files from online system to eos

Files can be copied using the script in HltPiquetScripts/scripts/. Example instructions, starting from an lxplus node, are:

From an lxplus node:
ssh hltperf-action-x5650
source /cvmfs/
kinit <username>@CERN.CH
getpack HEAD HltPiquetScripts
lb-run Moore v25r2 python 'Hlt/HltPiquetScripts/scripts/' /eos/lhcb/wg/HLT/2016CommissioningDatasets/2016_Hlt1_0x11291605/ /localdisk/hlt1/sstahl/hlt1_0x1605/

The script takes two arguments: (1) an output directory and (2) one or more paths to the input files.

If there is a single path, it can be:
- A directory: everything in the directory will be copied.
- A plain text file containing one filename per line: those files will be copied.
- A python file: it should contain a dictionary named lfns with structure {subdir : [list of files]}; the files will be copied to the respective subdirectories.

If there are multiple paths, they are the individual files to be copied. If the output directory starts with /eos the files are copied to EOS, otherwise to the local path; the same /eos convention applies to each input file.

Topic revision: r77 - 2016-06-11 - MarkRichardJamesWilliams