L1 Emulator Configuration


Goals of this page

This page describes how the L1 Emulator will be configured online and offline, for hardware validation and MC production.

Overview

The emulator modules are configured via EventSetup. See SWGuideEventSetupHowTos and SWGuideL1EmulatorConfigurationHowTos for more information. Configuration data is put into EventSetup by either dummy ESProducers or DB interface modules.

Use cases

The bit-wise emulation of the trigger hardware runs in CMSSW in the online event filter farm, where it is used (a) to seed the HLT and (b) to monitor the hardware by comparing the detector output with the emulated output bit by bit. For both of these use cases, the emulator and the hardware must be configured identically, and this synchronization must occur run by run.

For (a), only the Global Trigger (GT) needs to be emulated, but this must be done for every event. The GT emulator computes information needed by the HLT that is not in the hardware data stream, so it must always be configured with the same trigger table that is currently loaded in the hardware. In principle, the trigger table can change run by run, and, at least at startup, it is foreseen to change rather frequently. The HLT also needs the energy scales that convert the ET of the trigger objects from hardware bits to physical values; these energy scales are likewise stored in OMDS as configuration data. For proper operation of the HLT, the above pieces of configuration data need to be synchronized from OMDS to ORCON on every run.
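To illustrate the role of the energy scales, here is a minimal Python sketch of a rank-to-ET conversion. The table of thresholds is purely illustrative (not a real CMS scale); the point is that the hardware reports each trigger object's ET as a small integer "rank", and the scale stored as configuration data is needed to recover the physical value in GeV.

```python
# Illustrative energy scale: bin thresholds in GeV, indexed by hardware rank.
# The values below are placeholders, not an actual L1 scale.
RANK_TO_GEV = [0.0, 5.0, 10.0, 15.0, 20.0, 30.0, 50.0, 100.0]

def rank_to_et(rank: int) -> float:
    """Convert a hardware rank to a physical ET (GeV).
    Ranks beyond the end of the table saturate at the last bin."""
    if rank < 0:
        raise ValueError("rank must be non-negative")
    return RANK_TO_GEV[min(rank, len(RANK_TO_GEV) - 1)]
```

If the HLT and the hardware were configured with different scale tables, the same rank would be interpreted as two different physical thresholds, which is why the scales must be synchronized on every run.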

For (b), we run the full L1 emulator for all trigger subsystems (at least at startup; as we gain confidence in the hardware, we may emulate only the potentially problematic subsystems). But unlike HLT seeding, this is only done for a subset of events. Of course, in order for the hardware/emulator comparison to be meaningful, both must have the same lookup tables (LUTs), cut parameters, muon track templates, etc. However, the failure to synchronize the full configuration does not directly impact trigger performance, although it would result in the loss of an important diagnostic tool.
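The bit-by-bit comparison in (b) can be sketched as follows. This is a hedged, generic example of comparing a hardware data word against the emulated word; the function name and word width are illustrative, not the actual DQM comparison code.

```python
def mismatched_bits(hw_word: int, emu_word: int, nbits: int = 32) -> list:
    """Return the bit positions at which the hardware and emulator words
    disagree. An empty list means perfect bit-wise agreement."""
    diff = (hw_word ^ emu_word) & ((1 << nbits) - 1)
    return [i for i in range(nbits) if (diff >> i) & 1]
```

For example, `mismatched_bits(0b1010, 0b1000, 4)` flags bit 1. Any non-empty result is only meaningful if both sides were configured with identical LUTs and parameters; otherwise every mismatch is ambiguous between a hardware fault and a configuration skew.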

Avoiding unnecessary O2O

Because of the potentially high frequency and large data size involved in our O2O, we would like to perform O2O only when necessary; O2O should not be allowed to delay the start of a run. Typically, a given trigger configuration will be used for more than one run, but these runs may not be consecutive. For instance, at startup, we may toggle back and forth between two different configurations to study rates. Each configuration is identified by a Trigger Supervisor Configuration (TSC) key, which is a string stored in OMDS. For example:

| Run Number | TSC Key |
| 1 | A |
| 2 | B |
| 3 | A |
| 4 | B |

Instead of copying the entire trigger configuration from OMDS to ORCON before each run, we wish to take advantage of the fact that Runs 1 and 3 share the same configuration, as do Runs 2 and 4.

In the O2O application we are developing, before every run, we check if the configuration objects to be used have previously been copied to ORCON. If they have, then we do not repopulate ORCON; we simply make new IOVs for the already existing payload tokens. Not only do we perform this check for the entire L1 configuration as a whole, but we also check each configuration object individually. With this finer granularity, we avoid copying objects that are already in ORCON.
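The per-object check described above can be sketched as follows. This is an illustrative Python model, assuming ORCON bookkeeping is represented as a map from `record@type` to a map of object keys to payload tokens; the function names and the example record string are hypothetical.

```python
def o2o_transfer(orcon, record_type, object_key, fetch_from_omds):
    """Copy one configuration object from OMDS to ORCON only if it is new.

    orcon          -- dict: record@type -> {object key -> payload token}
    fetch_from_omds -- callable performing the (expensive) actual transfer
    Returns (payload token, whether a new copy was made)."""
    objects = orcon.setdefault(record_type, {})
    if object_key in objects:
        return objects[object_key], False   # reuse the existing payload token
    token = fetch_from_omds(record_type, object_key)
    objects[object_key] = token
    return token, True
```

Calling this twice with the same object key performs the transfer only once; the second call simply returns the token already in ORCON, which is then given a new IOV.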

To further reduce the data loss caused by O2O delaying the start of a run, we plan to split our O2O into two steps:

  1. The first part of O2O transfers the L1 configuration for the next run from OMDS to ORCON. This job must always be run, but the actual transfer only takes place if the configuration data were not previously transferred; that is, there should be only one copy of each unique configuration in ORCON, and these copies can be reused in multiple runs. This job is run well before the shifter pushes the "Configure" button. It is not yet clear who will initiate this job.
  2. The second part of O2O is initiated by CMS.RunControl. This quick job involves minimal data transfer; it only sets the IOV of the configuration data in ORCON once the next run number has been chosen. This job must also always be run.
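The two steps above can be sketched schematically. In this hedged Python model, `key_list` stands in for the bookkeeping of TSC keys already in ORCON and `iov` for the IOV sequence; `copy_payload` represents the expensive bulk transfer of step 1, while step 2 only records a run-number-to-token mapping.

```python
def step1_transfer_key(key_list, tsc_key, copy_payload):
    """Step 1: ensure the configuration for tsc_key exists in ORCON,
    copying it from OMDS only if it was never transferred before."""
    if tsc_key not in key_list:
        key_list[tsc_key] = copy_payload(tsc_key)   # expensive, done once per key
    return key_list[tsc_key]

def step2_set_iov(iov, run_number, key_list, tsc_key):
    """Step 2 (triggered by Run Control): quick IOV setting only,
    once the next run number has been chosen; no bulk data transfer."""
    iov[run_number] = key_list[tsc_key]
```

Replaying the run sequence from the table above (runs 1-4 alternating between keys A and B) performs only two bulk copies but sets four IOVs, which is exactly the saving the two-step scheme is designed for.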

Subsystem channel masks and other "Run Settings" are expected to change relatively frequently. To avoid a proliferation of TSC keys, these Run Settings objects are decoupled from the TSC key. They can also be handled by the above scheme for avoiding unnecessary copies, although with a key hierarchy separate from the TSC key. A detailed description of the Online/O2O Run Settings mechanism can be found at CMS.L1TRunSettings.

DB Objects

For the status of the configuration objects for each subsystem, see SWGuideL1CondFormats.

Configuration data size

The L1 account in ORCON and ORCOFF will have configurations for these L1 subsystems:

  • Regional Calorimeter Trigger (RCT)
  • Global Calorimeter Trigger (GCT)
  • CSC Track Finder (CSCTF)
  • DT Track Finder (DTTF)
  • RPC (partial)
  • Global Muon Trigger (GMT)
  • Global Trigger (GT)

For CSC TPG, DTTPG, ECAL TPG, HCAL TPG, and RPC (partial), the trigger configurations live in the corresponding subdetector databases.

Summed over all subsystems, the configuration constants amount to a ~20 MB SQLite file.

Bookkeeping objects (CondFormats/L1TObjects)

In order to handle the bookkeeping for our scheme, we introduce an L1TriggerKey class that encapsulates the configuration keys for a given configuration. Its data members are

  • a string for the TSC key
  • subsystem keys: the top-level key for each subsystem listed above, one level down from the TSC key
  • a map<record@type, object key> for the corresponding object keys, where record@type is a string that concatenates the EventSetup record and the C++ class name of a configuration object.
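The structure of L1TriggerKey can be modeled as follows. The real class is C++ (in CondFormats/L1TObjects); this Python sketch uses hypothetical member and method names chosen to mirror the description above, not the actual API.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class L1TriggerKey:
    """Illustrative model of the C++ L1TriggerKey bookkeeping class."""
    tsc_key: str
    # subsystem name -> top-level subsystem key, e.g. "GT" -> "gt_key_1"
    subsystem_keys: Dict[str, str] = field(default_factory=dict)
    # "record@type" -> object key, where "record@type" concatenates the
    # EventSetup record name and the C++ class name of the object
    object_keys: Dict[str, str] = field(default_factory=dict)

    def add_object_key(self, record: str, cpp_type: str, key: str) -> None:
        self.object_keys[record + "@" + cpp_type] = key
```

The `record@type` string is what makes the map unambiguous: two different configuration object types stored under the same EventSetup record still get distinct entries.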

We also define a L1TriggerKeyList class that stores a continually updated map<TSC key, L1TriggerKey payload token>. This list keeps track of all the L1 configurations already in ORCON, as shown schematically in the following table:

| Run Number | TSC Key | L1TriggerKeyList |
| 1 | A | {A} |
| 2 | B | {A,B} |
| 3 | A | {A,B} |
| 4 | B | {A,B} |

L1TriggerKeyList also contains a map<record@type, map<object key, configuration object payload token> >, which performs a similar function for each configuration object. This map also contains entries for those objects (like subsystem channel masks) that are not tied to the TSC key.
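The two maps held by L1TriggerKeyList can be modeled as below. Again, the real class is C++; this sketch uses hypothetical method names, and shows how the second map also accommodates objects (such as channel masks) that are not tied to any TSC key.

```python
class L1TriggerKeyList:
    """Illustrative model of the C++ L1TriggerKeyList bookkeeping class."""

    def __init__(self):
        # TSC key -> payload token of the stored L1TriggerKey
        self.tsc_key_tokens = {}
        # record@type -> {object key -> configuration object payload token};
        # entries here may exist independently of any TSC key
        self.object_tokens = {}

    def token(self, record_type, object_key):
        """Return the payload token already in ORCON, or None if absent."""
        return self.object_tokens.get(record_type, {}).get(object_key)

    def register(self, record_type, object_key, token):
        self.object_tokens.setdefault(record_type, {})[object_key] = token
```

A `None` result from `token()` is the signal that an actual OMDS-to-ORCON copy is needed; any other result means the existing payload can simply be given a new IOV.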

The L1TriggerKeyList and L1TriggerKey objects can also be used offline to retrieve any arbitrary L1 configuration, not just the "currently valid" one.

EventSetup record dependencies (CondFormats/DataRecord)

L1TriggerKeyRcd depends on L1TriggerKeyListRcd because a new L1TriggerKey should not be created if the given TSC key is already in L1TriggerKeyList.

Configuration object records depend on L1TriggerKeyRcd and L1TriggerKeyListRcd because L1TriggerKey gives the object keys that specify the data version in OMDS, and one should not create a configuration object if the object key is already in L1TriggerKeyList.

Dummy Configuration

Instructions for producing L1 configurations from dummy producers: SWGuideL1FakeConditions.

Database Configuration

The L1 O2O software and workflows are described at SWGuideL1CondDBTools.

The special online and O2O workflows for Run Settings data are described at L1TRunSettings.

Coding instructions for CMSSW developers can be found at SWGuideL1ConfigOnlineProd.

Instructions for testing payload writing and IOV setting online: SWGuideL1O2OTestJob

Information on O2O operations at P5: L1O2OOperations.

Notes about the L1 content of Global Tags: L1GlobalTagNotes.

Instructions for creating L1 tags for MC: L1MCTags.

Review Status

| Editor/Reviewer and date | Comments |
| JimBrooke - 08 Dec 2006 | page content last edited |
| CMSUserSupport - 12 Mar 2007 | moved page into swguide |
| JennyWilliams - 13 Mar 2007 | editing to match swguide layout |

Responsible: JimBrooke
Last reviewed by: MostRecentReviewer and date

Topic revision: r20 - 2013-11-06 - WernerSun