
Trigger Operation Public Results

Introduction

Approved plots that can be shown by ATLAS speakers at conferences and similar events. Please do not add figures on your own. Contact the responsible project leader in case of questions and/or suggestions. Follow the guidelines on the trigger public results page.

2022 pp at 13.6 TeV

Cost monitoring

Mean High Level Trigger (HLT) Event Processing time as a function of the average pile-up in a 2022 proton-proton run. The mean HLT Processing time decreases with decreasing average pile-up due to a reduction in event complexity. Additional trigger selections are enabled when the instantaneous luminosity falls below approximately 1.5 x 10^34 cm^-2 s^-1, as marked with a vertical line. The slope of the distribution changes at this point. Error bars denote the Gaussian width of the underlying per-event measurements.
png pdf contact: AleksandraPoreba
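The sketch below illustrates how a profile of this kind could be built from per-event records: events are binned in average pile-up, and each bin is summarised by the mean processing time with the Gaussian width as the error bar. This is only an illustration with invented column names and toy inputs, not the ATLAS cost-monitoring code.

```python
# Illustrative sketch: mean HLT processing time vs pile-up with Gaussian-width
# error bars, built from per-event (pile-up, time) records. All inputs are toy data.
import numpy as np

def hlt_time_profile(pileup, proc_time_ms, bin_edges):
    """Return bin centres, mean HLT time and Gaussian width per pile-up bin."""
    centres, means, widths = [], [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        sel = (pileup >= lo) & (pileup < hi)
        if not np.any(sel):
            continue
        centres.append(0.5 * (lo + hi))
        means.append(proc_time_ms[sel].mean())
        widths.append(proc_time_ms[sel].std(ddof=1))  # error bar = Gaussian width
    return np.array(centres), np.array(means), np.array(widths)

# Toy inputs standing in for real monitoring data
rng = np.random.default_rng(0)
mu = rng.uniform(30, 55, 10000)          # average pile-up per event
t = rng.normal(8.0 * mu, 50.0)           # toy processing times [ms]
x, y, yerr = hlt_time_profile(mu, t, np.arange(30, 56, 2))
```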
An example of the High Level Trigger (HLT) processing time distribution per event in a 2022 proton-proton run at an instantaneous luminosity of 1.8 x 10^34 cm^-2 s^-1. The total event time includes both the time spent executing algorithms and the time spent on framework operations (including algorithm scheduling and data traffic). The latter accounts for a visible fraction of short events and is negligible for events with execution times longer than 100 ms. In the figure, three peaks can be identified, representing fast (approximately 30 ms), medium (approximately 300 ms), and slow (approximately 2 s) events. The last type of event is the rarest due to the early-rejection mechanism.
png pdf contact: AleksandraPoreba

Trigger rates and bandwidth

The output rate to the High Level Trigger (HLT) streams in a 2022 proton-proton run. The total HLT event rate is lower than the sum of the stream rates because the same events may be written to multiple streams. Periodic increases in the rate and bandwidth of support triggers are caused by prescale changes towards the end of the run, as the luminosity and the corresponding overall resource usage decline.
png pdf contact: AleksandraPoreba, TengJianKhoo
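A minimal sketch of why the total HLT rate sits below the sum of the stream rates: an event accepted by several streams contributes once to the total but once per stream to the sum. The stream names and event IDs here are invented for illustration only.

```python
# Toy example: events accepted by multiple streams are counted once in the total.
stream_events = {
    "Main": {101, 102, 103, 104},
    "BphysLS": {103, 105},
    "Express": {101},
}

sum_of_streams = sum(len(ids) for ids in stream_events.values())  # 7
unique_events = set().union(*stream_events.values())              # {101, ..., 105}
print(sum_of_streams, len(unique_events))  # 7 vs 5: total rate < sum of stream rates
```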
Level-1 (L1) physics trigger rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are rates of some representative single-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates, jet candidates, missing transverse momentum (MET), and tau candidates, with the indicated thresholds. The EM cluster trigger applies isolation requirements on the adjacent electromagnetic and hadronic energy. The muon trigger requires coincidence of hits in three muon stations in the endcaps, coincidence with detectors inside the toroid (not including the New Small Wheel), and masks hot regions of the detector. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
Level-1 (L1) physics trigger rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are rates of some representative multi-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates, jet candidates, and tau candidates, with the indicated thresholds. The EM cluster and tau triggers apply isolation requirements on the adjacent electromagnetic and hadronic energy. The muon triggers require coincidence of hits in three muon stations in the endcaps. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
High Level Trigger (HLT) rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are rates of some representative triggers, which have not been prescaled. The generic single electron and muon triggers provide a high rate of events to the main physics stream. The vector boson fusion (VBF) with dijet mass (mjj) selection and 4-jet triggers apply targeted selections to augment specific hadronic signatures in a dedicated physics stream with delayed reconstruction. The latter requires two jets identified as containing a B hadron using a combination of impact parameter and secondary vertex information (ATL-PHYS-PUB-2020-014). The additional labels indicate the isolation and identification working points. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
High Level Trigger (HLT) rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are rates of some representative triggers, which have not been prescaled. These trigger items are based on such objects as missing transverse momentum (MET) and jets. Particle Flow reconstruction is used to reconstruct all jets and MET. Residual pileup contamination in the MET is suppressed using a subtraction technique (JHEP 08 (2020) 80). The b-jet triggers require jets identified as containing a B hadron using a combination of impact parameter and secondary vertex information (ATL-PHYS-PUB-2020-014). The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
High Level Trigger (HLT) rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are rates of some representative triggers, which have not been prescaled. These trigger items are mainly designed to select long-lived particles (LLP) based on hadronically decaying taus with LLP-like characteristics identified with a recurrent neural network (RNN), displaced electrons and muons, or a combination of missing transverse momentum (MET) and disappearing tracks or tracks that deposit a large amount of energy in the pixel detector (dE/dx). A standard tau trigger is shown for comparison. The displaced electron and muon triggers reconstruct tracks with larger transverse impact parameters than in the standard track reconstruction. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
High Level Trigger (HLT) rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are rates of some representative triggers, which have not been prescaled. These trigger items include selections on single hadronically decaying taus and on pairs of taus, photons and muons with thresholds as noted in the legend. The dimuon trigger also requires a vertex to be reconstructed that is consistent with the J/psi mass. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
Cumulative trigger stream rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are the main physics stream rate, containing triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the VBF delayed stream, containing hadronic triggers with delayed reconstruction; the express stream, which records events at a low rate for data quality monitoring; other minor streams with physics applications, such as zero-bias and background events; the trigger-level analysis (TLA) stream, containing only HLT objects; and the detector calibration streams. The legend order reflects the plot order, from top to bottom. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded. The total rate of HLT accepted events is indicated by a dashed line, and is lower than the sum over all streams due to events that are accepted in multiple streams.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
Cumulative trigger stream rates as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are the main physics stream rate, containing triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the VBF delayed stream, containing hadronic triggers with delayed reconstruction; the express stream, which records events at a low rate for data quality monitoring; other minor streams with physics applications, such as zero-bias and background events; the trigger-level analysis (TLA) stream, containing only HLT objects; and the detector calibration streams. The legend order reflects the plot order, from top to bottom. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded. The total rate of HLT accepted events is indicated by a dashed line, and is lower than the sum over all streams due to events that are accepted in multiple streams.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
Cumulative trigger stream output bandwidth as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are the main physics stream rate, containing triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the VBF delayed stream, containing hadronic triggers with delayed reconstruction; the express stream, which records events at a low rate for data quality monitoring; other minor streams with physics applications, such as zero-bias and background events; the trigger-level analysis (TLA) stream, containing only HLT objects; and the detector calibration streams. The legend order reflects the plot order, from top to bottom. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo
Cumulative trigger stream output bandwidth as a function of time in a fill taken in August 2022 with a peak luminosity of L = 1.7 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 48. Presented are the main physics stream rate, containing triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the VBF delayed stream, containing hadronic triggers with delayed reconstruction; the express stream, which records events at a low rate for data quality monitoring; other minor streams with physics applications, such as zero-bias and background events; the trigger-level analysis (TLA) stream, containing only HLT objects; and the detector calibration streams. The legend order reflects the plot order, from top to bottom. The luminosity is levelled for a period at the beginning of the fill, after which point the rates decrease with the decaying luminosity. The discontinuous region with no data corresponds to an emittance scan where no physics data were recorded.
png pdf contact: FrancescoGiuli, SavannaShaw, TengJianKhoo

LS2

Cost monitoring

Link to ATL-COM-DAQ-2021-045

Example of the measured time of all algorithms executed on a thread as a fraction of the monitored event time window. In the right-hand plot the fractional time is shown as a function of the monitored event time window. Three peaks can be observed in the plots: one where the thread spends about 60% of the total time processing algorithms, which happens for short events (approximately 100 ms) in which the event data do not fulfill the requirements to trigger the execution of time-consuming algorithms. The other two peaks represent long events (approximately 300 ms and 3000 ms) in which algorithm processing takes the majority of the total time. The peaks correlate with the recorded times of algorithm execution per event. For some of the events included in the data sample the algorithm processing takes 0% of the time, which happens when the event does not fulfill the requirements to trigger any algorithm execution. The rest of the time is spent in so-called "framework time", when a thread performs framework-related operations, including input/output and control flow handling. The plots were created with a small 2018 data sample that was preloaded into the HLT farm and processed repeatedly.

png png pdf contact: AleksandraPoreba
Example of the measured framework time on one thread as a fraction of the monitored event time window. In the right-hand plot the fractional time is shown as a function of the monitored event time window. The framework time is defined as the time a thread spent outside of scheduled algorithms while waiting for an algorithm to be dispatched. It includes input/output and control flow operations. Three peaks can be observed: one where the thread spends about 40% of the time on framework operations while processing short events (approximately 100 ms). The other two correspond to long events (approximately 300 ms and 3000 ms) in which the framework time takes a minor fraction of the event time window. For some of the events included in the data sample the framework time takes 100%, which happens when the event does not fulfill the requirements to trigger any algorithm execution. The plots were created with a small 2018 data sample that was preloaded into the HLT farm and processed repeatedly.

png png pdf contact: AleksandraPoreba
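The bookkeeping behind these two figures can be summarised in a few lines: for one thread and one monitored event time window, the algorithm fraction is the summed algorithm time divided by the window, and the framework fraction is the remainder. The sketch below only illustrates this arithmetic; it is not the ATLAS cost-monitoring implementation, and the numbers are toy values chosen to mirror the three populations described above.

```python
# Sketch of the fractional-time definition described above (toy values only).
def time_fractions(window_ms, alg_times_ms):
    """Return (algorithm fraction, framework fraction) for one event window."""
    alg = sum(alg_times_ms)
    alg_frac = alg / window_ms if window_ms > 0 else 0.0
    return alg_frac, 1.0 - alg_frac

print(time_fractions(100.0, [40.0, 20.0]))     # short event: ~60% algorithms, ~40% framework
print(time_fractions(300.0, [250.0, 30.0]))    # medium event: framework time is a minor fraction
print(time_fractions(3000.0, [2900.0, 80.0]))  # long event: framework time is negligible
```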
Representation of the Algorithm Summary table available on the TriggerCostBrowser website, containing details of the algorithm executions. These include the number of events in which the algorithm was activated, the number of algorithm calls per event, the rate of algorithm calls, the rate of events in which the algorithm was executed, the algorithm call duration and the total time of algorithm execution. Created from a reprocessing of 2018 proton-proton collision data with the latest HLT software.
png pdf contact: AleksandraPoreba
Representation of the Chain Summary table available on the TriggerCostBrowser website, containing details of the chain executions. These include the groups the chain belongs to, the number of events in which the chain was activated, the chain execution rate, the number of algorithm calls made by the chain and the chain duration. Created from a reprocessing of 2018 proton-proton collision data with the latest HLT software.
png pdf contact: AleksandraPoreba
Representation of the Chain Item Summary on the TriggerCostBrowser website, which lists all algorithms related to a particular chain. In this example a jet reconstruction chain is presented. For each algorithm, its class and the number of calls it made ("AllChains calls") are displayed. Created from a reprocessing of 2018 proton-proton collision data with the latest HLT software.
png pdf contact: AleksandraPoreba

Link to ATL-COM-DAQ-2021-053

Example of the measured framework time, defined as the time a thread spends outside of scheduled algorithms while waiting for an algorithm to be dispatched. It includes input/output and control flow operations. For all events the framework time is stable, with a mean value of 20 ms. The plot was created with a small 2018 data sample that was preloaded into the HLT farm and processed repeatedly.
png pdf contact: AleksandraPoreba

2018 pp at 13 TeV

Trigger rates and bandwidth

The average recording rate of the main physics data stream and the B-physics and light states data stream for each ATLAS pp physics run taken in 2018. The total average of all runs is indicated as a red dash-dotted line, and the total average of the main physics stream is indicated as a blue dashed line.
png pdf contact: JoergStelzer, KateWhalen, DanieleZanzi
Physics trigger group rates at the High-Level Trigger (HLT) as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 56. Presented are the rates of the individual trigger groups specific to trigger physics objects. Each of the groups contains single-object and multi-object triggers. Overlap between groups is only accounted for in the total main physics stream rate. The B-physics and Light States (LS) triggers are mainly muon-based trigger algorithms; the majority of the triggers are written to the B-physics and LS stream and are thus not included here. The combined group represents multiple triggers of different objects, as combinations of electrons, muons, taus, jets and missing transverse energy. A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to a change of prescales to optimise the bandwidth usage or LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
pdf, png

contact: KateWhalen, DanieleZanzi, HeatherRussell
Trigger stream rates as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are the main physics stream rate, containing all triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the express stream, which records events at a low rate for data quality monitoring; other minor streams with physics applications, such as zero-bias and background events; the trigger-level analysis (TLA) stream; the detector monitoring streams; and the detector calibration streams. The increase of the TLA HLT output rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, level-1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. This increases the total HLT output rate, but does not significantly increase the total output bandwidth due to the small size of TLA events.
pdf,png
contact: KateWhalen, DanieleZanzi, HeatherRussell
Output bandwidth at the HLT as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are the main physics stream rate, containing all triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the express stream, which records events at a low rate for data quality monitoring; other minor streams with physics applications, such as zero-bias and background events; the trigger-level analysis (TLA) stream; and the detector calibration streams. The monitoring stream is not reflected in the output bandwidth as the monitoring data are not written out to disk. The increase of the trigger-level analysis (TLA) HLT output is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, level-1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. This increases the total HLT output rate, but does not significantly increase the total output bandwidth due to the small size of TLA events.
pdf, png
contact: KateWhalen, DanieleZanzi, HeatherRussell
Level-1 (L1) physics trigger rates as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are rates of some representative single-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing transverse energy (XE) and tau candidates (TAU). The number in the trigger name denotes the trigger threshold in GeV. The letters following the threshold values refer to details of the selection: variable thresholds (V), hadronic isolation (H), and electromagnetic isolation (I). A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates increase periodically due to LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
pdf, png
Level-1 (L1) physics trigger rates as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are rates of some representative single-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing transverse energy (XE) and tau candidates (TAU). The number in the trigger name denotes the trigger threshold in GeV. The letters following the threshold values refer to details of the selection: variable thresholds (V), hadronic isolation (H), and electromagnetic isolation (I). The total L1 rate is also shown. A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates increase periodically due to LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
pdf, png
contact: KateWhalen, DanieleZanzi, HeatherRussell
Level-1 (L1) physics trigger rates as a function of instantaneous luminosity in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are rates of some representative single-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing transverse energy (XE) and tau candidates (TAU). The number in the trigger name denotes the trigger threshold in GeV. The letters following the threshold values refer to details of the selection: variable thresholds (V), hadronic isolation (H), and electromagnetic isolation (I). Dips in the rates are due to dead-time and spikes are caused by detector noise.
pdf, png
contact: KateWhalen, DanieleZanzi, HeatherRussell
Level-1 (L1) physics trigger rates as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are rates of some representative multi-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing transverse energy (XE) and tau candidates (TAU). Numbers preceding the object name denote the multiplicity, while numbers following the object name denote the trigger threshold in GeV (note that MU11_2MU6 is a di-muon trigger). The letters following the threshold values refer to details of the selection: separation in eta and phi (DR), variable thresholds (V), hadronic isolation (H), and electromagnetic isolation (I). A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates increase periodically due to LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
pdf, png
contact: KateWhalen, DanieleZanzi, HeatherRussell
Level-1 (L1) physics trigger rates as a function of time in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are rates of some representative multi-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing transverse energy (XE) and tau candidates (TAU). Numbers preceding the object name denote the multiplicity, while numbers following the object name denote the trigger threshold in GeV (note that MU11_2MU6 is a di-muon trigger). The letters following the threshold values refer to details of the selection: separation in eta and phi (DR), variable thresholds (V), hadronic isolation (H), and electromagnetic isolation (I). The total L1 rate is also shown. A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates increase periodically due to LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
pdf, png
contact: KateWhalen, DanieleZanzi, HeatherRussell
Level-1 (L1) physics trigger rates as a function of instantaneous luminosity in a fill taken in September 2018 with a peak luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ>=56. Presented are rates of some representative multi-object trigger items, which have not been prescaled. These trigger items are based on such objects as electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing transverse energy (XE) and tau candidates (TAU). Numbers preceding the object name denote the multiplicity, while numbers following the object name denote the trigger threshold in GeV (note that MU11_2MU6 is a di-muon trigger). The letters following the threshold values refer to details of the selection: separation in eta and phi (DR), variable thresholds (V), hadronic isolation (H), and electromagnetic isolation (I). A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates increase periodically due to LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
pdf, png
contact: KateWhalen, DanieleZanzi, HeatherRussell

Trigger rates and bandwidth for trigger-level analysis

Total level-1 (L1) trigger output rate as a function of instantaneous luminosity in a typical fill taken in September 2018. Also shown are the rates of the individual trigger items seeding the Trigger-Level Analysis (TLA) stream, without accounting for overlaps between the individual items. The letter “J” followed by a number in the trigger name denotes a jet satisfying a given energy threshold in GeV. The suffix “DETA20-J50J” denotes a leading jet with ET > 50 GeV, separated from the sub-leading jet by Δη < 2.0. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The increase of the L1 rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers and so the prescales of these triggers are reduced or removed at certain luminosity points, as shown by the increases in rate at L = 1.2, 1.0, and 0.7 x 10^34 cm^-2 s^-1. This increases the total HLT output rate, but does not significantly affect the total output bandwidth due to the small size of TLA events. Other periodic rate increases are due to a change of prescales on other triggers (not shown) to optimise the bandwidth usage or LHC luminosity re-optimisations; dips are due to dead-time and spikes are caused by detector noise.
eps, png, pdf
contact: AntonioBoveia, CaterinaDoglioni, CharlesWilliamKalderon, EmmaTolley, KateWhalen
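A hedged illustration of the end-of-fill prescale strategy described above: the recorded rate of a TLA trigger is roughly its input rate divided by the prescale, and the prescale is relaxed at predefined luminosity points. The thresholds and prescale values below are placeholders, not the actual 2018 menu configuration.

```python
# Toy end-of-fill prescale schedule (placeholder values, not the ATLAS menu).
def tla_prescale(lumi_e34):
    """Return the prescale applied to a TLA trigger at a given luminosity (in 10^34 cm^-2 s^-1)."""
    if lumi_e34 > 1.2:
        return None   # trigger disabled at high luminosity
    elif lumi_e34 > 1.0:
        return 10.0   # enabled with a prescale
    else:
        return 1.0    # prescale removed at the end of the fill

def recorded_rate(input_rate_hz, lumi_e34):
    ps = tla_prescale(lumi_e34)
    return 0.0 if ps is None else input_rate_hz / ps

for lumi in (1.5, 1.1, 0.8):
    print(lumi, recorded_rate(20000.0, lumi))  # recorded rate steps up as prescales relax
```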
Total level-1 (L1) trigger output rate as a function of time, with the LHC instantaneous luminosity overlaid, in a fill taken in September 2018 with a peak instantaneous luminosity of L = 2.0 x 10^34 cm^-2 s^-1 and a peak average number of interactions per crossing of <μ> = 56. Also shown are the rates of the individual trigger items seeding the Trigger-Level Analysis (TLA) stream, without accounting for overlaps between the individual items. The letter “J” followed by a number in the trigger name denotes a jet satisfying a given energy threshold in GeV. The suffix “DETA20-J50J” denotes a leading jet with ET > 50 GeV, separated from the sub-leading jet by Δη < 2.0. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The increase of the L1 rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers and so the prescales of these triggers are reduced or removed at certain luminosity points, as shown by the increases in rate at L = 1.2, 1.0, and 0.7 x 10^34 cm^-2 s^-1. This increases the total HLT output rate, but does not significantly affect the total output bandwidth due to the small size of TLA events.
png, pdf
contact: AntonioBoveia, CaterinaDoglioni, CharlesWilliamKalderon, EmmaTolley, KateWhalen
Output bandwidth of the High Level Trigger (HLT) as a function of time in an LHC fill taken in May 2018 with a peak luminosity of L = 1.76 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 54. Presented are the total HLT output bandwidth for all trigger streams (solid line) and only that for the Trigger Level Analysis (TLA) stream (dashed line). The total HLT output bandwidth includes data from all ATLAS trigger streams, including the TLA stream and the main physics stream which selects events that contain all physics objects and full detector information. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the TLA output bandwidth is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This first occurs when the instantaneous luminosity has dropped to L = 1.2 x 10^34 cm^-2 s^-1 (at 21:50), when these triggers are activated with a prescale of 12, and then a larger increase occurs at L = 1.0 x 10^34 cm^-2 s^-1 (at 23:50) when the prescale is removed.
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Output bandwidth of the High Level Trigger (HLT) as a function of time in an LHC fill taken in May 2018 with a peak luminosity of L = 1.76 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 54. Presented are the total HLT output bandwidth for all trigger streams (solid line) and only that for the Trigger Level Analysis (TLA) stream (dashed line). The total HLT output bandwidth includes data from all ATLAS trigger streams, including the TLA stream and the main physics stream which selects events that contain all physics objects and full detector information. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the TLA output bandwidth is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This first occurs when the instantaneous luminosity has dropped to L = 1.2 x 10^34 cm^-2 s^-1 (at 21:50), when these triggers are activated with a prescale of 12, and then a larger increase occurs at L = 1.0 x 10^34 cm^-2 s^-1 (at 23:50) when the prescale is removed.
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Output rate of the High Level Trigger (HLT) as a function of time in an LHC fill taken in May 2018 with a peak luminosity of L = 1.76 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 54. Presented are the total HLT output rate for all trigger streams (solid line) and only that for the Trigger Level Analysis (TLA) stream (dashed line). The total HLT output rate includes data from all ATLAS trigger streams, including the TLA stream and the main physics stream which selects events that contain all physics objects and full detector information. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the TLA HLT output rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This first occurs when the instantaneous luminosity has dropped to L = 1.2 x 10^34 cm^-2 s^-1 (at 21:50), when these triggers are activated with a prescale of 12, and then a larger increase occurs at L = 1.0 x 10^34 cm^-2 s^-1 (at 23:50) when the prescale is removed.
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Total L1 output rate as a function of time in an LHC fill taken in May 2018 with a peak luminosity of L = 1.76 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 54, with the LHC instantaneous luminosity overlaid. The L1 output rate includes events selected for data-taking of full events in the regular data stream and events selected for the Trigger Level Analysis (TLA) stream. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the L1 rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This first occurs when the instantaneous luminosity has dropped to L = 1.2 x 10^34 cm^-2 s^-1 (at 21:50), when these triggers are activated with a prescale of 12, and then a larger increase occurs at L = 1.0 x 10^34 cm^-2 s^-1 (at 23:50) when the prescale is removed.
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Total HLT output rate and bandwidth overlaid, as a function of time in an LHC fill taken in May 2018 with a peak luminosity of L = 1.76 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 54. The HLT rate and bandwidth include events selected for data-taking of full events in the regular data stream and events selected for the Trigger Level Analysis (TLA) stream. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay of the output bandwidth is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the HLT output rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This first occurs when the instantaneous luminosity has dropped to L = 1.2 x 10^34 cm^-2 s^-1 (at 21:50), when these triggers are activated with a prescale of 12, and then a larger increase occurs at L = 1.0 x 10^34 cm^-2 s^-1 (at 23:50) when the prescale is removed.
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen

L1Topo Operation VBF 2018

Link to CDS: ATL-COM-DAQ-2018-173

The trigger rate is shown for the Level-1 trigger L1_MJJ-500-NFF as a function of the instantaneous luminosity for LHC fill 7139, taken in September 2018. The trigger is implemented in the Level-1 topological trigger where two lists sorted by the jet pT are formed, and each list can contain up to six entries. The first list includes all jets that have pT > 30 GeV and |η| < 3.1 while the second list includes all jets with pT > 20 GeV. The dijet mass MJJ is computed using one jet from each list and events with MJJ > 500 GeV pass the trigger requirement.
eps png pdf contact: BenCarlson, ChristopherHayes, AndrewAukerman
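The selection described above can be summarised with a short sketch: build the two pT-sorted lists of up to six jets, form the dijet mass from one jet of each list (using a massless-jet approximation), and accept the event if any pairing exceeds 500 GeV. This is only an illustration of the selection logic, not the L1Topo firmware.

```python
# Simplified sketch of the L1_MJJ-500-NFF selection (massless-jet approximation).
import math

def mjj(j1, j2):
    """Invariant mass of two massless jets given (pt [GeV], eta, phi)."""
    (pt1, eta1, phi1), (pt2, eta2, phi2) = j1, j2
    return math.sqrt(2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

def pass_l1_mjj_500_nff(jets):
    """jets: iterable of (pt [GeV], eta, phi) L1 jet candidates."""
    by_pt = sorted(jets, key=lambda j: j[0], reverse=True)
    list1 = [j for j in by_pt if j[0] > 30.0 and abs(j[1]) < 3.1][:6]  # first list
    list2 = [j for j in by_pt if j[0] > 20.0][:6]                      # second list
    return any(mjj(a, b) > 500.0 for a in list1 for b in list2 if a is not b)

# Example: a well-separated forward-backward jet pair passes (MJJ ~ 575 GeV)
print(pass_l1_mjj_500_nff([(80.0, 2.5, 0.1), (45.0, -2.0, 3.0)]))  # True
```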
The trigger rate is shown for the Level-1 trigger L1_MJJ-500-NFF as a function of the mean number of simultaneous interactions per proton–proton bunch crossing averaged over all bunches circulating in the LHC for LHC fill 7139, taken in September 2018. The trigger is implemented in the Level-1 topological trigger where two lists sorted by the jet pT are formed, and each list can contain up to six entries. The first list includes all jets that have pT > 30 GeV and |η| < 3.1 while the second list includes all jets with pT > 20 GeV. The dijet mass MJJ is computed using one jet from each list and events with MJJ > 500 GeV pass the trigger requirement.
eps png pdf contact: BenCarlson, ChristopherHayes, AndrewAukerman
The trigger rate is shown for the high-level trigger HLT_j70_j50_0eta490_invm1100j70_dphi20_deta40_L1MJJ-500-NFF as a function of the instantaneous luminosity for LHC fill 7139, taken in September 2018. The high-level trigger requires at least two R=0.4 anti-kt jets, calibrated using only calorimeter information. One jet must have pT > 70 GeV and |η| < 3.2, while a second jet is required to have pT > 50 GeV. The dijet mass, Δφjj and Δηjj are formed from jet pairs and required to have Mjj > 1100 GeV, Δφjj < 2.0 and Δηjj >4.0.
eps png pdf contact: BenCarlson, ChristopherHayes, AndrewAukerman
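For illustration, the HLT selection quoted in the caption above can be expressed as a pairwise requirement on jets: the leading jet of a pair must pass the 70 GeV threshold within |η| < 3.2, the other jet the 50 GeV threshold, and the pair must satisfy the Mjj, Δφjj and Δηjj cuts. This sketch only mirrors those cuts; it is not the ATLAS HLT implementation.

```python
# Simplified sketch of the HLT VBF dijet selection described above.
import math
from itertools import combinations

def dphi(phi1, phi2):
    d = abs(phi1 - phi2) % (2.0 * math.pi)
    return d if d <= math.pi else 2.0 * math.pi - d

def mjj(j1, j2):
    (pt1, eta1, phi1), (pt2, eta2, phi2) = j1, j2
    return math.sqrt(2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

def pass_hlt_vbf(jets):
    """jets: list of (pt [GeV], eta, phi) HLT calorimeter jets."""
    for a, b in combinations(jets, 2):
        lead, sub = (a, b) if a[0] >= b[0] else (b, a)
        if lead[0] <= 70.0 or abs(lead[1]) >= 3.2 or sub[0] <= 50.0:
            continue
        if (mjj(a, b) > 1100.0
                and dphi(a[2], b[2]) < 2.0
                and abs(a[1] - b[1]) > 4.0):
            return True
    return False

print(pass_hlt_vbf([(160.0, 2.4, 0.2), (75.0, -2.4, 1.6)]))  # True (Mjj ~ 1.2 TeV)
```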
The trigger rate is shown for the high-level trigger HLT_j70_j50_0eta490_invm1100j70_dphi20_deta40_L1MJJ-500-NFF as a function of the mean number of simultaneous interactions per proton–proton bunch crossing averaged over all bunches circulating in the LHC for LHC fill 7139, taken in September 2018. The high-level trigger requires at least two R=0.4 anti-kt jets, calibrated using only calorimeter information. One jet must have pT > 70 GeV and |η| < 3.2, while a second jet is required to have pT > 50 GeV. The dijet mass, Δφjj and Δηjj are formed from jet pairs and required to have Mjj > 1100 GeV, Δφjj < 2.0 and Δηjj >4.0.
eps png pdf contact: BenCarlson, ChristopherHayes, AndrewAukerman
The trigger efficiency is shown for the high-level trigger HLT_j70_j50_0eta490_invm1100j70_dphi20_deta40_L1MJJ-500-NFF as a function of the offline maximum dijet mass, Mjjmax. The efficiency is measured using events that are selected using a single muon trigger with a threshold of 27 GeV. In addition, the events are required to have at least two R=0.4 anti-kt jets, where one jet has pT > 90 GeV and |η| < 3.1, while another jet has pT > 80 GeV. The angular requirements Δφjj < 2.0 and Δηjj >4.0 are also applied offline, as in the trigger requirement.
eps png pdf contact: BenCarlson, ChristopherHayes, AndrewAukerman

2017 pp at 13 TeV

Trigger rates and bandwidth for trigger-level analysis

Output bandwidth of the High Level Trigger (HLT) as a function of time in an LHC fill taken in October 2017 with a peak luminosity of L = 1.54 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 59. Presented are the total HLT output bandwidth for all trigger streams (solid line) and only that for the Trigger Level Analysis (TLA) stream (dashed line). The total HLT output bandwidth includes data from all ATLAS trigger streams, including the TLA stream and the main physics stream which selects events that contain all physics objects and full detector information. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the TLA output bandwidth is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This occurs when the instantaneous luminosity has dropped to L = 1.0 x 10^34 cm^-2 s^-1 (at 09:25).
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Output bandwidth of the High Level Trigger (HLT) as a function of time in an LHC fill taken in October 2017 with a peak luminosity of L = 1.54 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 59. Presented are the total HLT output bandwidth for all trigger streams (solid line) and only that for the Trigger Level Analysis (TLA) stream (dashed line). The total HLT output bandwidth includes data from all ATLAS trigger streams, including the TLA stream and the main physics stream which selects events that contain all physics objects and full detector information. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the TLA output bandwidth is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This occurs when the instantaneous luminosity has dropped to L = 1.0 x 10^34 cm^-2 s^-1 (at 09:25).
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Output rate of the High Level Trigger (HLT) as a function of time in an LHC fill taken in October 2017 with a peak luminosity of L = 1.54 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 59. Presented are the total HLT output rate for all trigger streams (solid line) and only that for the Trigger Level Analysis (TLA) stream (dashed line). The total HLT output rate includes data from all ATLAS trigger streams, including the TLA stream and the main physics stream which selects events that contain all physics objects and full detector information. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the TLA HLT output rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This occurs when the instantaneous luminosity has dropped to L = 1.0 x 10^34 cm^-2 s^-1 (at 09:25).
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Total L1 output rate as a function of time in an LHC fill taken in October 2017 with a peak luminosity of L = 1.54 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 59, with the LHC instantaneous luminosity overlaid. The L1 output rate includes events selected for data-taking of full events in the regular data stream and events selected for the Trigger Level Analysis (TLA) stream. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the L1 rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This occurs when the instantaneous luminosity has dropped to L = 1.0 x 10^34 cm^-2 s^-1 (at 09:25).
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen
Total HLT output rate and bandwidth overlaid, as a function of time in an LHC fill taken in October 2017 with a peak luminosity of L = 1.54 x 10^34 cm^-2 s^-1 and a mean number of interactions per crossing peaking at 59. The HLT rate and bandwidth include events selected for data-taking of full events in the regular data stream and events selected for the Trigger Level Analysis (TLA) stream. The TLA stream only selects events containing jets and saves the HLT jets and limited event information, leading to a much smaller event size than in regular data-taking. The exponential decay of the output bandwidth is due to the decreasing luminosity during an LHC fill, while dips are due to deadtime. The increase of the HLT output rate is part of the end-of-fill strategy of the ATLAS trigger. At the end of the LHC fill, L1 and CPU resources are available to reconstruct and record additional events using lower-threshold TLA triggers. This increases the total HLT output rate, but does not affect the total output bandwidth significantly due to the small event size of TLA events. This occurs when the instantaneous luminosity has dropped to L = 1.0 x 10^34 cm^-2 s^-1 (at 09:25).
png eps contact: CaterinaDoglioni, CharlesWilliamKalderon, KateWhalen

Trigger Operations plots from Trigger Menu in 2017 Pub Note

Link to pub note: ATL-DAQ-PUB-2018-002

The average recording rate of the main physics data stream and the B-physics and light states data stream for each ATLAS pp run taken in 2017. The total average of all runs is indicated as a red dotted line, and the total average of the main physics stream is indicated as a blue dotted line.
png pdf
Trigger stream rates as a function of time in a fill taken in August 2017 with a peak luminosity of L = 1.7 × 10^34 cm^-2 s^-1 and a peak average interactions per crossing of <μ>=49.
png pdf
Output bandwidth at the HLT as a function of time in a fill taken in August 2017 with a peak luminosity of L = 1.7 × 10^34 cm^-2 s^-1 and a peak average interactions per crossing of <μ>=49. The monitoring stream is not reflected in the output bandwidth as the monitoring data is handled differently.
png pdf

Trigger rates and bandwidth

Internal link: ATL-COM-DAQ-2017-076

Physics trigger group rates at the High Level Trigger (HLT) as a function of time in a fill taken in June 2017 with a peak luminosity of L = 1.53 x 10^34 cm^-2 s^-1 and a peak pile-up of μ = 43. Presented are the rates of the individual trigger groups specific to trigger physics objects. Each of the groups contains single-object and multi-object triggers of the same object. Overlaps are only accounted for in the total Main Physics Stream rate. The B-physics and Light States (LS) triggers are mainly muon-based trigger algorithms; the majority of the triggers are written to the B-physics and LS stream and are thus not included here. The combined group represents multiple triggers of different objects, as combinations of electrons, muons, taus, jets and missing transverse energy. A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to a change of prescales to optimise the bandwidth usage or LHC luminosity re-optimisations; dips are due to deadtime and spikes are caused by detector noise.
png pdf contact: HeatherRussell, MarkStockton, ElisabettaPianori
Physics trigger rates at the first trigger level (L1) as a function of time in a fill taken in June 2017 with a peak luminosity of L = 1.53 x 10^34 cm^-2 s^-1 and a peak pile-up of μ = 43. Presented are the rates of some representative triggers, which have not been prescaled, for single-object L1 trigger items. These triggers are based on electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing energy (XE) and tau candidates (TAU). The number in the trigger name denotes the trigger threshold in GeV or the multiplicity. The other text refers to details of the selection: varying thresholds (V), hadronic isolation (HI), isolation (I), and separation in eta and phi (DR). A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to LHC luminosity re-optimisations; dips are due to deadtime and spikes are caused by detector noise.
png pdf contact: HeatherRussell, MarkStockton, ElisabettaPianori
Physics trigger rates at the first trigger level (L1) as a function of time in a fill taken in June 2017 with a peak luminosity of L = 1.53 x 10^34 cm^-2 s^-1 and a peak pile-up of μ = 43. Presented are the rates of some representative triggers, which have not been prescaled, for multi-object L1 trigger items. These triggers are based on electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing energy (XE) and tau candidates (TAU). The number in the trigger name denotes the trigger threshold in GeV or the multiplicity. The other text refers to details of the selection: varying thresholds (V), hadronic isolation (HI), isolation (I), and separation in eta and phi (DR). A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to LHC luminosity re-optimisations; dips are due to deadtime and spikes are caused by detector noise.


png pdf contact: HeatherRussell, MarkStockton, ElisabettaPianori
Contribution of the various streams to the total output bandwidth at the High Level Trigger (HLT) for a fill taken in June 2017 with a peak luminosity of L = 1.53 x 10^34 cm^-2 s^-1 and a peak pile-up of μ = 43. Presented are the main physics stream rate, containing all triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the express stream, for data quality monitoring; other minor streams with physics applications, such as zero-bias events; the trigger level analysis stream; and the detector calibration streams. Higher HLT output rates are achieved by partial Event Building (EB) of some dedicated streams.
png contact: HeatherRussell, MarkStockton, ElisabettaPianori
Contribution of the various streams to the total rate at the High Level Trigger (HLT) for a fill taken in June 2017 with a peak luminosity of L = 1.53 x 10^34 cm^-2 s^-1 and a peak pile-up of μ = 43. Presented are the main physics stream rate, containing all triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the express stream, for data quality monitoring; other minor streams with physics applications, such as zero-bias events; the trigger level analysis stream; the detector calibration streams; and the detector monitoring streams. Higher HLT output rates are achieved by partial Event Building (EB) of some dedicated streams.


png contact: HeatherRussell, MarkStockton, ElisabettaPianori
Physics trigger stream rates at the High Level Trigger (HLT) as a function of time in a fill taken in June 2017 with a peak luminosity of L = 1.53 x 10^34 cm^-2 s^-1 and a peak pile-up of μ = 43. Presented are the main physics stream rate, containing all triggers for physics analyses; the B-physics and light states (LS) stream, containing triggers specific to B-physics analyses; the express stream, for data quality monitoring; other minor streams with physics applications, such as zero-bias events; the trigger level analysis stream; the detector calibration streams; and the detector monitoring streams. Higher HLT output rates are achieved by partial Event Building (EB) of some dedicated streams. A feature common to all rates is the exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to a change of prescales to optimise the bandwidth usage or LHC luminosity re-optimisations; dips are due to deadtime and spikes are caused by detector noise.
png pdf contact: HeatherRussell, MarkStockton, ElisabettaPianori

L1Topo Commissioning

Link to CDS: ATL-COM-DAQ-2017-046

The rate of a first trigger level (L1) item that selects two muons, each with transverse momentum above 6 GeV, with (blue) and without (red) the additional L1Topo requirement. The L1Topo requirement is that the L1 muons form an invariant mass 2 GeV < mμμ < 9 GeV and have an angular separation 0.2 < ΔR < 1.5. The rate is given as a function of the number of luminosity blocks, which on average correspond to 60 s, in a run taken in June 2017 with a peak luminosity of L = 7.9 x 10^33 cm^-2 s^-1 and an average pile-up of μ = 46.4. The overall reduction of the rate due to the L1Topo requirement is approximately a factor of four.
png pdf contact: OlyaIgonkina, DavideGerbaudo
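The dimuon requirement quoted above can be illustrated with a few lines: compute the pair invariant mass (massless approximation for the L1 muons) and the ΔR separation, and accept only pairs inside both windows. This sketch shows the selection logic only, not the L1Topo firmware.

```python
# Illustrative sketch of the L1Topo dimuon mass and ΔR windows (toy inputs).
import math

def dphi(phi1, phi2):
    d = abs(phi1 - phi2) % (2.0 * math.pi)
    return d if d <= math.pi else 2.0 * math.pi - d

def pass_l1topo_dimu(mu1, mu2):
    """mu1, mu2: (pt [GeV], eta, phi) of L1 muon candidates with pT > 6 GeV."""
    (pt1, eta1, phi1), (pt2, eta2, phi2) = mu1, mu2
    if pt1 <= 6.0 or pt2 <= 6.0:
        return False
    m = math.sqrt(2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(dphi(phi1, phi2))))
    dr = math.hypot(eta1 - eta2, dphi(phi1, phi2))
    return 2.0 < m < 9.0 and 0.2 < dr < 1.5

# A J/psi-like pair: soft muons with a small opening angle (m ~ 5 GeV, ΔR ~ 0.6)
print(pass_l1topo_dimu((8.0, 0.4, 1.0), (7.0, 0.9, 1.4)))  # True
```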

Link to CDS: ATL-COM-DAQ-2017-078

The ratio of a first-level topological trigger (L1Topo) item rate to the associated first-level trigger (L1) item rate. This measurement (red curve) is compared with predictions based on a linear pileup regression (green curve) for LAR-EM/EM20VH. LAR-EM is the EM50 trigger with additional angular requirements, 0 < η < 1.4 and 9π/16 < φ < 11π/16. EM50 (EM20VH) is a trigger for an electron or photon with a threshold of 50 GeV (20 GeV). This ratio provides a comprehensive metric to assess the performance of the L1Topo trigger system and is monitored in real time.
tiff contact: TaeMinHong, AndyAukerman
The ratio of a first-level topological trigger (L1Topo) item rate to the associated first-level trigger (L1) item rate. This measurement (red curve) is compared with predictions based on a linear pileup regression (green curve) for DY-BOX-2MU6/MU10. DY-BOX-2MU6 is the 2MU6 trigger with additional angular requirements, |Δη| < 0.5 and |Δφ| < 0.5. 2MU6 (MU10) is a trigger for two muons (one muon) each with a threshold of 6 GeV (10 GeV). This ratio provides a comprehensive metric to assess the performance of the L1Topo trigger system and is monitored in real time.
tiff contact: TaeMinHong, AndyAukerman
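As an illustration of the linear pile-up regression mentioned above, the following minimal sketch (not ATLAS monitoring code; the per-luminosity-block values are hypothetical) fits the measured ratio as a linear function of the average pile-up and uses the fit to predict the ratio:

```python
# Minimal sketch (hypothetical data): fit the L1Topo/L1 rate ratio as a
# linear function of the average pile-up <mu> and use the fit for prediction.
import numpy as np

mu    = np.array([20.0, 25.0, 30.0, 35.0, 40.0, 45.0])   # average pile-up per LB
ratio = np.array([0.21, 0.23, 0.26, 0.28, 0.31, 0.33])   # measured L1Topo/L1 rate ratio

# Linear regression: ratio ~ a + b * mu (polyfit returns [slope, intercept])
b, a = np.polyfit(mu, ratio, 1)

def predicted_ratio(mu_value: float) -> float:
    """Predicted L1Topo/L1 ratio from the linear pile-up model."""
    return a + b * mu_value

print(f"fit: ratio = {a:.3f} + {b:.4f} * mu")
print("prediction at mu = 42:", round(predicted_ratio(42.0), 3))
```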

2016 pp at 13TeV

L1 Trigger Rate

Total Level 1 (L1) rate as a function of time throughout a fill taken in October 2016 with a peak luminosity of L = 1.31 * 1034 cm-2s-1 and a peak pile-up of μ = 42. At the start of the run the rate is artificially increased to a maximum of 97 kHz by adding additional L1 total energy triggers. Subsequently the additional rate is removed and the rate follows an exponential decay with decreasing luminosity during an LHC fill. There are periodic increases due to changes of trigger prescales to optimise the output bandwidth usage, and dips due to temporary detector deadtime at that moment of data taking.
png pdf contact: MarkStockton, ElisabettaPianori

Updated Trigger Rates

The average recording rate of the physics data streams for each ATLAS p-p run in 2016 is shown. The average over the runs is also shown as a dashed line (1.03 kHz). This is an updated version of the plot shown previously (see the section below), now including all p-p runs in 2016.
png pdf contact: KunihiroNagano
The average recording rate of the physics data streams for each ATLAS p-p run in 2016 is shown, separately for the "Main", "Exotic" and "B" physics streams. The average over the runs is shown as a dashed line (1.03 kHz), as is the average for the "Main" stream (0.9 kHz). In the summer (June, July) the streaming configuration was changed so that the "Exotic" and "B" physics streams were separated from the "Main" stream.
png pdf contact: KunihiroNagano
Level-1 trigger rates online (red) compared with predictions based on luminosity-scaling (green) for the five algorithms noted in the plot. The downward spikes correspond to the luminosity optimization done by the LHC.
png pdf eps contact: TaeMinHong, AndrewToddAukerman
pdf:1 pdf:2 pdf:3 pdf:4 pdf:5
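The luminosity-scaling prediction referred to above amounts to scaling a reference rate linearly with the instantaneous luminosity. A minimal sketch, with hypothetical reference values:

```python
# Minimal sketch (hypothetical values): predict a Level-1 rate by scaling a
# reference measurement linearly with the instantaneous luminosity.
L_REF = 1.0e34        # cm^-2 s^-1, luminosity at which the reference rate was measured
RATE_REF = 18.0e3     # Hz, measured rate of the trigger item at L_REF

def predicted_rate(lumi: float) -> float:
    """Luminosity-scaled rate prediction: rate(L) = rate_ref * L / L_ref."""
    return RATE_REF * lumi / L_REF

# The prediction follows the luminosity dips from LHC optimisation by construction.
print(predicted_rate(1.2e34))  # -> 21600.0 Hz
```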
Total and main physics stream output bandwidth at the High Level Trigger (HLT) as a function of time throughout a fill taken in July 2016 with a peak luminosity of L = 1.02 * 1034 cm-2s-1 and a peak pile up of μ = 35. The difference in total and main physics stream output bandwidth arises from additional streams that are not shown here. The bandwidth follows an exponential decay with decreasing luminosity during an LHC fill and periodically increases due to changes of trigger prescales to optimise the bandwidth usage. The dips are due to the detector deadtime.
png pdf contact: MarkStockton, CatrinBernius
Contribution of the total output bandwidth of the various streams at the High Level Trigger (HLT) for a fill taken in July 2016 with a peak luminosity of L = 1.02 * 1034 cm-2s-1 and a peak pile up of μ = 35.
png pdf contact: MarkStockton, CatrinBernius
Total and individual stream rates at the High Level Trigger (HLT) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in July 2016 with a peak luminosity of L = 1.02 * 1034 cm-2s-1 and an average pile-up of μ = 24.2. Presented are the main physics stream rate containing all triggers for physics analyses, the express stream for calibration and DQ monitoring, the trigger level analysis stream with partial event building, and the detector monitoring and calibration streams, as a staggered plot. Stream overlaps are only accounted for in the total HLT output rate. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise. Higher HLT output rates can be achieved by partial Event Building (EB) of some dedicated streams.
png pdf eps contact: MartinZurNedden
Physics trigger group rates at the High Level Trigger (HLT) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in July 2016 with a peak luminosity of L = 1.02 * 1034 cm-2s-1 and an average pile-up of μ = 24.2. Presented are the rates of the individual trigger groups specific to trigger physics objects. Overlaps are only accounted for in the total Main Physics Stream rate. The b-jet and tau groups also contain the multi-object triggers. The B-physics triggers are mainly muon-based trigger algorithms. The combined group represents triggers that combine different objects, such as electrons, muons, taus, jets and missing transverse energy. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise.
png pdf eps contact: MartinZurNedden
Physics trigger group rates at the first trigger level (L1) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in July 2016 with a peak luminosity of L = 1.02 * 1034 cm-2s-1 and a peak pile-up of μ = 35. Presented are the rates of the individual L1 trigger groups specific to trigger physics objects at L1. Overlaps are only accounted for in the total L1 output rate. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise.
png pdf eps contact: MartinZurNedden

Other Operational plots

Data Quality Monitoring Display (DQMD) for the ATLAS Trigger System. Over 500 histograms produced by the High Level Trigger (HLT) algorithms are monitored during data-taking. An overview of distributions for electrons reconstructed at the HLT is shown here. An automatic Kolmogorov-Smirnov test compares the shapes of the data distributions (in black) to reference histograms, displayed by the dashed magenta lines. Other automatic DQ tests are available based on standard histogram analysis techniques. The green DQ flags indicate the distributions are shaped as expected. Other possible DQ flags are yellow for alarm and red for problem, indicating different levels of agreement between the online and reference distributions. Troubleshooting information and detailed descriptions are available to aid the shifter by selecting each individual histogram.
png pdf contact: JoanaMachadoMiguens
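The automatic shape comparison described above can be illustrated with a minimal sketch (not the DQMD implementation; the thresholds and samples below are illustrative only) that maps the p-value of a two-sample Kolmogorov-Smirnov test onto a green/yellow/red flag:

```python
# Minimal sketch (not the ATLAS DQMD code): compare a monitored distribution
# with a reference using a two-sample Kolmogorov-Smirnov test and map the
# p-value to a data-quality flag. Thresholds are illustrative only.
from scipy.stats import ks_2samp

def dq_flag(monitored, reference, warn=0.05, error=0.001):
    """Return 'green', 'yellow' or 'red' from a KS comparison of two samples."""
    pvalue = ks_2samp(monitored, reference).pvalue
    if pvalue >= warn:
        return "green"               # shapes agree as expected
    return "yellow" if pvalue >= error else "red"

# Usage with hypothetical electron-ET samples (in GeV):
online    = [24.1, 25.3, 26.8, 30.2, 31.0, 35.7, 40.4, 52.9]
reference = [23.8, 25.0, 27.1, 29.9, 31.5, 36.0, 41.2, 50.3]
print(dq_flag(online, reference))
```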

L1Topo Commissioning

Level 1 rates before prescale versus instantaneous luminosity for L1 di-tau chains. All chains require two isolated hadronic taus with ET > 20,12 GeV. ‘L1_TAU20IM_2TAU12IM_J25_2J20_3J12’ requires the presence of an additional jet with ET >25 GeV based on an object counting performed at CTP. In the ‘DR’ chains, the topological cut ΔR < 2.9 between the two taus is applied by the L1Topo hardware. In ‘DR-TAU20ITAU12I-J25’, the presence of a jet with ET >25 GeV and not overlapping in ΔR < 1 with the two taus is also required at L1Topo. Data have been collected in pp collisions at a center-of-mass energy of 13 TeV in 2016.
png pdf contact: DanieleZanzi
Efficiency of the L1Topo selection as function of the ΔR(τ,τ) between offline tau candidates in di-tau events. The selection implemented at L1 by the L1Topo hardware requires two isolated taus with ET > 20,12 GeV and ΔR < 2.9. The efficiency is computed in data events collected in 2016 and selected by a di-tau trigger with the same energy and isolation selections, but no ΔR cut. The L1Topo selection is fully efficient for di-tau events reconstructed offline with ΔR(τ,τ) < 2.5.
png pdf contact: DanieleZanzi
The rate of a first trigger level (L1) item that selects two muons, each with transverse momentum above 6 GeV, with (red) and without (black) an additional L1Topo requirement. The L1Topo requirement is that the L1 muons form an invariant mass between 2 and 9 GeV and have an opening angle ΔR between 0.2 and 1.5. The rate is given as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in August 2016 with a peak luminosity of L = 2.90*1034 cm-2s-1 and an average pile-up of μ = 31.9. The overall reduction of the rate due to the L1Topo requirement is approximately a factor of four.
png pdf contact: OlyaIgonkina
Invariant mass distribution for pairs of oppositely charged offline reconstructed muons, shown for events passing a nominal HLT trigger chain (triangles) and the equivalent HLT chain with additional L1Topo requirements (circles). The offline muon pairs are fit to a common vertex using the inner detector track parameters. The two trigger chains each require two muons at L1, passing thresholds of pT > 6 GeV, which are confirmed at the HLT, with oppositely charged muons fit to a common vertex using the inner detector track parameters and with invariant mass requirements made to restrict events to the b-hadron invariant mass range. L1Topo makes the additional requirements that the L1 muons form an invariant mass between 2 and 9 GeV and have an opening angle between 0.2 and 1.5. The rate reduction at L1 achieved by L1Topo, approximately a factor of four, results in an efficiency reduction of approximately 12% at the HLT due to the selection requirements and the difference in the resolution of the muon direction reconstruction at L1 and the HLT.
png pdf contact: OlyaIgonkina

Older Trigger Rates + L1Topo Commissioning

The average recording rate of the physics data streams for each ATLAS run is shown. The average of the runs is also shown as a dashed line (1 kHz).
png eps pdf contact: KunihiroNagano
Physics trigger group rates at the first trigger level (L1) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in May 2016 with a peak luminosity of L = 3.56 * 1033 cm-2s-1 and an average pile-up of μ = 21.7. Presented are the rates of the individual L1 trigger groups specific to trigger physics objects at L1. Overlaps are accounted for in the total output rate, but not in the individual groups, so the sum of the group rates is higher than the total L1 output rate. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to dead time and spikes are caused by detector noise.
png pdf eps contact: MartinZurNedden
Total and individual stream rates at the High Level Trigger (HLT) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in May 2016 with a peak luminosity of L = 3.56 * 1033 cm-2s-1 and an average pile-up of μ = 21.7. Presented are the main physics stream rate containing all triggers for physics analyses, the express stream for calibration and DQ monitoring, the trigger level analysis stream with partial event building, and the detector monitoring and calibration streams, as a staggered plot. Stream overlaps are accounted for in the total output rate, but not in the individual streams, so the summed stream rates are slightly higher than the total HLT output rate. The overlaps are mainly between the trigger level analysis and the main physics streams. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to dead time and spikes are caused by detector noise.
png pdf eps contact: MartinZurNedden
Using a background data sample collected by calorimeter-based triggers, we calculate efficiency turn-on curves for two L1Topo triggers based on the HT algorithm as a function of the offline-reconstructed HT, the transverse energy sum of jets. HT150_J20_ETA31 (HT190_J15_ETA21) requires the transverse energy sum of jets with pT > 20 (15) GeV and pseudo-rapidity |eta| < 3.1 (2.1) to be above 150 (190) GeV. The full L1Calo stream of run 298595 is used. Events triggered by calorimeter-based triggers (not including topological triggers) are sent to this stream and contain information about the decision of topological triggers. The efficiency of these triggers with respect to the full sample of events is shown as a function of the offline reconstructed HT calculated using the appropriate selection of reconstructed jets.
png pdf eps contact: ImmaRiu
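The efficiency in such a turn-on curve is the fraction of events in each bin of the offline quantity that also pass the trigger. A minimal sketch with hypothetical inputs:

```python
# Minimal sketch (hypothetical inputs): build an efficiency turn-on curve for a
# trigger as a function of the offline-reconstructed HT, as the fraction of
# events in each HT bin that also pass the topological trigger.
import numpy as np

offline_ht = np.array([80., 120., 140., 155., 170., 190., 210., 260., 300., 400.])  # GeV
passed     = np.array([False, False, False, True, False, True, True, True, True, True])

bins = np.arange(0., 500., 50.)                       # 50 GeV bins in offline HT
total, _ = np.histogram(offline_ht, bins=bins)
passing, _ = np.histogram(offline_ht[passed], bins=bins)

with np.errstate(divide="ignore", invalid="ignore"):
    efficiency = np.where(total > 0, passing / total, np.nan)

for lo, hi, eff in zip(bins[:-1], bins[1:], efficiency):
    if not np.isnan(eff):
        print(f"HT in [{lo:.0f}, {hi:.0f}) GeV: efficiency = {eff:.2f}")
```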
Trigger rates as a function of the instantaneous luminosity of two L1Topo triggers based on the HT algorithm, which computes the transverse energy sum of jets. HT150_J20_ETA31 (HT190_J15_ETA21) requires the transverse energy sum of jets with pT > 20 (15) GeV and pseudo-rapidity |eta| < 3.1 (2.1) to be above 150 (190) GeV. The online trigger rates are compared to the prediction obtained by running the trigger simulation on an un-biased data sample.
png pdf eps contact: KunihiroNagano and ImmaRiu

2015 pp at 13TeV

Updated Trigger Rates

Total and main physics stream output bandwidth at the High Level Trigger (HLT) as a function of time throughout a fill taken in October 2015 with a peak luminosity of L = 4.45 * 1034 cm-2s-1 and a peak pile up of μ = 14.7. The difference in total and main physics stream output bandwidth arises from additional streams that are not shown here. The bandwidth follows an exponential decay with decreasing luminosity during an LHC fill and periodically increases due to changes of trigger prescales to optimise the bandwidth usage. The dips are due to the detector deadtime.
png pdf contact: MarkStockton, CatrinBernius
Contribution of the total output bandwidth of the various streams at the High Level Trigger (HLT) for a fill taken in October 2015 with a peak luminosity of L = 4.45 * 1034 cm-2s-1 and a peak pile up of μ = 14.7.
png pdf contact: MarkStockton, CatrinBernius
Total and individual stream rates at the High Level Trigger (HLT) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and an average pile-up of μ = 15. Presented are the main physics stream rate containing all triggers for physics analyses, the express stream for calibration and DQ monitoring, the trigger level analysis stream with partial event building, and the detector monitoring and calibration streams, as a staggered plot. Stream overlaps are only accounted for in the total HLT output rate. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise. Higher HLT output rates can be achieved by partial Event Building (EB) of some dedicated streams.
png pdf eps contact: MartinZurNedden
Physics trigger group rates at the High Level Trigger (HLT) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and a peak pile-up of μ = 14.7. Presented are the rates of the individual trigger groups specific to trigger physics objects. Overlaps are only accounted for in the total Main Physics Stream rate. The b-jet and tau groups also contain the multi-object triggers. The B-physics triggers are mainly muon-based trigger algorithms. The combined group represents triggers that combine different objects, such as electrons, muons, taus, jets and missing transverse energy. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise.
png pdf eps contact: MartinZurNedden
Physics trigger group rates at the first trigger level (L1) as a function of the number of luminosity blocks, which correspond on average to 60 s per luminosity block, in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and a peak pile-up of μ = 14.7. Presented are the rates of the individual L1 trigger groups specific to trigger physics objects at L1. Overlaps are only accounted for in the total L1 output rate. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill. The rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise.
png pdf eps contact: MartinZurNedden
The main ATLAS triggers used for runs with peak luminosity L = 5 * 1033 cm-2s-1. The total rate corresponds to the full menu, which includes many more triggers than are listed in this table. Electron and tau identification is assumed to be of 'medium' flavour, unless specified otherwise. b-jet identification is assumed to be of 'tight' flavour, unless specified otherwise. For b-physics triggers, dedicated selections for J/psi, Upsilon and B mesons are applied. The typical offline cuts are only indicative.
pdf contact: AnnaSfyrla
The fixed frequency veto of the innermost pixel detector of ATLAS (IBL) protecting against irreparable damage due to resonant vibration of the wire bonds has a direct impact on the maximal tolerable rate at the first trigger level (L1). This limit depends on the number of colliding bunches in ATLAS and on the filling scheme of the LHC beams. This plot presents the simulated rate limits of the L1 trigger from the IBL for two different filling schemes and the expected maximal L1 rate from rate predictions. The steps in the latter indicate a change in the prescale strategy. The simulated rate limit is confirmed with experimental tests. The rate limitation was only critical for the lower luminosity phase, where the required physics L1 rate was higher than the limit imposed by the IBL veto. Tighter prescales were then used.
png pdf eps contact: MartinZurNedden and CatrinBernius

Other Operational plots

The ATLAS trigger system records complete information from online trigger processing with each accepted event. This information is then copied to the Analysis Object Data (AOD) format, which is created by offline reconstruction algorithms. In the new ATLAS analysis model adopted in 2015, the AOD payload is reduced using analysis-specific filters that keep the minimum information necessary for each type of analysis. These filters include a reduction of the trigger payload by keeping only the information necessary for matching particle candidates selected by a given analysis with particle candidates selected by the online trigger system. This reduction of the trigger payload is achieved using a navigation framework which associates selected trigger candidates with logical trigger requirements. The plot shows a typical reduction, in percent, of the trigger navigation payload size for specific representative filters used by physics analyses. In each case information related to a broad range of triggers is saved.
eps contact: RyanWhite

Older Trigger Rates

Total and individual stream rates at the High Level Trigger as a function of the instantaneous luminosity in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and an average pile up of μ = 15. Presented are the total High Level Trigger output rate, the main physics stream rate containing all triggers for physics analyses, the trigger level analysis stream with partial event building, and the detector monitoring and calibration streams. Stream overlaps are accounted for in the total output rate.
png pdf eps contact: MartinZurNedden and CatrinBernius
Total and individual stream rates at the High Level Trigger as a function of the instantaneous luminosity in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and an average pile up of μ = 15. Presented are the main physics stream rate containing all triggers for physics analyses, the express stream for calibration DQ monitoring, the trigger level analysis stream with partial event building, and the detector monitoring and calibration streams as staggered plot. Stream overlaps are accounted for in the total output rate, but not in the individual streams leading to a slightly higher recording rate compared to the total HLT output rate. The overlaps are mainly between the trigger level analysis and the main physics streams.
png pdf eps contact: MartinZurNedden and CatrinBernius
Physics trigger group rates at the High Level Trigger as a function of the instantaneous luminosity in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and an average pile-up of μ = 15. Presented are the rates of the individual trigger groups specific to trigger physics objects. Overlaps are accounted for in the total output rate, but not in the individual groups, so the summed group rates are higher than the total HLT output rate. The b-jet and tau groups also contain the multi-object triggers. The B-physics triggers are mainly muon-based trigger algorithms. The combined group represents triggers that combine different objects, such as electrons, muons, taus, jets and missing transverse energy.
png pdf eps contact: MartinZurNedden and CatrinBernius
Individual First Level Trigger rates as a function of the instantaneous luminosity in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and an average pile-up of μ = 15. Presented are some representative first level trigger rates of single object triggers based on electromagnetic clusters (EM), muon candidates (MU), jet candidates (J), missing energy (XE) and tau candidates (TAU). The number in the trigger name denotes the trigger threshold in GeV.
png pdf eps contact: MartinZurNedden and CatrinBernius
Individual First Level Trigger rates as a function of the instantaneous luminosity in a fill taken in October 2015 with a peak luminosity of L = 4.6 * 1033 cm-2s-1 and an average pile up of μ = 15. Presented are some representative first level trigger rates of multiple object triggers based on electromagnetic clusters (EM), muon candidates (MU), jet candidates (J) and tau candidates (TAU). The number after the trigger name denotes the trigger threshold in GeV, whereas the number before stands for the object multiplicity.
png pdf eps contact: MartinZurNedden and CatrinBernius

2013 pPb

Trigger Rates

Proton-lead run with a peak luminosity of 1.1x10^29 cm-2s-1 and an integrated luminosity of 1.5 nb-1. Output rates for L1, L2 and EF are shown as a function of time. The proton-lead interaction rate is 200 kHz at the peak and is thus substantially reduced by the L1 trigger. The L1 output rate is dominated by minimum bias (L1_MBTS_1_1) and high-multiplicity (L1_TE35) triggers, as well as various high-pT electron (L1_EM5), muon (L1_MU0) and jet (L1_J10) triggers. The EF output rate is kept constant at the level of 450 Hz throughout the run. The stable-beam period is indicated by the solid histograms.
png contact: Iwona and Tomasz Bold
Proton-lead run with a peak luminosity of 1.1x10^29 cm-2s-1 and an integrated luminosity of 1.5 nb-1. Output rates for EF and EF physics, and for the MinBias, Hard Probes (high-pT jets, muons, electrons and photons) and UPC (ultra-peripheral collisions) data, are shown as a function of time. The EF physics rate is the sum of the MinBias, HardProbes and UPC rates. The EF output rate on top of the EF physics rate includes calibration triggers. The EF physics rate is kept constant throughout the run at the level of 400 Hz. The stable-beam period is indicated by the solid histograms.
png contact: Iwona and Tomasz Bold

2012 @ 8TeV

HLT algorithms execution caching

This plot illustrates the effect of the Region of Interest (RoI) based execution caching mechanism for trigger algorithms executing at the second level trigger (L2) of the ATLAS trigger. The caching mechanism reuses results from earlier executions of the same algorithm when it is shared by other triggers present in the trigger configuration. The caching works in the context of the same RoI and records the inputs and outputs of each algorithm execution. The plot shows the number of L2 algorithm executions per event (Nexec) for cases when the caching layer is enabled (black hatched) and when the caching layer is disabled (red unfilled). The comparison has been performed by running the trigger algorithms offline on a sample of events recorded in October 2012 at a luminosity of ~1.5-6 x 1034 cm-2 s-1, using the same trigger configuration as was used for the run in which these data were recorded.
png pdf contact: TomaszBold
This plot illustrates the effect of the Region of Interest (RoI) based execution caching mechanism for trigger algorithms executing at the second level trigger (L2) of the ATLAS trigger. The caching mechanism reuses results from earlier executions of the same algorithm when it is shared by other triggers present in the trigger configuration. The caching works in the context of the same RoI and records the inputs and outputs of each algorithm execution. The plot shows the L2 execution time per event for cases when the caching layer is enabled (black hatched) and when the caching layer is disabled (red unfilled). The comparison has been performed by running the trigger algorithms offline on a sample of events recorded in October 2012 at a luminosity of ~1.5-6 x 1034 cm-2 s-1, using the same trigger configuration as was used for the run in which these data were recorded.
png pdf contact: TomaszBold
This plot illustrates the effect of the Region of Interest (RoI) based execution caching mechanism for trigger algorithms executing at the second level trigger (L2) of the ATLAS trigger. The caching mechanism reuses results from earlier executions of the same algorithm when it is shared by other triggers present in the trigger configuration. The caching works in the context of the same RoI and records the inputs and outputs of each algorithm execution. The plot shows the L2 execution time per accepted event for cases when the caching layer is enabled (black hatched) and when the caching layer is disabled (red unfilled). The comparison has been performed by running the trigger algorithms offline on a sample of events recorded in October 2012 at a luminosity of ~1.5-6 x 1034 cm-2 s-1, using the same trigger configuration as was used for the run in which these data were recorded.
png pdf contact: TomaszBold
This plot illustrates the effect of the Region of Interest (RoI) based execution caching mechanism for trigger algorithms executing at the second level trigger (L2) of the ATLAS trigger. The caching mechanism reuses results from earlier executions of the same algorithm when it is shared by other triggers present in the trigger configuration. The caching works in the context of the same RoI and records the inputs and outputs of each algorithm execution. The plot shows the L2 execution time per rejected event for cases when the caching layer is enabled (black hatched) and when the caching layer is disabled (red unfilled). The comparison has been performed by running the trigger algorithms offline on a sample of events recorded in October 2012 at a luminosity of ~1.5-6 x 1034 cm-2 s-1, using the same trigger configuration as was used for the run in which these data were recorded. The sample of input events used for these tests was recorded in a special 'enhanced bias' stream. This stream contains events that were selected using special triggers which accept the main L1 physics triggers without further HLT selection. These special triggers were then disabled for the offline HLT tests, resulting in an HLT rejection comparable to that obtained in typical physics data taking.
png pdf contact: TomaszBold
This plot illustrates the effect of the Region of Interest (RoI) based execution caching mechanism for trigger algorithms executing at the second level trigger (L2) of the ATLAS trigger. The caching mechanism reuses results from earlier executions of the same algorithm when it is shared by other triggers present in the trigger configuration. The caching works in the context of the same RoI and records the inputs and outputs of each algorithm execution. The plot shows the L2 result data size per event for cases when the caching layer is enabled (black hatched) and when the caching layer is disabled (red unfilled). The result data contain the trigger decisions and auxiliary data objects, such as clusters and tracks, required during the evaluation of the trigger decision for the event. The comparison has been performed by running the trigger algorithms offline on a sample of events recorded in October 2012 at a luminosity of ~1.5-6 x 1034 cm-2 s-1, using the same trigger configuration as was used for the run in which these data were recorded.
png pdf contact: TomaszBold
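The caching mechanism described in these captions can be sketched as a per-event memoisation keyed on the algorithm and the RoI; the class and names below are hypothetical and only illustrate the idea, not the ATLAS steering implementation:

```python
# Minimal sketch (not the ATLAS steering code): memoise algorithm results per
# (algorithm, Region-of-Interest) so that triggers sharing the same algorithm
# and the same RoI reuse the first execution instead of re-running it.
class RoICache:
    def __init__(self):
        self._cache = {}      # (algorithm name, RoI id) -> cached output
        self.executions = 0   # count of real executions (what Nexec measures)

    def run(self, algorithm, roi):
        key = (algorithm.__name__, roi)
        if key not in self._cache:          # first request for this RoI in the event
            self._cache[key] = algorithm(roi)
            self.executions += 1
        return self._cache[key]             # later requests are served from the cache

    def clear(self):
        """Called at the start of each event."""
        self._cache.clear()

# Hypothetical usage: two trigger chains requesting tracking in the same RoI.
def l2_tracking(roi):
    return f"tracks in RoI {roi}"

cache = RoICache()
cache.run(l2_tracking, roi=7)    # executed
cache.run(l2_tracking, roi=7)    # cached, no second execution
print(cache.executions)          # -> 1
```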

Trigger Rates

Event Filter stream recording rates per month, averaged over the periods for which the LHC declared stable beams. Special run periods such as VdM scans or ALFA runs have been left out.
png pdf contact: BrianPetersen

Level-1 rates for the forward-jet triggers up to 7.3x10^33 cm-2 s-1. 2FJ15 is a trigger for two forward jets (|η|>3.2) with a 15 GeV threshold (one such object is called FJ15); EM14VH_FJ15 is a two-object trigger for an electron or a photon above 14 GeV and an FJ15 item; 3J15_FJ15 is a four-object trigger for three central jets (|η|<3.2) with a 15 GeV threshold and an FJ15; 2TAU8_TAU11I_EM10VH_FJ15 is effectively a three-object trigger for an isolated hadronically decaying tau lepton above 11 GeV, an electron or a photon above 10 GeV, and an FJ15; MU10_FJ15 is a two-object trigger for a muon with a two-station coincidence above 10 GeV and an FJ15 item. The forward jets are very susceptible to pile-up effects. This is mitigated by the use of energy threshold requirements that are adjusted for a peak pile-up scenario corresponding approximately to a luminosity of 5x10^33 cm-2 s-1. This leads to a change in the behavior of the triggers above that luminosity. The jet thresholds are quoted at the electromagnetic scale, so the response to hadronic energy is lower. The jet thresholds at 15 GeV are efficient for offline jets with ET above 50 GeV; the tau threshold at 11 GeV is efficient above 40 GeV.


eps pdf contact: TaeMinHong
Level-1 rates for the lowest-threshold unprescaled single object triggers at 7.8x10^33 cm-2 s-1. EM18VH is a trigger for an electron or photon with a threshold at 18 GeV; MU15 is a trigger for a muon with a threshold near 15 GeV, and requires a three-station coincidence in the barrel or endcap of the detector; TAU40 is a trigger for a hadronically decaying tau above 40 GeV; XE40 is a trigger for missing ET above 40 GeV; and J75 is a trigger for a jet above 75 GeV.
eps pdf contact: TaeMinHong
Level-1 rates for the lowest-threshold unprescaled combined object triggers at 7.8x10^33 cm-2 s-1. 2EM10VH is a trigger for two electrons or photons each with a threshold at 10 GeV; 2MU6 is a trigger for two muons each with a threshold near 6 GeV, and requires a two-station coincidence in the barrel or the endcap of the detector; 2TAU11I_TAU15 is a trigger for two hadronically decaying tau leptons above 11 GeV with an isolation requirement, one of which is above 15 GeV; J50_XE35 is a trigger for a jet above 50 GeV and missing ET above 35 GeV; and 4J15 is a trigger for four jets each above 15 GeV.


eps pdf contact: TaeMinHong
Event Filter stream recording rates from ATLAS run 209183 with a peak luminosity of 7.2 x 1033 cm-2s-1. Filtered for LHC stable beams and ATLAS ready. The x-axis has an arbitrary offset.
eps pdf contact: PeterRadloff
Level-1 Trigger cross-sections (rate/luminosity) for a selection of L1Calo-based trigger items. The left side of the figure corresponds to measurements from two 7TeV runs with 2011 nominal per-bunch luminosities, and colliding bunches delivered in bunch trains with 50ns spacing. The right side of the figure corresponds to a special high-luminosity 7TeV run with no bunch trains. The middle of the figure corresponds to an 8TeV run with 2012 nominal per-bunch luminosities and 50ns bunch trains. The falls in rate for XE50 and FJ75 triggers between 2011 and 2012 runs are due to trigger noise-cut increases in the forward regions of L1Calo. All other rate changes (increases) are due to the increased centre-of-mass energy. EM16 (EM30) is an electron-photon trigger with a threshold at 16 (30) GeV.

EM16VH is an electron-photon trigger with an hadronic layer energy veto and varied thresholds across the calorimeter, with typically 16 GeV thresholds. TAU15 is an hadronically-decaying tau trigger with threshold at 15 GeV. XE50 is a trigger for missing ET above 50 GeV at the EM scale. XE50_BGRP7 is an XE50 trigger with a veto on the first 3 bunches of a bunch train. J75 is a trigger for a central jet (|η|<3.2) with ET above 75 GeV. FJ75 is a trigger for a jet in the forward region (|η|>3.2) with ET above 75GeV. 4J10 is a trigger for four central jets with ET above 10GeV.

png eps
png (without fixed-rate lines) eps (without fixed-rate lines)
contact: Will Buttinger
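A trigger cross-section as plotted here is simply the rate divided by the instantaneous luminosity; a minimal sketch with hypothetical numbers (using 1 nb = 1e-33 cm2):

```python
# Minimal sketch (hypothetical numbers): a trigger cross-section is the rate
# divided by the instantaneous luminosity; dividing out the luminosity removes
# the trivial rate change between runs with different per-bunch luminosities.
def trigger_cross_section_nb(rate_hz: float, lumi_cm2s: float) -> float:
    """Cross-section in nb for a given rate [Hz] and luminosity [cm^-2 s^-1]."""
    sigma_cm2 = rate_hz / lumi_cm2s
    return sigma_cm2 / 1e-33           # 1 nb = 1e-33 cm^2

# Example: a 5 kHz item at L = 5e33 cm^-2 s^-1 corresponds to ~1000 nb.
print(trigger_cross_section_nb(5.0e3, 5.0e33))
```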
Level-1 Trigger cross-sections (rate/luminosity) for a selection of L1Muon-based trigger items. The left side of the figure corresponds to measurements from two 7TeV runs with 2011 nominal per-bunch luminosities, and colliding bunches delivered in bunch trains with 50ns spacing. The right side of the figure corresponds to a special high-luminosity 7TeV run with no bunch trains. The middle of the figure corresponds to an 8TeV run with 2012 nominal per-bunch luminosities and 50ns bunch trains. The falls in rate for MU4, MU10 and MU11 triggers in the left of the figure are due to a reoptimization of the associated trigger co-incidence windows. The fall in rate for MU10, MU11, MU15 and MU20 between 2011 and 2012 are due to further co-incidence window reoptimizations, and additional shielding installed in the region between the experiment’s barrel and endcap regions. Other trigger rate changes (increases) are due to the change in centre-of-mass energy. Rate decreases in the special high luminosity run are due to the long gap between colliding bunches.

MU0-MU10 are triggers for muons with pT above the indicated thresholds (in GeV). MU0 uses a full-open loosest coincidence, MU4-MU10 require co-incident hits in at least two RPC stations, or three TGC stations. MU11-MU20 require co-incident hits in all three stations of the trigger system in all regions. 2MU4 (2MU6) is a trigger for two muons with pT above 4 (6) GeV.

png eps
png (without fixed-rate lines) eps (without fixed-rate lines)
contact: Will Buttinger
Trigger rates per bunch crossing, relative to the rate in bunch crossing ID 141, measured across one of the bunch trains of a typical 50ns-spacing LHC fill. L1_XE25 is a missing transverse energy trigger. L1_EM30 is a single electron/photon trigger requiring an ET threshold of 30 GeV. L1_MU10 is a single muon trigger with a pT threshold of 10 GeV. The L1_XE25 trigger has a significantly higher rate near the start of bunch trains due to the unbalanced overlaying of bipolar calorimeter signal shapes from neighbouring bunches. The lower rate in the first bunch for L1_MU10 is due to a contribution to the rate in later bunches from delayed hits in the muon trigger chambers.
png eps
contact: Will Buttinger
Data trigger output and recording rate at ATLAS at a luminosity of 6.4x10e33. Event filter recording rate includes delayed streams. (Internal: LHC Fill 2686)
eps pdf contact: PeterRadloff
Event Filter stream recording rates from 2012. Filtered for LHC stable beams and ATLAS ready.
eps pdf contact: PeterRadloff
A snapshot of the TRP (Trigger Rate Presenter) monitoring display as seen by a shifter in the ATLAS control room. Top: output rates of the three levels (L1, L2, EF) of the trigger system. Bottom: EF recording and streaming rates for the main physics, calibration, cosmic and express streams; due to overlaps between inclusive streams the EF recording rate exceeds its output rate. A common feature of all rates is their exponential decay with decreasing luminosity during an LHC fill that lasted ~20 hours. Within this overall decay the rates periodically increase due to changes of prescales to optimise the bandwidth usage; dips are due to deadtime and spikes are caused by detector noise.
eps pdf contact: IvanaHristova

2011 HI @ 2.76TeV

Trigger Rates

Heavy ion run with a peak luminosity of 3.5x10^26 cm-2s-1 and an integrated luminosity of 5.1 μb-1. Output rates for L1, L2 and EF are shown as a function of time. The L1 rate is dominated by the rates of L1_ZDC (an OR between triggers from sides A and C), L1_ZDC_A_C (an AND between triggers from sides A and C) and a random trigger, which is filtered by L2 to provide an unbiased sample for trigger efficiency studies. L1_ZDC_A_C is a minimum bias trigger for heavy ion collisions. The EF output rate is kept constant at the level of 200 Hz throughout the run.
png pdf contact: Iwona and Tomasz Bold
Heavy ion run with a peak luminosity of 3.5x10^26 cm-2s-1 and an integrated luminosity of 5.1 μb-1. Output rates for EF and EF physics, and for the MinBias, Hard Probes (high-pT jets, muons, electrons and photons) and UPC (ultra-peripheral collisions) data, are shown as a function of time. The EF physics rate is the sum of the MinBias, HardProbes and UPC rates. The EF output rate on top of the EF physics rate includes calibration triggers. The EF physics rate is kept constant throughout the run at the level of 200 Hz. The bandwidth is filled up with minimum bias events once the Hard Probes rate goes down as the luminosity drops.
png pdf contact: Iwona and Tomasz Bold
Heavy ion run with a peak luminosity of 3.5x10^26 cm-2s-1 and an integrated luminosity of 5.1 μb-1. The output bandwidth for three physics data streams, MinBias, HardProbes (high-pT jets, muons, electrons and photons) and UPC (ultra-peripheral collisions), is shown as a function of time. The EF physics bandwidth is kept constant throughout the run at the level of 500 MB/s. The bandwidth is filled up with minimum bias events once the HardProbes rate goes down as the luminosity drops.
png pdf contact: Iwona and Tomasz Bold

2011 @ 7TeV

Average Stream Rates and Sizes, Trigger Rates

Event Filter stream recording rates, averaged over the periods for which the LHC declared stable beams.
png pdf contact: BrianPetersen
Data stream rates and sizes, separated into physics and calibration data for a given ATLAS run. This illustrates the ATLAS data composition in terms of number of taken events and size of the data in the data pipes and on tape. The express stream is a physics stream with expedited processing. (internal information: run 180636, 2.9pb-1, instantaneous luminosity 7.4x10e32)
eps pdf contact: JoergStelzer
Data trigger output and recording rate at ATLAS at a luminosity of 3.2x10e32. (Internal: LHC Fill 2178)
eps pdf contact: PeterRadloff
Trigger rates as a function of the instantaneous luminosity measured over the year 2011. Displayed are unprescaled primary triggers that are representative of the different signatures.
eps pdf contact: JoergStelzer
Level-1 rates for the lowest-threshold unprescaled single object triggers at 3x10^33 cm-2 s-1. EM16VH is a trigger for an electron-photon with a threshold at 16 GeV; MU11 is a trigger for a muon with a threshold near 10 GeV, and requires a three-station coincidence in both the barrel and the endcap parts of the detector; TAU30 is a trigger for a hadronically decaying tau above 30 GeV; XE50 is a trigger for missing ET above 50 GeV at the EM scale; and J75 is a trigger for a jet above 75 GeV.
eps pdf contact: TaeMinHong
Level-1 trigger cross-sections (rate / luminosity) for an Oct. 22, 2011 run with nominal luminosity per bunch and 12 bunch trains and a special Oct. 25, 2011 run with higher luminosity per bunch and no bunch trains. EM16VH is a trigger for an electron-photon with a threshold at 16 GeV; MU11 is a trigger for a muon with a threshold near 10 GeV, and requires a three-station coincidence in both the barrel and the endcap parts of the detector; TAU30 is a trigger for a hadronically decaying tau above 30 GeV; and J75 is a trigger for a jet above 75 GeV.
eps pdf contact: TaeMinHong
Level-1 trigger cross-sections (rate / luminosity) for an Oct. 22, 2011 run with nominal luminosity per bunch and 12 bunch trains and a special Oct. 25, 2011 run with higher luminosity per bunch and no bunch trains. 2TAU8_EM10VH is a trigger for two hadronically decaying taus each above 8 GeV with at least one of them passing the electron-photon trigger with a threshold at 10 GeV; 2TAU11 is a trigger for two hadronically decaying taus each above 11 GeV; 2MU4 is a trigger for two muons each above 4 GeV; 2EM12 is a trigger for two electron-photons each above 12 GeV; and 4J10 is a trigger for four jets each above 10 GeV.
eps pdf contact: TaeMinHong
Level-1 trigger cross-sections (rate / luminosity) for an Oct. 22, 2011 run with nominal luminosity per bunch and 12 bunch trains and a special Oct. 25, 2011 run with higher luminosity per bunch and no bunch trains. 4J10 is a trigger for four jets each above 10 GeV; 4J15 for four jets each above 15 GeV; 4J20 for four jets each above 20 GeV; and J75 for one jet above 75 GeV.
eps pdf contact: TaeMinHong
Event Filter stream recording rates from 2011. Filtered for LHC stable beams and ATLAS ready.
eps pdf contact: PeterRadloff
Rates from 2011 7-TeV data for single-jet triggers at L1, where |eta_jet| < 3.2. The minimum jet ET, measured in GeV at the electromagnetic calibration scale, is indicated in the trigger label. The rates are before applying pre-scales.
eps contact: FrancescoRubbo
Rates from 2011 7-TeV data for multi-jet triggers at L1, where |eta_jet| < 3.2. The jet multiplicity and the minimum jet ET, measured in GeV at the electromagnetic calibration scale, are indicated in the trigger label. The rates are before applying pre-scales.
eps contact: FrancescoRubbo
Rates from 2011 7-TeV data for single-jet triggers at L2, where |eta_jet| < 3.2. The minimum jet ET, measured in GeV at the hadronic calibration scale, is indicated in the trigger label. The rates are before applying pre-scales.
eps contact: FrancescoRubbo
Rates from 2011 7-TeV data for multi-jet triggers at L2, where |eta_jet| < 3.2. The jet multiplicity and the minimum jet ET, measured in GeV at the hadronic calibration scale, are indicated in the trigger label. The rates are before applying pre-scales.
eps contact: FrancescoRubbo
Rates from 2011 7-TeV data for single-jet triggers at EF, where |eta_jet| < 3.2. The minimum jet ET, measured in GeV at the hadronic calibration scale, is indicated in the trigger label. The rates are before applying pre-scales.
eps contact: FrancescoRubbo
Rates from 2011 7-TeV data for multi-jet triggers at EF, where |eta_jet| < 3.2. The jet multiplicity and the minimum jet ET, measured in GeV at the hadronic calibration scale, are indicated in the trigger label. The two triggers with six-jet multiplicity are seeded by two different L1 requirements. The rates are before applying pre-scales.
eps contact: FrancescoRubbo

Central Trigger Operations and Monitoring

HLT Timing Monitoring

Average time to run the level-2 inner-detector scan algorithm for electron tracking per region of interest (candidate electron from level-1) as a function of the mean number of collisions per crossing.
eps pdf contact: DougSchaefer
Average time to run the level-2 silicon track algorithm for hadronically decaying tau tracking per region of interest (candidate tau from level-1) as a function of the mean number of collisions per crossing.
eps pdf contact: DougSchaefer

Level-1 Accepts Timing Monitoring

The number of bunch crossings (BC) between two level-1 accepts (L1A) for run 186275. The lower histogram is an enlarged view of the upper histogram. They show that there are no new L1A within the simple dead time, which is set to 5 BC. The wave structure originates from the bunch structure, in which bunches are filled every other BC and bunch trains consist of about 80 BC.
png eps

png eps
The number of L1A in 416 BC for run 188902. The complex dead time is set to allow 8 L1A in the 416 BC.
png eps
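The two mechanisms described above (a simple deadtime of 5 BC, and a complex deadtime allowing 8 L1A per 416 BC) can be illustrated with a minimal sketch; the sliding-window model below is a simplification of the real leaky-bucket implementation, and the input bunch crossings are hypothetical:

```python
# Minimal sketch (not the CTP firmware): apply the two vetoes described above
# to a list of candidate Level-1 accepts (L1A), given in bunch crossings (BC).
# Simple deadtime: no new L1A within 5 BC of the previous accepted one.
# Complex deadtime: at most 8 accepted L1A within any window of 416 BC
# (modelled here as a sliding window; the real implementation is a leaky bucket).
from collections import deque

SIMPLE_DEADTIME_BC = 5
WINDOW_BC, MAX_L1A_IN_WINDOW = 416, 8

def apply_deadtime(candidate_bcs):
    accepted, recent = [], deque()
    for bc in sorted(candidate_bcs):
        if accepted and bc - accepted[-1] <= SIMPLE_DEADTIME_BC:
            continue                                   # vetoed by simple deadtime
        while recent and bc - recent[0] >= WINDOW_BC:
            recent.popleft()                           # drop L1A outside the window
        if len(recent) >= MAX_L1A_IN_WINDOW:
            continue                                   # vetoed by complex deadtime
        accepted.append(bc)
        recent.append(bc)
    return accepted

print(apply_deadtime([0, 2, 3, 10, 12, 20, 30, 40, 50, 60, 70, 80, 90, 100]))
```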

LHC bunch timing measured in ATLAS BPTX

Beam phase during spring 2010-summer 2011
The time difference between the LHC bunch arrival time and the LHC RF clock during spring 2010-summer 2011. The bunch arrival times are measured by the beam timing pick-ups (BPTX) for beam 1 and beam 2. The phase includes an arbitrary offset to tune the phase to zero for a particular reference point. A fine-delay of 2 ns was applied to the clock on 25 June and 6 July 2010 in order to shift the beam phase closer to zero. From July 2010 the beam phase has been kept constant within +/- 500 ps. To adjust the timing signals with an even higher precision (10 ps), a new module (CORDE) was installed during the shutdown in 2010.

png eps
LHC clock drift during spring 2010-summer 2011
The time difference between the LHC bunch arrival time and the LHC RF clock during spring 2010-summer 2011. The bunch arrival times are measured by the beam timing pick-ups (BPTX) for beam 1 and beam 2. The phase includes an arbitrary offset to tune the phase to zero for a particular reference point. The beam phase shown in this graph excludes a clock fine-delay that was applied in hardware in order to correct for long-term time drifts. The open circles show a direct propagation delay measurement of an optical signal through spare clock fibers that cover a fraction of the full 13.3 km long clock fiber distance between the RF (Point 4) and Point 1. An appropriate scale factor and shift has been applied to match the curve with the beam phase measurements. The good agreement within 300ps indicates that the long-term drift of the beam phase is due to a drift of the propagation delay in the optical fiber.

png eps

CTP monitoring

Deadtime of different trigger items
This plot shows the Level-1 deadtime fractions for the trigger items L1_MU10, L1_EM14 and L1_TE180 for a long run on 5 June 2011. The deadtime fraction is the ratio of events which were triggered but vetoed to all triggered events. The spikes are due to a DAQ/sub-detector veto. The difference between the deadtime fractions can be explained by a bunch-crossing dependent trigger response (see next plot).

gif eps contact: GabrielAnders
Bunch crossing dependent trigger response
This plot shows the Trigger After Prescale (TAP) rates for the trigger items L1_MU10, L1_EM14 and L1_TE180 in a bunch train. The rates are scaled so that they fit well on one plot. It can be seen that L1_TE180 fires much more often for the first two bunches in a train than for the rest (due to the LAr pulse shape). The first two bunches in a train see a low Simple deadtime, and for this reason the overall deadtime of L1_TE180 is decreased. The variation of the L1_EM14 rate can be explained by the varying luminosity of the bunch crossings. The increased L1_MU10 rate for the second bunch can be explained by late muon triggers. The vetoing of late muon triggers due to the Simple deadtime enhances the overall L1_MU10 deadtime.

gif eps contact: GabrielAnders
Bunch crossing dependent deadtime
The left part of the plot shows the contributions of Simple and Sub-detector/DAQ deadtime to the total deadtime fraction for different bunches in a train. The right part of the plot shows the distribution of the deadtime fractions for all bunch crossings. The three-peak structure originates from the Simple deadtime: the first bunch in a train sees no Simple deadtime, the second bunch sees the Simple deadtime of the first bunch, and the following bunches in a train see the deadtime of the previous two bunches.

gif eps contact: GabrielAnders

2010 @ 7TeV

Operations and Monitoring

CTP Monitoring

The busy fraction for each detector is shown along with the dead-time settings. This is the typical busy fraction observed during running (ALFA can be ignored as it was not installed).
png
An example of the timing and rate of a trigger for all BCIDs, here the XE35 trigger. The upper plot shows that it is well timed in; the lower plot illustrates that it has a similar rate for all BCIDs with paired (colliding) bunches.
png
The busy fraction as a function of time is shown for the different sub-detectors (ALFA can be ignored as it was not installed). The stable-beam periods are indicated, so it can be seen that the majority of the high busy rates did not affect data taking.
png
The rate as a function of time is shown for muon triggers. The number gives the pT cut (in GeV); MU0_COMM applies no geometrical constraint (road) and so only requires a time constraint. The rate clearly decreases as the luminosity falls during the run, and the gap in rate between the two periods of stable beams is also visible.
png

HLT Timing and Resource Monitoring

Example of resource monitoring in the HLT (Oct 2010)
Times for an arbitrary algorithm from an arbitrary run and conditions, as recorded by the HLT resource monitoring software tool, which samples detailed timing data from the instrumented HLT steering framework and records it to a special calibration data stream. The time is broken down into the CPU time needed to run the algorithm and the time to retrieve region-of-interest data over the LVL2 network. The aim of this plot is purely to illustrate the capability of the HLT to collect detailed timing information. Times are per event.

(Internal note: from run 166466, 8 Oct 2010, TrigIDScan_Tau algorithm. This is a more representative distribution of resource usage than the one of calorimeter cell clustering algorithm)


eps pdf contact: JoergStelzer
L2 total time online measurement
Online measurement of the L2 total time, which includes both the processing and the data collection times. The average time shown on the plot corresponds to the histogram with the full time range (up to 2 s). (Internal note: plot taken from the expert-monitoring root file corresponding to run number 166658.)

eps contact: Imma.Riu
Example of resource monitoring in the HLT
Times for an arbitrary algorithm (in this case LVL2 calorimeter cell clustering) from an arbitrary run and conditions, as recorded by the HLT resource monitoring software tool, which samples detailed timing data from the instrumented HLT steering framework and records it to a special calibration data stream. The time is broken down into the CPU time running the algorithm and the time to retrieve region-of-interest data over the LVL2 network. The aim of this plot is purely to illustrate the detailed timing monitoring that is possible within the HLT. Times are per event. (Internal note: from run 158116, 26 Jun 2010.)
png eps Contact: SimonGeorge

Online operations

Example of LVL2 trigger prescale changes during a run.
Here the L2_vtxbeamspot_activeTE_peb trigger (an arbitrary choice to illustrate the point) is initially disabled for the ‘warm start’ period, then enabled at luminosity block (LB) 255 when LHC stable beams are declared. As the beam intensity falls, the input rate drops by almost a factor of 4. The prescale factor is reduced, without restarting the run, in a few steps to keep the output rate between about 6 and 12 Hz. An LB is a time interval (~1 min) during which the luminosity and conditions are considered to be approximately constant, and is the smallest unit of data that can be declared good or bad for physics analysis. Prescale changes are therefore made on LB boundaries. (Internal note: run 153565.)
png eps Contact: SimonGeorge
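The prescale adjustment described above can be illustrated with a minimal sketch (hypothetical, not trigger-menu software): the output rate is the input rate divided by the prescale, and the prescale is lowered in steps as the input rate decays so that the output stays inside a target band:

```python
# Minimal sketch (hypothetical): pick an integer prescale so that
# output rate = input rate / prescale stays within a target band,
# e.g. the 6-12 Hz window mentioned above, as the input rate decays.
import math

def choose_prescale(input_rate_hz, rate_min=6.0, rate_max=12.0):
    """Smallest integer prescale keeping input_rate/prescale <= rate_max."""
    prescale = max(1, math.ceil(input_rate_hz / rate_max))
    if input_rate_hz / prescale < rate_min and prescale > 1:
        prescale -= 1       # relax if we undershot the lower edge of the band
    return prescale

for input_rate in (48.0, 36.0, 24.0, 12.0):     # input rate falling during the fill
    ps = choose_prescale(input_rate)
    print(f"input {input_rate:5.1f} Hz -> prescale {ps} -> output {input_rate/ps:.1f} Hz")
```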

Online monitored trigger rates

Output trigger rates at L1, L2 and Event Filter (EF) as a function of time for run 167607. This run had the highest peak luminosity of the 2010 pp collision running. The dip observed at ~8:00 is due to holding the trigger for about a minute while resynchronizing the LAr readout. Instantaneous luminosities at the beginning and the end of the run are indicated.
png eps pdf

The EF output rates of four main physics streams (Muons,JetTauEtmiss, Egamma and MinBias) as a function of time for the run 167607. Several step-wise jumps of the rate are caused by the change of the prescale set. The dip observed at ~8:00 is due to holding the trigger for about a minute while resynchronizing the LAr readout. Instantaneous luminosities at the beginning and the end of the run are indicated.


png eps pdf

EF output rates of the lowest threshold unprescaled triggers for various signatures as a function of time for the run 167607. Plotted rates are the rates from individual triggers. The dip observed at ~8:00 is due to holding the trigger for about a minute while resynchronizing the LAr readout. Instantaneous luminosities at the beginning and the end of the run are indicated.


png eps pdf

The figure shows the online monitored rates as a function of time for the first 7 TeV data run in ATLAS. Shown are the L1 rates, including the effect of enabling active HLT rejection for the minimum-bias chains.


png

The figure shows the online monitored rates as a function of time for a 7 TeV data run with egamma rejection enabled for the lowest EM thresholds (EM2, EM3) in ATLAS. Rates labeled with "out" are written to tape. The "L1 all" rate also includes high-rate LVL1 items for MinBias, which are further reduced by the MinBias rejection. For the HLT output only example streams are shown; therefore their sum does not add up to "EF out". Bumps and dips in "L1 out" and "EF out" correspond to times when prescale values were changed. "EF Electron out" shows the rate of events selected by e3_loose.


png

L1 trigger rates in 7 TeV data as a function of luminosity

Unprescaled L1 rates as a function of the instantaneous luminosity for electromagnetic triggers (ET thresholds of 2 GeV, 3 GeV and 5 GeV), muon triggers (no pT threshold and pT threshold of 6 GeV), a tau trigger (ET threshold of 5 GeV), a jet trigger (ET threshold of 5 GeV) and a trigger requiring a single hit in one of the minimum bias trigger scintillators mounted on each side of the experiment (MBTS_1). The MBTS_1 rate is scaled down by a factor of 20. Each dot represents a measurement in a time interval of about two minutes taken in runs with two colliding bunches (nb=2) in June 2010. While the electromagnetic, muon, tau and jet trigger rates show a nicely linear behavior, the MBTS rate saturates as it approaches two times the LHC revolution frequency (nb*fLHC~22 kHz) due to pile-up. L1RateAll.png
png, eps pdf
Unprescaled L1 rates as a function of the instantaneous luminosity for electromagnetic triggers (ET thresholds of 2 GeV, 3 GeV and 5 GeV), muon triggers (no pT threshold and pT threshold of 6 GeV), a tau trigger (ET threshold of 5 GeV), a jet trigger (ET threshold of 5 GeV) and a trigger requiring a single hit in one of the minimum bias trigger scintillators mounted on each side of the experiment (MBTS_1). The MBTS_1 rate is scaled down by a factor of 20. Each dot represents a measurement in a time interval of about two minutes taken in runs with two colliding bunches (nb=2) in June 2010. While the electromagnetic, muon, tau and jet trigger rates show a nicely linear behavior, the MBTS rate saturates as it approaches two times the LHC revolution frequency (nb*fLHC~22 kHz) due to pile-up. L1RateAllLog.png
png, eps, pdf
Unprescaled L1 rates of the electromagnetic trigger as a function of the instantaneous luminosity as measured from April 23 to June 30, 2010 for ET thresholds of 2 GeV, 3 GeV and 5 GeV. Each dot represents a measurement in a time interval of about two minutes. The results of linear fits are shown for illustration. The L1 rate of the electromagnetic trigger is largely dominated by QCD background. L1RateEM.png
png, eps, pdf
Unprescaled L1 rates as a function of the instantaneous luminosity for a trigger requiring a single hit in one of the minimum bias trigger scintillators mounted on each side of the experiment (MBTS_1). Each dot represents a measurement in a time interval of about two minutes in runs taken in June and July 2010 with two (nb=2, red) and four (nb=4, blue) colliding bunches, respectively. The measurement of the instantaneous luminosity is already corrected for pile-up effects. As expected, the MBTS rates saturate due to pile-up as they approach two and four times the LHC revolution frequency, i.e. nb*fLHC~22 kHz and nb*fLHC~44 kHz, respectively. L1RateMBTS.png
png, eps, pdf

Unprescaled L1 rates (red, blue) as a function of the instantaneous luminosity for a trigger requiring a single hit in one of the minimum bias trigger scintillators mounted on each side of the experiment (MBTS_1). Each dot represents a measurement in a time interval of about two minutes in runs taken in June and July 2010 with two (nb=2, red) and four (nb=4, blue) colliding bunches, respectively. The measurement of the instantaneous luminosity is already corrected for pile-up effects. As expected, the MBTS rates saturate due to pile-up as they approach two and four times the LHC revolution frequency, i.e. nb*fLHC~22 kHz and nb*fLHC~44 kHz, respectively. In addition, the rate of interactions (IA, grey) after unfolding the pile-up contribution is shown, featuring a nicely linear behavior. This demonstrates the stability of the MBTS trigger with respect to the luminosity detector LUCID.
L1RateMBTScor.png
png, eps, pdf, Contact: Johannes.Haller
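For illustration only, a minimal numerical sketch of this saturation and unfolding, assuming Poisson pile-up statistics and a simple zero-counting unfolding; the revolution frequency and visible cross section below are assumed values for the sketch, not the calibration used for the plot above.
<verbatim>
import numpy as np

F_LHC = 11.245e3      # LHC revolution frequency in Hz (assumed value for this sketch)
SIGMA_VIS = 50e-27    # assumed visible cross section in cm^2, for illustration only

def mbts_raw_rate(lumi, n_b):
    """A bunch crossing can fire MBTS_1 at most once, so the raw rate
    saturates at n_b * F_LHC once the mean number of interactions mu is large."""
    mu = SIGMA_VIS * lumi / (n_b * F_LHC)   # mean interactions per bunch crossing
    return n_b * F_LHC * (1.0 - np.exp(-mu))

def unfolded_interaction_rate(raw_rate, n_b):
    """Zero-counting unfolding: recover mu from the saturated raw rate;
    the resulting interaction rate is again linear in the luminosity."""
    mu = -np.log(1.0 - raw_rate / (n_b * F_LHC))
    return n_b * F_LHC * mu

for lumi in (1e29, 5e29, 1e30):             # instantaneous luminosity in cm^-2 s^-1
    raw = mbts_raw_rate(lumi, n_b=2)
    print(lumi, round(raw), round(unfolded_interaction_rate(raw, n_b=2)))
</verbatim>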
L1 muon trigger rates as a function of luminosity
Measured L1 trigger rates as a function of the instantaneous luminosity for different L1 thresholds. The rates are fitted with a function that accounts for both the luminosity-dependent and the filling-scheme-dependent parts of the trigger rate. The steps are due to changes of the filling scheme, which modify the contribution of cosmic radiation to the trigger rate.

jpg pdf
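A minimal sketch of a fit of this form, assuming a single luminosity slope shared by all points plus one constant offset per filling scheme to absorb the cosmic-ray contribution; all numbers are invented and the actual fit function used for the plot may differ.
<verbatim>
import numpy as np

# Toy data: a common luminosity slope plus a per-filling-scheme offset (invented values).
lumi   = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])   # arbitrary luminosity units
scheme = np.array([0,   0,   0,   1,   1,   1  ])   # filling-scheme index per point
rate   = np.array([12., 17., 22., 30., 35., 40.])   # measured trigger rate in Hz

# Design matrix: column 0 is the luminosity term, one further column per scheme offset.
n_schemes = scheme.max() + 1
A = np.zeros((len(lumi), 1 + n_schemes))
A[:, 0] = lumi
A[np.arange(len(lumi)), 1 + scheme] = 1.0

coeff, *_ = np.linalg.lstsq(A, rate, rcond=None)
print("luminosity slope:", coeff[0], "per-scheme offsets:", coeff[1:])
</verbatim>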
Predicted L1 muon trigger rates as a function of luminosity
Trigger-rate evolution as a function of the instantaneous luminosity for L1 and for different Event Filter (EF) triggers. Shown are single-muon triggers at 0, 4 (both muon-spectrometer-only, "MS", and combined), 6 and 10 GeV, and the di-muon trigger at 4 GeV. The predictions are based on rates measured with data.

jpg pdf

Rate and luminosity evolution during run 167776

Rate for the 2MUL1_l2j30_HV trigger as a function of time for run 167776 (see ATL-PHYSPUB-2009-082 for details). jpg
Rate for the l2j30_Trackless_HV trigger as a function of time for run 167776 (see ATL-PHYSPUB-2009-082 for details). jpg
Rate for the j35_L1TAU_HV_jetNoEF trigger as a function of time for run 167776 (see ATL-PHYSPUB-2009-082 for details).
jpg
Instantaneous luminosity for run 167776.
jpg

Central Trigger

LHC bunch timing measured in ATLAS BPTX

Bunch timing measured by the ATLAS beam pick-up detectors on March 30, 2010
The difference in the arrival time of bunch number 1 between the two beams is shown in green. This time difference is constant at around 40 ps +/- 20 ps (RMS). The individual bunch timing, the time of arrival of bunch number 1 with respect to the LHC 40 MHz clock, is shown in blue and red for beam 1 and beam 2, respectively. An expected shift of -200 ps is observed during the acceleration period from 0.45 to 3.5 TeV.

jpg png eps pdf
Beam phase during spring/summer 2010
The time difference between the LHC bunch arrival time and the LHC RF clock during spring/summer 2010. The bunch arrival times are measured by the beam timing pick-ups (BPTX) for beam 1 and beam 2. The phase includes an arbitrary offset to tune the phase to zero for a particular reference point. A fine delay of 2 ns was applied to the clock on 25 June and 6 July 2010 in order to shift the beam phase closer to zero. Since July 2010 the beam phase has been kept constant within +/- 500 ps.

png
LHC clock drift during spring/summer 2010
The time difference between the LHC bunch arrival time and the LHC RF clock. The bunch arrival times are measured by the beam timing pick-ups (BPTX) for beam 1 and beam 2. The phase includes an arbitrary offset to tune the phase to zero for a particular reference point. The beam phase shown in this graph excludes a clock fine delay that was applied in hardware in order to correct for long-term time drifts. The open circles show a direct propagation-delay measurement of an optical signal through spare clock fibers that cover a fraction of the full 13.3 km long clock fiber distance between the RF (Point 4) and Point 1. An appropriate scale factor and shift have been applied to match the curve with the beam phase measurements. The good agreement within 300 ps indicates that the long-term drift of the beam phase is due to a drift of the propagation delay in the optical fiber.

png

BPTX oscilloscope pictures

Scope traces of the beam pickup system (BPTX) from 23 November 2009, just before the phase adjustment by the LHC RF team. The beam 1 signal is shown in blue/turquoise, beam 2 in red. The green signal is the 25-ns clock. The phase difference between the two beams in ATLAS is given by the peak-to-peak time difference of 900 ps. This corresponds to a vertex shift in z of 13.5 cm.
png
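As a one-line cross-check of the quoted numbers (the two beams travel in opposite directions, so a relative timing offset \(\Delta t\) shifts the crossing point by half of the corresponding flight distance): \(\Delta z = \tfrac{1}{2}\, c\, \Delta t = \tfrac{1}{2} \times 29.98~\mathrm{cm/ns} \times 0.9~\mathrm{ns} \approx 13.5~\mathrm{cm}\), consistent with the vertex shift quoted above.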
Scope traces of the beam pickup system (BPTX) from 23 November 2009, just after the phase adjustment by the LHC RF team. Again the beam 1 signal is shown in blue/turquoise, beam 2 in red. The green signal is the 25-ns clock. The effect of the cogging is clearly visible: the peaks now lie on top of each other, and the beams cross at the center of ATLAS. The mean track z0 position was consistently found to have moved in the z direction as well, by 12 cm, as expected (see here).
png

2009 Data

Data taking efficiency

Data taking efficiency December 2009
The data-taking efficiency, defined as the ratio of the running time during beam time to the beam time, is shown. The running time incorporates the dead-time fraction during each luminosity block as reported by the Central Trigger Processor. The beam time is defined by the presence of two circulating beams, independent of their stability condition. Each data point corresponds to an average efficiency calculated over a period of 1 hour; a negative value indicates a period with no beam. Reasons for lower efficiency include stopping the run during beam time to work on a subsystem, unexpected arrival of the beam, and trigger holds caused by a sub-system briefly asserting busy.

png pdf
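For illustration, a toy calculation of this efficiency definition, assuming the dead-time-corrected live time is accumulated per luminosity block and divided by the beam time; the block durations and dead-time fractions below are invented.
<verbatim>
# Toy data-taking-efficiency calculation; all numbers are invented for illustration.
luminosity_blocks = [
    # (duration in seconds, dead-time fraction reported by the Central Trigger Processor)
    (60.0, 0.02),
    (60.0, 0.05),
    (60.0, 0.00),
]
beam_time = 240.0  # seconds with two circulating beams in this toy interval

live_time = sum(duration * (1.0 - dead_time) for duration, dead_time in luminosity_blocks)
efficiency = live_time / beam_time
print(f"data-taking efficiency: {efficiency:.1%}")  # efficiency for the invented numbers above
</verbatim>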

Trigger Rates

MBTS and total rate
The figure shows the online-monitored L1 rates as a function of time for the three primary MBTS triggers used during collisions: MBTS_1 (black circles), MBTS_2 (red circles) and MBTS_1_1 (green triangles). There are 16 scintillators on each side, covering a pseudorapidity interval of 2.09 < |eta| < 3.84. MBTS_1 requires one hit in any of the 2x16 scintillators, MBTS_2 requires at least 2 hits, and MBTS_1_1 requires at least 1 hit on each side. The total EF rate to disk storage is also shown. The trigger rates are observed to correlate with the LHC beam intensities recorded during the same period.

png
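For clarity, a toy rendering of the three trigger conditions just described, taking as input the number of hit scintillators on the A and C sides; the function name and interface are invented for this sketch.
<verbatim>
def mbts_decisions(hits_side_a, hits_side_c):
    """Toy L1 decision logic for the three MBTS items described above,
    given the number of hit scintillators on each side (16 counters per side)."""
    total = hits_side_a + hits_side_c
    return {
        "MBTS_1":   total >= 1,                              # any hit on either side
        "MBTS_2":   total >= 2,                              # at least two hits in total
        "MBTS_1_1": hits_side_a >= 1 and hits_side_c >= 1,   # coincidence of both sides
    }

print(mbts_decisions(1, 0))   # fires MBTS_1 only
print(mbts_decisions(2, 0))   # fires MBTS_1 and MBTS_2
print(mbts_decisions(1, 1))   # fires all three items
</verbatim>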
L1 and HLT trigger rates for a typical run with the stable-beam flag. Also shown is a collision trigger at L1, which requires hits on both the A and C sides of the minimum bias scintillator counters and filled bunches for both beams. The line labelled L2 Inner Detector activity represents a filtering algorithm at the L2 trigger, which accepts events based on space-point counts in the Inner Detector. This L2 algorithm receives 5% of all filled bunches as input from L1. Assuming both the L1 collision trigger and the space-point counting are highly efficient for collision events, the difference between the two lines should reflect this fraction, even though the acceptance of the two triggers is different. The moment the L2 algorithm is enabled is clearly visible as a jump in the L1 output rate and the start of a non-zero rate on the L2 line. The dips in the HLT and L1 output rates just before this moment are due to the short pause needed to change the trigger setup. The HLT output rate (which represents the rate of events recorded to disk) does not visibly change, as it is dominated by a constant rate of monitoring triggers. The slide-jpg file contains some explanations added to the picture.
png pdf slide-jpg slide-pptx
L1 rates for run 141811
The figure shows the average online-monitored L1 trigger rates for all triggers that had an average rate in excess of 0.1 Hz, demonstrating that a number of triggers fire at L1 with some rate, exercising the trigger system. Note that the absolute values are as monitored online; they have not been corrected for beam setup or instantaneous luminosity and will therefore differ from run to run.

png pdf
L2 rates for run 141811
The figure shows the average online-monitored L2 trigger rates for all triggers that had an average rate in excess of 0.1 Hz, demonstrating that a number of triggers fire at the high level trigger with some rate, exercising the trigger system. Note that the absolute values are as monitored online; they have not been corrected for beam setup or instantaneous luminosity and will therefore differ from run to run.

png pdf


Major updates:
-- JoergStelzer - 04-Jun-2011
-- DavideGerbaudo - 09-Jan-2013

Responsible: JoergStelzer
Subject: public
