Particle-Flow Workshop and Vertical Integration Meeting

News box
This workshop and vertical integration meeting will take place at CERN, October 23-24, 2008
The description below is being discussed/amended with the DPGs, POGs and PAGs
The preliminary agenda of the meeting can be found here. (Click on "Timetable".)


Description of the work expected to be done (starting now) and presented at the meeting.

Introduction

This conference, the precise dates and content of which are still being discussed with Physics Coordination, will host two major events:

  • a particle-flow workshop, with the results of all recent developments toward completion and data-taking readiness of the particle-flow algorithm;
  • a particle-flow vertical integration meeting, where the integration of the particle-flow tools in DPG, POG and PAG work will be demonstrated.

The goal of the conference is threefold:

  1. Deliver a complete, working and optimized particle-flow reconstruction algorithm to the collaboration, with a complete event description available for the PAT analyses;
  2. Deliver a complete and unique event description to the PAT, and get analyses started in all PAGs with the particle flow, possibly with first distributions from real data, should the LHC schedule allow it;
  3. Deliver a first draft of a CMS Note with a complete description of the particle-flow algorithm, of its performance, and its interaction with real data.

Details of how these three goals can be reached can be browsed from the timetable of the conference, or by reading what is written below. Each of the items should be handled by a dedicated working group, for which a coordinator will be nominated. The names of some of the people working are indicated below on a preliminary basis, and will be updated (together with the names of the coordinators) once the organization is in place.

Particle-Flow Workshop

Electron reconstruction

The electron reconstruction in jets may benefit from a tracker-based seeding, complementing the ECAL-based seeding already used for isolated electrons.

This electron reconstruction requires

  1. Seeding with KF tracks;
  2. Pre-identification based on tracker (mostly) and ECAL quantities;
  3. GSF trajectory building and fit of the pre-id'd tracks;
  4. Bremsstrahlung photon recovery;
  5. Final identification decision;
  6. Inclusion as Particle-Flow candidates, and interface with reco::Electrons and pat::Electrons

The first three steps are already essentially finalized, and the fourth is well under way. The goal, for the workshop, is to complete the programme and deliver electrons for the analyses (see Friday, PAG vertical integration). Note that point 5) also requires the work described in the next session (electron/muon/pion identification) to be completed.

People working: Daniele Benedetti, Michele Pioppi, Florian Beaudette, other people welcome

Electron/muon/pion identification

The use of all subdetectors in the global event reconstruction with the particle-flow algorithms provides a number of handles to identify the various species of charged particles, i.e., electrons (compatibility between ECAL energy and tracker momentum, energy loss along the track, number of recovered Bremsstrahlung photons, cluster shapes, preshower energy deposit, HCAL energy deposit), muons (ECAL and HCAL energy deposits, link of the tracker track to hits and segments in the muon system compatible in number with geometrical expectation, ...) and pions (compatibility between ECAL+HCAL energy and track momentum, absence of Bremsstrahlung photons, absence of hits and segments in the muon system, ...).

These many handles ought to be combined in a multi-variate analysis for an optimal discrimination between electrons and muons, muons and pions, pions and electrons. The output(s) of this multivariate analysis are to be propagated to the particle-flow candidate data formats, for later use in the physics analyses, but are primarily to be used for the charged particle-flow candidate identification and energy determination.

Handles to train (or calibrate) this multivariate analysis directly on data would be welcome (using the tag-and-probe method?). Systematic studies of efficiency and fake rates are expected too.
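As an illustration of how several such handles could be combined, here is a minimal naive-Bayes-style likelihood combination. The choice of the two input variables, the Gaussian shapes and all numerical values are invented for illustration only; they are not CMS quantities, and the actual multivariate tool remains to be chosen.

```python
import math

def gaussian(x, mu, sigma):
    """Normal density, used here as a toy per-variable likelihood."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Toy per-species (mean, sigma) expectations for two illustrative handles:
# E_ECAL/p (calorimeter energy over track momentum) and n_brem (number of
# recovered Bremsstrahlung photons).  All numbers are invented placeholders.
HYPOTHESES = {
    "electron": {"e_over_p": (1.00, 0.10), "n_brem": (2.0, 1.5)},
    "muon":     {"e_over_p": (0.05, 0.05), "n_brem": (0.0, 0.5)},
    "pion":     {"e_over_p": (0.70, 0.30), "n_brem": (0.0, 0.5)},
}

def identify(e_over_p, n_brem):
    """Return normalized posterior probabilities (flat priors) per species."""
    observed = {"e_over_p": e_over_p, "n_brem": n_brem}
    likelihoods = {}
    for species, pdfs in HYPOTHESES.items():
        product = 1.0
        for name, (mu, sigma) in pdfs.items():
            product *= gaussian(observed[name], mu, sigma)
        likelihoods[species] = product
    total = sum(likelihoods.values())
    return {s: l / total for s, l in likelihoods.items()}

probs = identify(e_over_p=0.98, n_brem=2)
best = max(probs, key=probs.get)
```

In a real training, the per-variable distributions would be taken from simulation or, as suggested above, calibrated on data with a tag-and-probe selection.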

People working: Needed !

Photon/Neutral Hadron/Charged hadron identification

The use of all subdetectors in the global event reconstruction with the particle-flow algorithms provides a number of handles to identify the origin of the various neutral particles, i.e., photons and neutral hadrons. Isolated unconverted photons and isolated non-interacting neutral hadrons are relatively easy to identify, the former being identified as an ECAL cluster, and the latter as a pair of topologically linked ECAL and HCAL clusters.

The difficulty arises when these clusters get merged together and, worse, when they get merged with the energy deposits of one or several charged hadrons. The handles available to discriminate between a merged photon, a merged neutral hadron and no neutral particle at all are as follows:

  1. momentum of the charged particles;
  2. ECAL energy;
  3. HCAL energy;
  4. distance of the ECAL and HCAL clusters to the track extrapolation(s);
  5. ECAL cluster shapes;
  6. HCAL cluster shapes.
(The latter concept of "cluster shape" is to be determined more accurately.)
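Before any cluster-shape input, the first three handles above (comparing the linked calorimetric energy with the charged-particle momenta) amount to a significance test for an energy excess. A minimal sketch follows; the sqrt(E) resolution model, its coefficient and the 3-sigma threshold are toy assumptions, not CMS numbers.

```python
import math

def neutral_excess(e_ecal, e_hcal, track_momenta, stochastic=1.0, n_sigma=3.0):
    """
    Compare the calibrated ECAL+HCAL energy linked to a set of tracks with
    the summed track momenta.  A significant positive excess hints at a
    merged neutral particle (photon or neutral hadron); the returned value
    is the candidate neutral energy, or 0 if the excess is not significant.
    """
    e_calo = e_ecal + e_hcal
    p_tracks = sum(track_momenta)
    sigma = stochastic * math.sqrt(max(e_calo, 1.0))  # toy calorimeter resolution
    excess = e_calo - p_tracks
    return excess if excess > n_sigma * sigma else 0.0
```

The multivariate analysis discussed below would then decide whether such an excess is better attributed to a photon or to a neutral hadron.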

These many handles ought to be combined in a multi-variate analysis for an optimal discrimination. The output(s) of this multivariate analysis are to be propagated to the particle-flow candidate data formats, for later use in the physics analyses, but are primarily to be used for the particle-flow candidate identification and energy determination.

Handles to train (or calibrate) this multivariate analysis directly on data would be welcome. Systematic studies of efficiency and fake rates are expected too.

People working: Needed !

Pileup particles vs signal particles

The identification of pileup particles, as opposed to signal particles, is important for jet energy and angular resolution, as well as for the determination of particle (e, mu, tau, gamma) isolation.

Identifying charged particles from the signal interaction and from the different in-time pileup interactions can be done on the basis of their origin primary vertex. At this level, all the neutral particles could be assigned the signal primary vertex as origin, so as to improve their angular resolution.

A refined discrimination can be done with the ECAL cluster timing, claimed to be determined with a 0.1 ns accuracy, i.e., 10 times better than the entire beam spot length (30 cm = 1 ns), at least in the endcaps. This timing could also be used to improve the link between tracks and ECAL clusters, currently based on space coordinates only. Clusters in the HCAL, on the other hand, can only be assigned to a given interaction through their topological links to ECAL clusters and tracks.

A multivariate analysis, including the origin vertex (for tracks), timing (for ECAL clusters), pointing (for ECAL photon/electron clusters, with the help of the preshower), topological links (for all), and vicinity in space (for PF candidates), could improve the effectiveness of this assignment. Everything remains to be done in this challenging field, but the time between now and the workshop is deemed adequate to obtain a sensible result. A strategy to calibrate the timing on data is also requested.
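The vertex and timing handles can be sketched as follows. The nearest-vertex dz criterion and the 30 cm = 1 ns conversion follow the text; the function names, the flight-time convention (a cluster in the +z endcap arrives earlier for a vertex at positive z) and the 3-sigma window are illustrative assumptions of this sketch.

```python
C_CM_PER_NS = 30.0  # light travels 30 cm per ns, as quoted above

def assign_track_to_vertex(track_z, vertex_zs):
    """Assign a charged particle to the closest reconstructed vertex in z."""
    return min(range(len(vertex_zs)), key=lambda i: abs(track_z - vertex_zs[i]))

def cluster_time_compatible(t_cluster_ns, vertex_z_cm, t_from_origin_ns,
                            sigma_t_ns=0.1, n_sigma=3.0):
    """
    Check whether an ECAL cluster time (here, for a cluster in the +z endcap)
    is compatible with a vertex displaced by vertex_z_cm along the beam line.
    t_from_origin_ns is the expected flight time from z = 0.
    """
    expected = t_from_origin_ns - vertex_z_cm / C_CM_PER_NS
    return abs(t_cluster_ns - expected) <= n_sigma * sigma_t_ns
```

A full treatment would feed both quantities, together with pointing and topological links, into the multivariate analysis mentioned above.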

People working: Colin Bernet + student + ECAL DPG, others more than welcome.

Jet reconstruction

Jet reconstruction in particle flow crucially depends on how individual particles are reconstructed, in terms of fake rate and efficiency. The single-particle response calibration, with energy and angular resolution maps (presented earlier in the workshop), is an essential ingredient for this purpose. The effectiveness of the discrimination between a photon and a neutral hadron (or nothing) merged in a charged cluster is another.

With these tools in hand, and with a specific tool to associate simulated particles with the energy fraction in each calorimeter cell being developed as well, systematic studies to understand efficiency and fake rate (for photons, charged hadrons and neutral hadrons) in busier and busier environments are needed, e.g., in order of complexity:

  1. Jets formed of two charged pions, getting closer and closer
  2. Jets formed with a simple combination of one or two charged particle(s), a pi0 and a K0L, with a larger and larger boost
  3. Jets originating from a Pythia di-jet gun
  4. Regular QCD di-jets

From the increased understanding obtained from these tools and studies, the particle-flow algorithm could be refined, towards an optimal energy and angular resolution, from the lowest to the highest pT's.

People working: Colin Bernet, Patrick Janot, Alexandre Zabi, Curtis, others ?

MET Reconstruction

By incorporating tracks and calorimeter clusters, redundant information is used by the Particle-Flow algorithm to improve the response and resolution of the reconstructed energy for individual particles. However, because the missing energy of an event is a globally determined quantity which requires precise symmetric cancellations, any under-counting or over-counting of energy in the event will lead to fake missing energy. Hence, highly efficient particle identification with a very low fake rate is required. For all these reasons, it is important to study different physics processes (with/without intrinsic MET) and to systematically list as many effects as possible which lead to fake missing energy.

All of the work performed to improve the Particle Flow algorithm in the context of Jets will also directly improve the Missing Energy determined from Particle Flow. Additional work which needs to be studied in the context of MET includes:

  1. back-to-back particle guns with increasing multiplicity: pi's, K0L's, e+-, photons,...
  2. physics events with no intrinsic MET, such as QCD dijets
  3. physics events with intrinsic MET, such as Z(nunu)+jet(s)
  4. physics events with a high density of particles (ttbar, SUSY, etc)
  5. Scan badly reco'd events (i.e. events which are in the tails), classifying (i) fake particle reco/ID (using sim to reco particle matching); and (ii) inefficient particle reco/ID (using sim to reco particle matching)
  6. PF-MET as an uncertainty weighted average energy from each input particle
  7. effect/need/optimization of PFClustering in HF and use of long and short fibres for PID
  8. effect of different luminosities (PU conditions), and w/wo PU particle removal
  9. effect of beam halo backgrounds and cosmics on PF-MET
  10. corrections to PF-MET from residual corrections to PFJets
  11. comparison of Missing HT (MET from PFJets and isolated leptons) with Global MET

Data driven performance measurements which need to be performed:

  1. average(MET) and sigma(MET_x) as a function of average(SET) for QCD dijets
  2. sigma(MET) as a function of dimuon momentum system for Z(mumu) + Jets
  3. MET Jacobian peak in W events (single lepton tagged)
  4. MET-phi symmetry in QCD, W/Z, top events

Monte-Carlo-based performance measurements which need to be done (in addition to the above) and compared with Calorimeter-only MET:

  1. sigma(SET) as a function of true SET in event (QCD events)
  2. sigma(MET) as a function of true MET in event (W and Top events)

People involved: Patrice Verdier, Rick Cavanaugh. Volunteers welcome! Needs more people.

Simulations (Fast vs Full vs Data)

Fast simulation is an essential tool to develop particle-flow algorithms and to test their performance. It therefore must be versatile enough to deal with the detailed information requested by the particle flow reconstruction (like nuclear interaction seeding, photon conversion reconstruction, and secondary vertex fitting). It also must be tuned to the full simulation (now), the test beam data (now) and the real data (soon).

Areas of particular interest are as listed below:

  • Iterative tracking
  • Nuclear interaction, V0 and photon conversion reconstruction
  • Shower development parameterization
  • Muon reconstruction
  • Electron reconstruction
where tuned performance and strategy with data ought to be presented.

People working: Patrick Janot, Florian Beaudette, Calorimetry Task Force, Marcella Bona, Martijn Mulders, Other people welcome.

Physics Analysis Toolkit

The list of particles reconstructed by the particle flow is made available to the analyses through the PAT (Physics Analysis Toolkit).

The edm::Event contains several collections of physics objects, like electrons, jets, or taus. These collections are reconstructed independently, and can overlap. For example, an isolated electron will very often be reconstructed as a jet as well. The essential goal of the PAT is to provide the analyst with a clean global view of the event, with no double counting of the energy between the different physics objects.

In standard reconstruction, the cleaning of two given collections is realized by identifying the overlapping objects a posteriori using a delta R matching, and by keeping only one of them (REF TO THE PAT DOCUMENTATION, CONFIRM THE ALGORITHM).
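A minimal sketch of this a-posteriori delta-R cross-cleaning between two collections follows; the dictionary-based objects and the 0.3 cone size are illustrative choices of this sketch, not the actual PAT configuration.

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance, with the phi difference wrapped into [-pi, pi)."""
    dphi = (phi1 - phi2 + math.pi) % (2.0 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)

def clean_jets(jets, electrons, dr_max=0.3):
    """Drop any jet overlapping an electron within dr_max (a-posteriori cleaning)."""
    return [j for j in jets
            if all(delta_r(j["eta"], j["phi"], e["eta"], e["phi"]) >= dr_max
                   for e in electrons)]
```

The point of the particle-flow approach described next is precisely to avoid the need for such after-the-fact matching.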

Particle-flow reconstruction offers the possibility to reconstruct a set of non-overlapping objects directly, starting from the list of reconstructed particles produced by the particle-flow algorithm. This task is realized by PF2PAT, the software layer connecting the particle flow to the PAT. PF2PAT provides the analyst with:

  • an interface to particle-flow-based
    • MET reconstruction,
    • jet reconstruction,
    • tau tagging,
    • b tagging (dummy for now)
  • various particle-flow-based algorithms:
    • isolation
    • pile-up particle tagging (dummy for now)
  • selectors
    • by type
    • by pT
    • ...

The output of PF2PAT is a set of non-overlapping collections of physics objects, which are associated to external information using the standard PAT procedures. These objects are then converted to pat::Objects, which embed the external association and which are the input to the analysis.

Our goal is to provide a version of the PAT integrating the particle flow in July, in order to leave people enough time to make use of this tool and develop a particle-flow-based analysis in time for this conference. This first version will be based on a basic PF2PAT, with:

  • rough MET reconstruction
  • rough particle-based isolation,
  • no b-tagging

People working: Colin Bernet, Giovanni Petrucciani. Other people welcome.

Vertical Integration Meeting

HCAL DPG Integration

The particle-flow algorithm relies on the calibration of ECAL+HCAL clusters originating from single hadrons. This calibration is part of the programme of an End-to-End calibration working group set up by the spokesperson, and chaired by Michel Della Negra.

The outcomes of this working group are as follows:

  1. Use of test beam data (and possibly of cosmic run data) for calibration
  2. Tune of the simulation (fast and full) to the test beam data
  3. Design of strategy for an in-situ calibration on the data, with isolated charged particles
with specific deliverables for the particle-flow algorithm
  • Calibration maps: E = a(E,eta) + b(E,eta) E_Ecal + c(E,eta) E_HCAL
  • Energy resolution maps (same as above for sigma_E)
  • Angular resolution maps (same as above for sigma_phi and sigma_eta)
The propagation tool also ought to be revisited, so as to check that it optimally links hadron tracks, ECAL clusters and HCAL clusters (efficiency vs purity).
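The calibration-map deliverable above could be applied along the following lines; the (eta, energy) binning and all coefficient values are invented placeholders of this sketch, not the CMS calibration.

```python
# Toy (eta region, energy band) -> (a, b, c) coefficients for the map
# E = a(E, eta) + b(E, eta) * E_ECAL + c(E, eta) * E_HCAL.
# All numbers below are invented placeholders, not the CMS calibration.
CALIB = {
    ("barrel", "low"):  (0.2, 1.05, 1.15),
    ("barrel", "high"): (0.0, 1.02, 1.08),
    ("endcap", "low"):  (0.3, 1.10, 1.20),
    ("endcap", "high"): (0.1, 1.05, 1.10),
}

def calibrated_energy(e_ecal, e_hcal, eta):
    """Apply the binned linear calibration to a linked ECAL+HCAL cluster pair."""
    region = "barrel" if abs(eta) < 1.5 else "endcap"
    band = "low" if (e_ecal + e_hcal) < 10.0 else "high"
    a, b, c = CALIB[(region, band)]
    return a + b * e_ecal + c * e_hcal
```

In practice the coefficients depend continuously on energy and eta and would be looked up by interpolation; the coarse two-by-two binning here only illustrates the structure of the map.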

People working: Michel Della Negra, Jamie Ballin, more people needed

Tracker DPG Integration

After the advent and integration of the iterative tracking, which resulted in both a larger tracking efficiency and a smaller fake rate, together with a faster computing time, this strategy ought to be extended to more "exotic" tracks and seedings:

  1. Photon conversion reconstruction
  2. V0 reconstruction
  3. Nuclear interaction reconstruction

Until now, only the first aspect has been started, with an ECAL-based seeding. (See the e/gamma POG integration section.) The three kinds of tracks could largely benefit from new seeding strategies (in addition to the ECAL-based seeding aimed at energetic or isolated photons), applied as successive steps of the iterative tracking so as to benefit from the prior hit cleaning, e.g.,

  1. Out-In seeding
  2. TOB-TEC seeding
  3. Road-search seeding
  4. Electron-brem seeding

towards full integration in the tracking code.

The deliverables of this work are additional pairs (or groups) of tracks, with as low a fake rate as possible while preserving a high efficiency, together with the corresponding secondary vertices, for converted photons (either primary or coming from electron Bremsstrahlung), long-lived particles (K0s, Lambda, or exotic particles), and the products of hadron nuclear interactions.

In parallel with the above, a complete optimization of the first three steps of the iterative tracking should be engaged, including efficiency and purity optimization from the seeding, but also a primary-vertex refitting after the first step, to benefit from the additional tracks found at this level without a vertex constraint.

The effect of the presence of pileup events will also have to be studied, in terms of fake rate/efficiency, for each of the three steps.

People working: Michele Pioppi, Vincent Roberfroid, more people needed.

ECAL DPG Integration

In addition to its role for the hadron energy measurement, the ECAL is primarily designed to measure the energy of electrons and photons. While isolated/energetic electrons and photons are of primary importance for triggering and for new physics signature, a complete event description requires the energy determination of all electrons and photons, independently of their energy, pseudo-rapidity and isolation. Obvious benefits are expected for jets, MET and soft-electron b tagging.

Particle-flow clusters must therefore be calibrated for electrons and photons in the same way as they are calibrated for hadrons. The outcomes of the planned work would be as follows:

  1. Use of ECAL test beam data (and possibly of cosmic run data) for calibration
  2. Tune of the simulation (fast and full) to the test beam data
  3. Design of strategy for an in-situ calibration on the data
with specific deliverables for the particle-flow algorithm
  • Calibration maps: E = f(E_Ecal, eta, phi), with specific corrections in the ECAL gaps and cracks, to which electrons and photons, unlike hadrons, are sensitive;
  • Energy resolution maps (same as above for sigma_E)
  • Angular resolution maps (same as above for sigma_phi and sigma_eta)
The propagation tool also ought to be revisited, so as to check that it optimally links electron tracks, ECAL clusters and HCAL clusters (efficiency vs purity).

People working: Florian Beaudette, Jonathan Biteau, more people needed

JetMET POG Integration

The JetMET POG has two important responsibilities, namely jet reconstruction and MET reconstruction. The particle-flow reconstruction delivers a global description of the event, with all the reconstructed particles (momentum and origin vertex) and their identification. The energy is best determined once the identification is performed.

Jets:

Jets can then be made out of this list of reconstructed particles. While the energy scale and the resolution are already expected to be very good at this level, the residual pT and eta dependence ought to be calibrated away for an optimal jet response. This calibration can be done in several ways:

  1. A-la-CaloJet MC-based correction (independent of the jet content)
  2. Jet-content-dependent MC-based correction (with dependence on, e.g., the charged energy fraction, the electromagnetic energy fraction, the charged multiplicity)
  3. Data-driven correction with gamma+jets, Z+jets
The expected deliverables for the particle-flow jets would therefore be a complete set of corrections for 1), with one or two specific jet-clustering algorithms; a proof-of-principle study for 2), showing the gains with jet-content-dependent corrections; and an application of the gamma+jet correction to particle-flow jets, with either type of approach.

MET

Similarly, the vectorial MET can be determined from a direct sum of all particles' px and py, and is expected to already give competitive performance. However, more handles can be used here too (as is done for the caloMET): jet-energy-corrected MET, pile-up particle identification, em/had energy identification in HF, etc.
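The direct sum described above amounts to taking minus the vector sum of all reconstructed particle transverse momenta. A minimal sketch follows; the (pt, phi) tuple format is an illustrative choice of this sketch, not a CMS data format.

```python
import math

def pf_met(particles):
    """
    Missing transverse energy as minus the vector sum of all reconstructed
    particle transverse momenta.  Each particle is a (pt, phi) tuple.
    Returns (MET magnitude, MET phi).
    """
    mex = -sum(pt * math.cos(phi) for pt, phi in particles)
    mey = -sum(pt * math.sin(phi) for pt, phi in particles)
    return math.hypot(mex, mey), math.atan2(mey, mex)
```

Any mis-identified or missed particle enters this sum directly, which is why the identification efficiency and fake-rate studies listed above matter so much for the MET.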

It is expected that the performance of the various MET determination algorithms be studied, and optimized, and that a data-driven control strategy with, e.g., Z+jets (removing the Z decay products from the event) be presented.

People working: For jets: Arno Heister (?), more people needed. For MET: Patrice Verdier (?), more people needed.

b-tagging POG Integration

The determination of jet axes with particle-flow jets is expected to be much more accurate than with calo-jets (or track-jets). In addition, the optimal use of the iterative tracking strategy is such that the tracks in particle-flow jets are collected with maximal efficiency and minimal fake rate. Particle-flow jets are therefore expected to be best suited for lifetime-based tags.

Similarly, the optimal use of all CMS sub-detectors will make it possible (see the workshop presentations) to reconstruct electrons and muons even when not isolated, and to maximally separate electrons, muons and charged hadrons in jets. Also, the better determination of the jet axis should improve the discriminating power of the lepton pT with respect to the jet axis. Particle-flow jets are therefore expected to be best suited for soft-lepton tags too.

This vertical integration meeting is therefore the first opportunity to check and optimize the b-tagging algorithm performance with particle-flow jets, whether they are lifetime or soft-lepton based.

People working: Alexander Schmidt (lifetime tags), Andrea Bocci, Marcella Bona (muon tag), Michael Findt, Daniele Benedetti (electron tag), more people welcome.

e/gamma POG integration

Electrons can be seeded either from ECAL superclusters or by (short) KF tracks from the iterative tracking. Systematic studies of the complementarity (regarding efficiency and fake rate) of the two reconstruction and identification concepts, both for isolated electrons and for electrons in jets, and progress towards their combination, will be presented. In particular, the reconstruction and identification of soft electrons in jets, towards b-tagging studies, will have to be studied, and a strategy for efficiency determination with data will need to be designed.

Photons have vastly different definitions in the particle-flow reconstruction and in the e-gamma reconstruction. In the former, photons are simple clusters, which may come from electron Bremsstrahlung, pi0 or eta decays, or hadron nuclear interactions, and are useful for the global event reconstruction. In the latter, they consist of isolated, energetic superclusters, and are most useful for the study of specific signatures. Progress towards a combination ought to be presented.

For both photons and electrons, the concept of isolation should be revisited within the global event description, to include the isolation with respect to all other particle-flow particles (charged particles, other photons and electrons, neutral hadrons).

Finally, ECAL-seeded photon conversion reconstruction is an important topic in itself. Super-cluster-seeded conversions in jets were shown to have a poor efficiency (because super-clusters are biased by the jet-particle energies) and a large fake rate (because of the large charged-particle density). The first step towards improving the efficiency consists in using PF clusters instead of super-clusters, to exploit the full ECAL granularity. The usual electron pre-identification can then be applied to reduce the fake rate. The tracks might then be GSF'ed, Bremsstrahlung photons recovered, and the final electron identification applied for a further fake-rate reduction. This effort is then to be combined with the tracker-seeding approach.

People working: Nancy Marinelly + photon conversion group, for the last point. More people welcome for the other points.

Muon POG Integration

Muons in particle flow can be global muons, tracker muons, standalone muons, calo muons, or even a combination of all the above. While global muons and tracker muons are already integrated in particle flow, one of the goals of the workshop would be to extend it to standalone and calo muons, with a multivariate combination of all information. Systematic studies of efficiency and fake rate, both for isolated muons and for muons in jets, will be presented. In particular, the reconstruction and identification of soft muons in jets, towards b-tagging studies, will have to be studied, and a strategy for efficiency determination with data will need to be designed.

The concept of isolation should be revisited within the global event description, to include the isolation with respect to all other particle-flow particles (charged particles, photons, neutral hadrons).

People working: Martijn Mulders, Marcella Bona, Andrea Bocci

Top PAG Integration

The ttbar final state is one of the standard candles which will be most useful for particle-flow performance determination on data, because all the masses at stake are known (mtop and mW), and because it involves jets, MET, leptons, and b tagging (with or without a soft lepton).

The increased ttbar selection efficiency, through the lower pT accessible with particle-flow jets, the potentially better W mass determination, the particle-flow tau reconstruction (for W->tau nu), and the related improved background rejection, should be studied and compared to similar analyses with CaloJets. Early in the analysis, the need for calibrating particle-flow jets will appear, so as to get rid of the eta dependence of the energy response, and feedback to the particle-flow algorithm (e.g., for jets at large eta, etc.) will be of prime importance.

Eventually, a strategy to calibrate jets, MET and b-tagging efficiency directly with ttbar events might emerge from this work.

People working: Tim Christiansen, Tuula Maaki, Andrea Giammanco, other people welcome

Electroweak PAG integration

Electroweak single vector boson processes are an important standard candle for (1) establishing data-driven estimations of physics-object reconstruction efficiencies, (2) establishing background estimates for nearly all CMS searches for new physics beyond the Standard Model, and (3) Standard Model precision physics measurements, such as the W mass.

Up to now, only Z --> tau tau final states have been studied in the context of Particle Flow, but all EWK processes and final states are of high interest for Particle Flow and need to be studied. For example, there are several topics which can be addressed and where Particle Flow is expected to bring improvements. The Particle-Flow MET Jacobian peak for leptonic W+jets events should be compared with the Calorimeter-only MET (which is currently very challenging). Using the Z --> mu mu standard candle, the resolution and response of Particle-Flow MET and Calorimeter-only MET should be compared. The improvements in selecting final states involving electrons should be studied with Particle Flow for low-pT (below 50 GeV) electrons, and compared with the super-cluster-based electrons. The W mass, which is best measured in low-luminosity running conditions, can be studied using the reconstructed transverse mass MT(lepton, neutrino) determined from Particle-Flow objects, and compared with the calorimeter-only estimation of the neutrino.
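As a small illustration of the last point, the transverse mass mentioned above is built from the lepton pT, the MET (taken as the neutrino pT estimate) and their azimuthal opening angle. The function name is ours, but the formula MT = sqrt(2 pT MET (1 - cos(dphi))) is the standard definition:

```python
import math

def transverse_mass(pt_lepton, met, dphi):
    """W transverse mass from the lepton pT, the MET and their azimuthal opening."""
    return math.sqrt(2.0 * pt_lepton * met * (1.0 - math.cos(dphi)))
```

The Jacobian edge of this distribution near mW is what makes it sensitive to the MET resolution, hence the interest in comparing Particle-Flow and calorimeter-only inputs.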

People working on this analysis: Tau final states: led by Giuseppe Bagliesi and the EWK Tau sub-group. Other final states: no one yet; everyone is welcome to join!

Susy & Exotica PAG integration

Low-mass Supersymmetry may be observable very soon after the LHC starts producing collisions, and almost certainly within a few tens of inverse picobarns of collected data. The final states of events from supersymmetric particles contain complex decay chains, producing many jets (a few with high pT and often with heavy flavour), several leptons (typically low pT and isolated), and usually accompanied by moderate to large missing transverse energy.

As in the case of ttbar events, Particle Flow may bring several improvements to such signatures. For example, while most electrons in SUSY events are isolated, they tend to have a low momentum (below 50 GeV) and are of the "showering" category (electrons with many brem photons), leading to a low-purity (below 50%) SUSY selection (worse than ATLAS). This work should study the effect which particle-flow electrons have on possibly improving this situation, compared with the super-cluster-based electrons. Further, the improvements to the jet and MET resolution, as well as the expected improved JES during early CMS running, should be studied and compared with the calorimeter-based jets and MET.

Finally, with improved momenta resolution/response from particle flow objects, reconstruction of the exclusive SUSY decay chains, and hence better estimation of sparticle masses, should be studied and compared with the calorimeter based jets and the super-cluster based electrons.

People working on this analysis: Rick Cavanaugh, Oliver Buchmuller, and others are welcome

Higgs PAG Integration

The main discovery channels for a Standard Model Higgs boson at relatively high mass (m_H > 130 GeV),
  1. H -> WW* -> 2l 2nu,
  2. H -> ZZ* -> 2l 2l',
provide a very clean signature through the presence of isolated leptons in the final state. An isolation with respect to the particles reconstructed in the particle flow should be defined, and studied in the context of these channels.

At low mass, the most promising discovery channel is H -> gamma gamma. This channel relies mostly on the electromagnetic calorimeter, and requires an excellent calibration and monitoring of this detector. Alternatively, several other channels could contribute to the discovery of a low-mass Higgs boson, for example:
  3. t tbar H, H -> b bbar,
  4. VBF (Vector Boson Fusion) H, H -> tau tau.

These channels rely heavily on the tracking system. They are thus complementary to H -> gamma gamma, and will benefit from the improved b tagging, tau tagging and momentum resolution from the particle flow. The technologies developed for VBF H, H -> tau tau directly translate to
  5. bbH/A with H/A -> tau tau,
the main discovery channel for a supersymmetric Higgs boson in the case of large tan beta.

All or some of these analyses could be repeated at the PAT level with the particle-flow global event description. Feedback from the analyses to the particle flow will then be crucial to optimize the particle-flow algorithms.

People working: Colin Bernet (on 4.), Michele Pioppi and Daniele Benedetti (on 2.), other subjects open!

CMS Note

First actions towards a CMS Note should be taken very soon, among which:

  • Nomination of an editor and of sub-editors
  • Nomination of an ARC
  • Creation of an appropriate CVS directory

Review status

Reviewer/Editor and Date | Comments
PatrickJanot - 13 June 2008 | Oops! Had forgotten the b-tagging session
PatrickJanot - 11 June 2008 | Draft 0 of the meeting content

Responsible: PatrickJanot
Last reviewed by: Most recent reviewer

-- PatrickJanot - 11 Jun 2008

Topic revision: r9 - 2008-07-29 - DanieleBenedetti