The purpose of this twiki page is to gather useful links and information for a preliminary study of the decay H-->Zgamma. It is not intended for long-term use.

Using the package

The package assumes that you are working on smuhpc. Contact Aidan to set up an account. smuhpc (and smuhpc2) have access to all the data and MC, a condor cluster, and large amounts of disk space.

Create a working directory and set up ATLAS with asetup.

Then check out the package: svn co svn+ssh://

The README file contains information about getting started with the package.

To use condor add the following line to your .bashrc: source /grid/condor/


The code is written in Python with an interface to the C++ modules. The analysis is steered by the main script, where the task option determines which type of analysis to perform. The other main modules handle the samples, variables, truth matching, counters and style; in addition, each object has its own module.

The command line options for the main script are:

Short option Long option Description
-b   Start ROOT in batch mode.
-t --task Which task to run:
-t skim to create a skim.
-t condor to create condor submission scripts for a skim.
-t clean to recreate the pyrootmagic module (see below).
-s --sample The sample number to run over.
-n --sample-name The sample name to run over. The sample used is the first that matches the regex.
-j --job If a task has several jobs (eg a skim) then which job to run.
-e --nevents Maximum number of events to run over. (Leave blank for all events.)
  --nocut Do not apply any cuts.
  --file-prefix A prefix for output files when making a skim.
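As a sketch of how these options could be wired up (the real steering script's parser may differ; the option names follow the table above, everything else is assumed):

```python
# Hypothetical sketch of the steering script's option parsing using optparse.
# Option names come from the table above; dest names and defaults are assumed.
from optparse import OptionParser

def parse_options(argv):
    parser = OptionParser()
    parser.add_option('-b', action='store_true', dest='batch', default=False,
                      help='Start ROOT in batch mode.')
    parser.add_option('-t', '--task', dest='task',
                      help='Which task to run (skim, condor, clean, ...).')
    parser.add_option('-s', '--sample', dest='sample', type='int',
                      help='The sample number to run over.')
    parser.add_option('-n', '--sample-name', dest='sample_name',
                      help='Regex for the sample name to run over.')
    parser.add_option('-j', '--job', dest='job', type='int',
                      help='Which job to run if the task has several jobs.')
    parser.add_option('-e', '--nevents', dest='nevents', type='int',
                      help='Maximum number of events to run over.')
    parser.add_option('--nocut', action='store_true', dest='nocut', default=False,
                      help='Do not apply any cuts.')
    parser.add_option('--file-prefix', dest='file_prefix',
                      help='Prefix for output files when making a skim.')
    return parser.parse_args(argv)

options, args = parse_options(['-b', '-t', 'skim', '-s', '3'])
```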

To add a new task, simply go to the end of the list of tasks and create a new block.

Reconstruction modes

To keep track of which reconstruction we use, we have the following modes:

Reconstruction mode Description
1 Electrons
2 Muons (mu_staco,mu_staco)
3 Muons (mu_staco,mu_muid)
4 Muons (mu_muid,mu_staco)
5 Muons (mu_muid,mu_muid)
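As an illustration, the modes above can be captured in a lookup table (a sketch only; the real package may organize this differently):

```python
# Illustrative mapping of reconstruction mode -> (leading, subleading) lepton
# blocks, following the table above. The dict layout is an assumption.
RECO_MODES = {
    1: ('electron', 'electron'),
    2: ('mu_staco', 'mu_staco'),
    3: ('mu_staco', 'mu_muid'),
    4: ('mu_muid',  'mu_staco'),
    5: ('mu_muid',  'mu_muid'),
}

def lepton_blocks(mode):
    """Return the (leading, subleading) lepton blocks for a reconstruction mode."""
    return RECO_MODES[mode]
```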

Each MC and data sample is handled using the sample object. This object knows the source of the (unskimmed) files, the number of events in the sample, the number of files, the cross section (for MC) and whether it is MC or data. The samples are specified in the sample.samples list. The user can pass a sample index (using the -s command line option) or a sample name (using the -n command line option) to select a sample; the steering script saves this information to the variable sample_index. When a sample name is specified it is treated as a regex and the first sample name that matches is used.
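The regex lookup can be sketched as follows (the sample names here are stand-ins; the real list lives in sample.samples):

```python
# Minimal sketch of the regex-based sample lookup described above.
# The sample list is illustrative; the real package uses sample.samples.
import re

samples = ['mc11_7TeV.128986.PowHegPythia_ggH125_Zgam_Zee',
           'mc11_7TeV.107650.AlpgenJimmyZeeNp0_pt20',
           'mc11_7TeV.107660.AlpgenJimmyZmumuNp0_pt20']

def find_sample(name_regex):
    """Return the index of the first sample whose name matches the regex, or -1."""
    for i, name in enumerate(samples):
        if re.search(name_regex, name):
            return i
    return -1
```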

In addition the samples are grouped into types and each type has its own style. The types are:

Type Description Color
signal H->Z(ll)gamma, where l=e,mu kRed-9
ZllGamma Enriched Z(llgamma)+jets, where l=e,mu kOrange-9
ZeeGamma Enriched Z(ee)+jets kOrange-9
ZmmGamma Enriched Z(mumu)+jets kOrange-9
ttbar ttbar kYellow-9
qcd JF70 QCD kGreen-9
WZ W+Z kCyan-9
ZttJets generic Z(tautau)+jets kBlue-9
ZeeJets generic Z(ee)+jets kBlue-9
ZmmJets generic Z(mumu)+jets kBlue-9
ZllJets generic Z(ll)+jets, where l=e,mu kBlue-9
data11_7TeV Data kBlack data points

Each type has a function for applying style to a histogram: SetHistogramStyle( histogram )

There are separate constructors for MC and data samples:

class sample:
    def __init__(self, name, MCID, crossSection, nEvents, type, input_path):
class MCSample(sample):
    def __init__(self, name, MCID, crossSection, nEvents, type, MCTag, input_path):
class dataSample(sample):
    def __init__(self, name, runNumber, nEvents, type, input_path):

The stream of the sample is set to either eGamma, Muons or MC by matching the sample name, using the function setStream().

Each sample object has a TChain associated with it, which is accessed by sample.chain. By default this is set to 0 (null pointer). You can add the preskim files to the TChain by using the AddFiles() function. You can add the skimmed files using the AddSkimmedFiles() function (this assumes the skimmed files have been hadded successfully). When skimming the default output directory is output/skimmed.

Each MC sample has a luminosity associated with it (use the Lumi() function) and can be scaled to a given luminosity using the GetScale( lumi ) function, where lumi is given in pb^-1.
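The bookkeeping behind Lumi() and GetScale() can be sketched as follows, assuming the effective luminosity of an MC sample is nEvents / crossSection (cross section in pb, luminosity in pb^-1); the real implementation may differ:

```python
# Sketch of MC luminosity bookkeeping. Assumes Lumi() = nEvents / crossSection;
# the real package's internals are not shown in this page.
def sample_lumi(n_events, cross_section_pb):
    """Effective luminosity of an MC sample, in pb^-1."""
    return n_events / cross_section_pb

def get_scale(target_lumi_pb, n_events, cross_section_pb):
    """Event weight that scales the MC sample to the target luminosity."""
    return target_lumi_pb / sample_lumi(n_events, cross_section_pb)
```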

To load a sample and obtain its skimmed TChain use the following:

s = sample.samples[sample_index]
ch = s.chain
ch will now refer to the pointer to the TChain and can be used as you would normally use it in ROOT.

pyrootmagic is a package developed by Emanuel Strauss (SLAC) that allows Python to write TTrees easily. A list of variables that will be added in the skim is included in the package. You can add a new variable using the following syntax:

Syntax Meaning
I Integer branch
D Double branch
O Boolean branch
X: vector branch

After changing the list of variables, the pyrootmagic object must be rebuilt using the command ./ -b -t clean.

The pyrootmagic object variables are accessed directly using either pyrootmagic.var_1 = X or pyrootmagic.var_2.push_back( Y ), where the first syntax is for single valued variables and the second syntax is for a vector.

The main variables are stored in the variable module. Each variable has a range, a binning, a unit, x- and y-axis titles (where the y-axis title is automatically set to reflect the binning and units) and a ROOT expression for drawing. Using these parameters the variable object can also create a histogram. In addition to these basic parameters the variable object also has a mode (ee or mm), a number of decimal places for the unit (default = 2), and excluded regions for placing cuts. There is also a 2D variable class to allow the plotting of one variable against another.

The syntax for creating a new variable is:

def __init__(self,name,branch_name,xaxis,units,mode):

eg: vars['e1_pt'] = variable.MakeVar('e1_pt', 'el_GSF_pt[e1_index[best_ee_index]]*1e-3', 'p_{T}(e_{1})', 'GeV', 200,    0,  200 , 'ee')

Cuts should be added using the exclusion functions. To add cuts, pass a list of excluded regions.

eg: vars['e1_pt'].AddExclusion(0,10) to exclude the region 0 < pT(e1) < 10 GeV
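The way an excluded region turns into a cut can be sketched like this (a stand-alone stand-in; AddExclusion and the variable class belong to the package):

```python
# Stand-in for the excluded-region mechanism described above: a value passes
# the cut if it lies outside every excluded region. Class/method names here
# are illustrative, not the package's own.
class ExcludedRegions:
    def __init__(self):
        self.exclusions = []
    def add_exclusion(self, lo, hi):
        self.exclusions.append((lo, hi))
    def passes(self, value):
        """True if the value falls outside all excluded regions."""
        return all(not (lo < value < hi) for lo, hi in self.exclusions)

cut = ExcludedRegions()
cut.add_exclusion(0.0, 10.0)   # exclude 0 < pT < 10 GeV, as in the example above
```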

You can define a list of variables using the var_collection object. The syntax for creating a var_collection object is:

collection = variable.var_collection([], 'ee', 'mainListOfVars_ee')

where the first argument is a list of variables, the second argument is the mode (either ee or mm) and the final argument is a (unique) name to make sure that histogram names do not clash.

This object has methods for creating a series of cutflow plots (in the order that the variables were added) and (n-1) plots.
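The (n-1) logic can be sketched as follows: for each variable, apply every cut except the one on that variable. Cuts are modeled here as (name, predicate) pairs; the real var_collection interface differs:

```python
# Sketch of the (n-1) plot logic described above. The cut representation is
# an assumption; the package stores cuts as exclusions on variable objects.
def passes_n_minus_one(event, cuts, skip_name):
    """Apply all cuts except the one named skip_name."""
    return all(passes(event) for name, passes in cuts if name != skip_name)

cuts = [('e1_pt', lambda ev: ev['e1_pt'] > 40.0),
        ('e2_pt', lambda ev: ev['e2_pt'] > 20.0)]
event = {'e1_pt': 35.0, 'e2_pt': 25.0}
```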

For example:

collection = variable.var_collection([], 'ee', 'mainListOfVars_ee')
collection.add_var_with_exclusions(variable.vars['e1ph_DR'           ], [[0.0 , 0.7]])
collection.add_var_with_exclusions(variable.vars['e2ph_DR'           ], [[0.0 , 1.0]])
collection.add_var_with_exclusions(variable.vars['e1_pt'             ], [[0,40]])
collection.add_var_with_exclusions(variable.vars['e2_pt'             ], [[0,20]])
collection.add_var_with_exclusions(variable.vars['Zee_m'             ], [[0.0,91.2-10.0],[91.2+10.0,1e30]])
collection.add_var_with_exclusions(variable.vars['Hee_m'             ], [])

import style
legend = style.legend(0.6, 0.5, 0.8, 0.9)

legend.AddEntry( types['signal'     ].histogram, types['signal'     ].title, 'f')
legend.AddEntry( types['ZllGamma'   ].histogram, types['ZllGamma'   ].title, 'f')
legend.AddEntry( types['ttbar'      ].histogram, types['ttbar'      ].title, 'f')
legend.AddEntry( types['qcd'        ].histogram, types['qcd'        ].title, 'f')
legend.AddEntry( types['WZ'         ].histogram, types['WZ'         ].title, 'f')
legend.AddEntry( types['ZttJets'    ].histogram, types['ZttJets'    ].title, 'f')
legend.AddEntry( types['ZllJets'    ].histogram, types['ZllJets'    ].title, 'f')

histogram_signal = ROOT.TH1F('h_signal','',100,0,1)
legend.AddEntry(histogram_signal, 'Signal template', 'l')

for s in sample.samples:
    if 'data' in s.name: continue # skip data; s.name is an assumed attribute, the original was truncated
    collection.perform_cutflow        (s, 1.01e3, 'cutflow_test_', file_out)
collection.make_cutflow_plots  (file_out,legend)

for s in sample.samples:
    if 'data' in s.name: continue
    collection.perform_nMinusOne_plots(s, lumi, 'cutflow_test_', file_out)
collection.make_nMinusOne_plots(file_out, legend)
collection.clean_nMinusOne()

The truth-matching module is used to truth match the candidates in an event. It defines the following variables:

Variable Description
real_l1 True if the leading lepton candidate is a real electron/muon; False if it is a fake
real_l2 True if the subleading lepton candidate is a real electron/muon; False if it is a fake
real_ph True if the photon candidate is a real photon; False if it is a fake
l1_from_Z True if the leading lepton candidate is a real electron/muon and its parent is a real Z boson
l2_from_Z True if the subleading lepton candidate is a real electron/muon and its parent is a real Z boson
same_Z True if both lepton candidates are real electrons/muons and their parents are the same real Z boson
Z_from_H True if same_Z is True and the Z comes from a real Higgs boson
ph_from_H True if the photon candidate is a real photon and it comes from a real Higgs boson
same_H True if Z_from_H is True, ph_from_H is True, and they both come from the same Higgs boson
ph_parent_pdg The PDG ID of the parent of the photon candidate
ph_parent_index The MC index of the parent of the photon candidate
ph_grandparent_pdg The PDG ID of the grandparent of the photon candidate
ph_grandparent_index The MC index of the grandparent of the photon candidate
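The composite flags follow from the per-object results; a sketch of the derivation (function and argument names are illustrative, not the package's own):

```python
# Sketch of how the composite truth-match flags above can be derived from the
# per-object results. All names here are illustrative stand-ins.
def same_Z(real_l1, real_l2, l1_parent_index, l2_parent_index):
    """Both leptons are real and share the same real Z parent."""
    return (real_l1 and real_l2
            and l1_parent_index == l2_parent_index
            and l1_parent_index >= 0)

def same_H(Z_from_H, ph_from_H, Z_higgs_index, ph_higgs_index):
    """The Z and the photon come from the same real Higgs boson."""
    return Z_from_H and ph_from_H and Z_higgs_index == ph_higgs_index
```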

When truth matching, the mode must be passed to the truth-matching module, and for muons the muon blocks must be specified. The module also finds the indices of the real leptons, Z, photon and Higgs, if they exist. (Indices are set to -1 otherwise.)

The module also contains some high level functions for navigating the MC block.

Particle modules

Each particle (electron, muon, photon, Z boson, Higgs boson) has its own module that manages that particle's properties. Each particle has a p4, and where the particle has children the children are also stored. Cuts relating to the particle should be stored in the relevant particle module. For example the Zboson module reads:

DeltaRCut = 0.2

class Zboson:
    def __init__(self,l1,l2):
      self.l1 = l1
      self.l2 = l2
      self.p4 = self.l1.p4 + self.l2.p4
    def pass_cuts(self, options):
        if options.nocut:
            return True
        if self.p4.M() < 10e3:
            return False
        if self.l1.p4.DeltaR( self.l2.p4) < DeltaRCut:
            return False
        return True

Each pass_cuts function must be aware of the variable options.nocut so that it can return True when the user requests that no cuts be applied.
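A stand-alone usage sketch of the pass_cuts pattern above, using a minimal four-vector stand-in with only the pieces Zboson needs (M and DeltaR); the real code uses ROOT TLorentzVectors, and the p4 addition here is a crude placeholder:

```python
# Minimal stand-ins to exercise the pass_cuts / options.nocut pattern above.
# P4 is NOT a real four-vector: __add__ just sums masses, which is all this
# sketch needs. The real package uses ROOT TLorentzVectors.
import math

class P4:
    def __init__(self, eta, phi, mass):
        self.eta, self.phi, self.mass = eta, phi, mass
    def __add__(self, other):
        return P4(0.0, 0.0, self.mass + other.mass)  # crude stand-in
    def M(self):
        return self.mass
    def DeltaR(self, other):
        dphi = abs(self.phi - other.phi)
        if dphi > math.pi:
            dphi = 2.0 * math.pi - dphi
        return math.hypot(self.eta - other.eta, dphi)

class Lepton:
    def __init__(self, p4):
        self.p4 = p4

class Options:
    nocut = False

DeltaRCut = 0.2

def pass_cuts(l1, l2, options):
    """Mirror of Zboson.pass_cuts above, for the stand-in objects."""
    if options.nocut:
        return True
    if (l1.p4 + l2.p4).M() < 10e3:
        return False
    if l1.p4.DeltaR(l2.p4) < DeltaRCut:
        return False
    return True
```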

The muon module contains more information than the other modules because it must store the kinematic and isolation variables for ease of access, and choose the correct block on a candidate-by-candidate basis.

You can define different kinds of selection criteria for the particles, depending on where you took the recommendations from. You can change the definitions under the appropriate block, for example:

# Set definitions for different objects
electron_definition = 'HSG2'
muon_definition     = 'HSG2'
photon_definition   = 'SM_ZGamma'
jet_definition      = 'SM_ZGamma'
tau_definition      = 'tau_WG'

if nocut:
    electron_definition = 'No_Cut'
    muon_definition     = 'No_Cut'
    photon_definition   = 'No_Cut'
    jet_definition      = 'No_Cut'
    tau_definition      = 'No_Cut'

The counter module contains the counter object, which keeps track of the numbers of candidates on a per-candidate and per-event basis. Each counter has its own unique name, and each counter contains a histogram that can be filled on a per-event basis.

At the start of a task you can initialize the counter with:

counters['electron_candidates'] = counter.counter('electron_candidates',  20)

(This counter has a histogram going from 0 to 20, so it can store information about electron multiplicity up to 20.)

At the start of each event loop the counter should be initialized for that event: counters['electron_candidates'].startEvent()

Increment the number of candidates with counters['electron_candidates'].incrementCount()

Increment the number of events with counters['electron_candidates'].incrementEvent()

Prevent further incrementing per event with counters['electron_candidates'].finishEvent()

Print the summary of the counter at the end of the analysis with counters['electron_candidates'].printSummary()
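A minimal sketch of the counter pattern described above, without the ROOT histogram; the method names follow the text, the internals are assumed:

```python
# Stand-in counter following the startEvent / incrementCount / incrementEvent /
# finishEvent / printSummary pattern above. A plain list replaces the ROOT
# histogram; everything else about the internals is assumed.
class Counter:
    def __init__(self, name, max_count):
        self.name = name
        self.max_count = max_count
        self.per_event_counts = []   # stands in for the histogram
        self.count = 0
        self.n_events = 0
        self.event_open = False
    def startEvent(self):
        self.count = 0
        self.event_open = True
    def incrementCount(self):
        if self.event_open:
            self.count += 1
    def incrementEvent(self):
        self.n_events += 1
    def finishEvent(self):
        # Prevent further per-event incrementing and record the multiplicity
        self.per_event_counts.append(min(self.count, self.max_count))
        self.event_open = False
    def printSummary(self):
        print('%s: %d events, %d candidates'
              % (self.name, self.n_events, sum(self.per_event_counts)))

c = Counter('electron_candidates', 20)
c.startEvent()
c.incrementCount(); c.incrementCount()
c.incrementEvent()
c.finishEvent()
```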

Example analysis

As an example consider the multiplicity of Z and Higgs boson candidates per event per sample. To add this task we can do the following:

elif task=='count_candidates':
    # Create a dictionary of histograms to store information
    histogram_Hee_candidates_MC = {}
    histogram_Hmm_candidates_MC = {}
    histogram_Hee_candidates_data = ROOT.TH1F('h_Hee_candidates_data', '', 20, -0.5, 19.5)
    histogram_Hmm_candidates_data = ROOT.TH1F('h_Hmm_candidates_data', '', 20, -0.5, 19.5)
    histogram_Hee_candidates_data.GetXaxis().SetTitle('Number of Higgs (ee#gamma) candidates')
    histogram_Hee_candidates_data.GetYaxis().SetTitle('Number of events')
    histogram_Hmm_candidates_data.GetXaxis().SetTitle('Number of Higgs (#mu#mu#gamma) candidates')
    histogram_Hmm_candidates_data.GetYaxis().SetTitle('Number of events')
    # Load and loop over the samples
    import sample as sample_module
    for s in sample_module.samples:
        ch = s.chain
        # Create histograms if we're using MC
        # The stream is already set, so use that to see if we have MC or data
        # (s.stream and s.name are assumed attribute names; the original was truncated)
        if s.stream == 'MC':
            histogram_Hee_candidates_MC[s.name] = ROOT.TH1F('h_Hee_candidates_%s'%(s.name), '', 20, -0.5, 19.5)
            histogram_Hmm_candidates_MC[s.name] = ROOT.TH1F('h_Hmm_candidates_%s'%(s.name), '', 20, -0.5, 19.5)
            histogram_Hee_candidates_MC[s.name].GetXaxis().SetTitle('Number of Higgs (ee#gamma) candidates')
            histogram_Hee_candidates_MC[s.name].GetYaxis().SetTitle('Number of events')
            histogram_Hmm_candidates_MC[s.name].GetXaxis().SetTitle('Number of Higgs (#mu#mu#gamma) candidates')
            histogram_Hmm_candidates_MC[s.name].GetYaxis().SetTitle('Number of events')
        # Loop over the events
        for j in range(0,ch.GetEntries()):
            ch.GetEntry(j)
            # Since we are using pyroot we already have direct access to all the branches in the TTree:
            n_Hee = len(ch.Hee_m)
            n_Hmm = len(ch.Hmm_m)
            if s.stream == 'MC':
                histogram_Hee_candidates_MC[s.name].Fill(n_Hee)
                histogram_Hmm_candidates_MC[s.name].Fill(n_Hmm)
            else:
                histogram_Hee_candidates_data.Fill(n_Hee)
                histogram_Hmm_candidates_data.Fill(n_Hmm)
    # Now all the histograms have been filled with the correct values
    # Set the styles of the histograms:
    for s in sample_module.samples:
        if s.stream == 'MC':
            sample_module.types[s.type].SetHistogramStyle(histogram_Hee_candidates_MC[s.name])
            sample_module.types[s.type].SetHistogramStyle(histogram_Hmm_candidates_MC[s.name])
    # Create a legend
    import style
    legend = style.legend(0.5, 0.9, 0.8, 0.5) # Style of this legend is nice
    for t in sample_module.types:
        if 'data' in t:
            continue
        legend.AddEntry(sample_module.types[t].histogram, sample_module.types[t].title, 'f')
    # Combine results.  Only add MC to the THStack
    stack_ee = ROOT.THStack()
    stack_mm = ROOT.THStack()
    for s in sample_module.samples:
        if 'data' in s.name:
            continue
        stack_ee.Add(histogram_Hee_candidates_MC[s.name])
        stack_mm.Add(histogram_Hmm_candidates_MC[s.name])
    # Grab canvas from the style, already set to the recommended size
    c = style.canvas
    # Print results to file (the output name here is illustrative)
    stack_ee.Draw()
    histogram_Hee_candidates_data.Draw('pe:same') # Redraw to get axes titles
    legend.Draw()
    c.Print('count_candidates_ee.eps')
    # No need to garbage collect, python does that for us!

Then run this from the command line like this:

./ -b -t count_candidates

Running the skim

When running the skim the output is by default written to output/skimmed and later hadded to output/total. To run the skim:

  • Create the necessary directories
    • mkdir output
    • mkdir output/skimmed
    • mkdir output/total
  • Create the pyrootmagic object
    • ./ -b -t clean
  • Create condor submission scripts for the skim
    • ./ -b -t condor
  • Submit the condor jobs (no more than 400 at a time, please!)
    • head -n 400 >
    • head -n 800 | tail -n 400 > etc
    • source etc
  • hadd the output to save time when loading files
    • source

This should allow you to create your own skim. Skimming takes about 5 hours for all the MC and the first fb^-1 of 2011 data, allowing you to perform all the necessary analyses and cross-checks with data and MC.


Event selection


Period Single electron Double electron Single muon Double muon
B-I EF_e20_medium EF_2e12_medium EF_mu18_MG EF_mu15_mu10_EFFS
J EF_e20_medium EF_2e12_medium EF_mu18_medium EF_mu15_mu10_EFFS_medium
K EF_e22_medium EF_2e12T_medium EF_mu18_medium EF_mu15_mu10_EFFS_medium
L-M EF_e22_medium EF_2e12T_medium EF_mu18_medium EF_mu15_mu10_EFFS_medium

For MC the trigger responses are luminosity weighted. See for details.


Use GoodRunsLists-00-00-96 and data11_7TeV.periodAllYear_HEAD_CoolRunQuery-00-04-08_All_Good.xml.

Primary vertices

At least one primary vertex with at least 3 tracks.
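The vertex requirement amounts to a simple check over the per-vertex track counts (a sketch; the D3PD branch holding the track multiplicities is not named on this page):

```python
# Sketch of the primary-vertex requirement: at least one vertex with >= 3
# tracks. The input is a list of per-vertex track counts; the actual branch
# name in the ntuple is an assumption left out here.
def pass_vertex_cut(vertex_ntracks):
    """True if at least one primary vertex has at least 3 tracks."""
    return any(n >= 3 for n in vertex_ntracks)
```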

LAr error




  • Two muons where:
    • Either mu_staco or mu_muid
      • If a mu_staco candidate is within ΔR of 0.05 of a mu_muid candidate, remove the mu_muid candidate.
    • Both pass loose requirements (release 17)
    • Muon pT>=7GeV
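The staco/muid overlap removal above can be sketched as follows (a simplified stand-in: candidates are (eta, phi) pairs and phi wrap-around is ignored):

```python
# Sketch of the overlap removal described above: drop any mu_muid candidate
# within DeltaR < 0.05 of a mu_staco candidate. Candidates are modeled as
# (eta, phi) tuples; phi wrap-around is ignored in this stand-in.
import math

def delta_r(a, b):
    """Simplified DeltaR between two (eta, phi) tuples."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def remove_overlaps(staco, muid, cut=0.05):
    """Keep only mu_muid candidates not overlapping any mu_staco candidate."""
    return [m for m in muid if all(delta_r(m, s) >= cut for s in staco)]
```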


  • Two electrons where:
    • Both pass loose requirements (release 17)
    • Electron pT>=7GeV


  • One photon where:
    • Passes loose requirements


Hit requirements:

cuts to apply for Staco/Muid CB+ST muons:
!expectBLayerHit || nBLHits > 0
nPixHits + nPixelDeadSensors > 0 (was 1 in 2011)
nSCTHits + nSCTDeadSensors > 4 (was 5 in 2011)
nPixHoles + nSCTHoles < 3

n = nTRTHits + nTRTOutliers
if (0.1 < |eta| < 1.9): require n > 5 && nTRTOutliers < n*0.9
else if (|eta| < 0.1 or |eta| >= 1.9): if n > 5 require nTRTOutliers < n*0.9
An example of the TRT + Outliers cut:
   Int_t n = (*mu_staco_nTRTHits)[mu_i] + (*mu_staco_nTRTOutliers)[mu_i];
   Double_t mu_eta = fabs((*mu_staco_eta)[mu_i]);
   Bool_t case1 = ((n > 5) && (*mu_staco_nTRTOutliers)[mu_i] < .9*n) ; // require n > 5 && the number of TRT outliers < 0.9n
   Bool_t case2 = (n > 5) ? ((*mu_staco_nTRTOutliers)[mu_i] < .9*n) : true ; // if n > 5, require the number of TRT outliers < 0.9n
   Bool_t trt_ext = (0.1 < mu_eta && mu_eta < 1.9) ? case1 : case2 ; // if 0.1 < |eta| < 1.9, case 1; if |eta| >= 1.9 or |eta| < 0.1, case 2
cuts to apply for CaloTag muons:
!expectBLayerHit || nBLHits > 0
nPixHits + nPixelDeadSensors > 0
nSCTHits + nSCTDeadSensors > 4
nPixHoles + nSCTHoles < 3

n = nTRTHits + nTRTOutliers
if (|eta| < 0.1): pass if n < 6 || nTRTOutliers < n*0.9
cuts to apply for StandAlone muons:
Int_t mu_cscetahits = (*mu_staco_nCSCEtaHits)[mu_i];
Int_t mu_cscphihits = (*mu_staco_nCSCPhiHits)[mu_i];
Int_t mu_emhits = (*mu_staco_nMDTEMHits)[mu_i];
Int_t mu_eohits = (*mu_staco_nMDTEOHits)[mu_i];
(mu_cscetahits + mu_cscphihits) > 0 && mu_emhits > 0 && mu_eohits > 0

See the HSG2 page for the latest recommendations.


The decay chain H-->Zgamma, Z-->ll is reconstructed.

All possible candidates are considered.

In the skim, Z candidates are required to have m(ll) > 10 GeV.

MC reweighting

Cross section reweighting

Reference: (pdf) SM cross sections

Reference for diboson processes: ATLAS Z gamma etc cross sections

LHC Yellow Book for Higgs cross sections (CDS)

Pileup weighting

Pileup weighting is implemented using PhysicsAnalysis/AnalysisCommon/PileupReweighting PileupReweighting-00-02-05

Smearing and scaling

The following packages are used for smearing/scaling MC (to be implemented):

Tag svn Path
ggFReweighting-00-00-08 svn+ssh://
MuonMomentumCorrections-00-06-08 svn+ssh://
MuonEfficiencyCorrections-02-01-01 svn+ssh://
MuonIsolationCorrection-00-08 svn+ssh://
egammaAnalysisUtils-00-03-28 svn+ssh://
TrigMuonEfficiency-00-02-05 svn+ssh://
ZMassConstraint-00-00-06 svn+ssh://

MC samples

Signal samples:

The signal samples used are:

id Process Cross section (pb) nEvents (nFiles) Dataset
tag e864_s933_s946_r2302_r2300_p605
128986 H-->Z(ee)gamma 1.015e-3 9998 mc11_7TeV.128986.PowHegPythia_ggH125_Zgam_Zee.merge.NTUP_HSG2
128988 H-->Z(mm)gamma 1.015e-3 9998 mc11_7TeV.128988.PowHegPythia_ggH125_Zgam_Zmumu.merge.NTUP_HSG2

Background SM samples:

Cross sections can be found at HSG6 background MC page

The following samples are located on smuhpc:

id Process Cross section (pb) nEvents (nFiles) Dataset
tag e825_s1310_s1300_r2820_r2872_p768
105987 WZ 0.011485 249949 (13) mc11_7TeV.105987.WZ_Herwig.merge.NTUP_HSG2
tag e835_s1272_s1274_r2820_r2872_p768
105200 ttbar 144.12 14965993 (750) mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_HSG2
tag e835_s1299_s1300_r2820_r2872_p768
107650 Z(ee) + 0 jets 664.1 6615302 (331) mc11_7TeV.107650.AlpgenJimmyZeeNp0_pt20.merge.NTUP_HSG2
107651 Z(ee) + 1 jet 132.99 1333903 (67) mc11_7TeV.107651.AlpgenJimmyZeeNp1_pt20.merge.NTUP_HSG2
107652 Z(ee) + 2 jets 40.226 404999 (21) mc11_7TeV.107652.AlpgenJimmyZeeNp2_pt20.merge.NTUP_HSG2
107653 Z(ee) + 3 jets 11.138 110000 (6) mc11_7TeV.107653.AlpgenJimmyZeeNp3_pt20.merge.NTUP_HSG2
107654 Z(ee) + 4 jets 2.8925 30000 (2) mc11_7TeV.107654.AlpgenJimmyZeeNp4_pt20.merge.NTUP_HSG2
107655 Z(ee) + 5 jets 0.75343 10000 (1) mc11_7TeV.107655.AlpgenJimmyZeeNp5_pt20.merge.NTUP_HSG2
107660 Z(mm) + 0 jets 663.79 6614248 (331) mc11_7TeV.107660.AlpgenJimmyZmumuNp0_pt20.merge.NTUP_HSG2
107661 Z(mm) + 1 jet 132.95 1334296 (67) mc11_7TeV.107661.AlpgenJimmyZmumuNp1_pt20.merge.NTUP_HSG2
107662 Z(mm) + 2 jets 40.375 403253 (21) mc11_7TeV.107662.AlpgenJimmyZmumuNp2_pt20.merge.NTUP_HSG2
107663 Z(mm) + 3 jets 11.133 110000 (6) mc11_7TeV.107663.AlpgenJimmyZmumuNp3_pt20.merge.NTUP_HSG2
107664 Z(mm) + 4 jets 2.8987 30000 (2) mc11_7TeV.107664.AlpgenJimmyZmumuNp4_pt20.merge.NTUP_HSG2
107665 Z(mm) + 5 jets 0.75662 10000 (1) mc11_7TeV.107665.AlpgenJimmyZmumuNp5_pt20.merge.NTUP_HSG2
107670 Z(tt) + 0 jets 662.5 10609203 (531) mc11_7TeV.107670.AlpgenJimmyZtautauNp0_pt20.merge.NTUP_HSG2
107671 Z(tt) + 1 jet 133.94 1999642 (101) mc11_7TeV.107671.AlpgenJimmyZtautauNp1_pt20.merge.NTUP_HSG2
107672 Z(tt) + 2 jets 40.295 599949 (31) mc11_7TeV.107672.AlpgenJimmyZtautauNp2_pt20.merge.NTUP_HSG2
107673 Z(tt) + 3 jets 11.029 399848 (21) mc11_7TeV.107673.AlpgenJimmyZtautauNp3_pt20.merge.NTUP_HSG2
107674 Z(tt) + 4 jets 2.804 114999 (7) mc11_7TeV.107674.AlpgenJimmyZtautauNp4_pt20.merge.NTUP_HSG2
107675 Z(tt) + 5 jets 0.78054 35000 (3) mc11_7TeV.107675.AlpgenJimmyZtautauNp5_pt20.merge.NTUP_HSG2
tag e891_s1310_s1300_r2732_r2700_p745
105814 QCD 3666.4 989948 (99) mc11_7TeV.105814.JF70_pythia_jet_filter.merge.NTUP_HSG2
tag e961_s1310_s1300_r2820_r2872_p768
109345 ttbar 144.12 1498997 (25) mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_HSG2

To be added: single top WW ZZ

General links and references

Search for excited leptons with llgamma final state:

Public note

Internal note


Muon combined performance recommendations

egamma recommendations

H-->4l selection (Winter 2012)

MCTruthClassifier (SVNWeb)

-- AidanRandleConde - 09-Jan-2012

Topic revision: r15 - 2012-07-23 - AidanRandleConde