Lisbon Higgs-strahlung Analysis

1 - Introduction

This page contains up-to-date information on the Higgs-strahlung search analysis performed by the Portuguese LIP group in ATLAS.

2 - Setting up the framework

The analysis code for the WH->lnubb and ZH->llbb channels runs over D3PDs (data and full simulation) using ROOT only. It is based on the package manager RootCore, the n-tuple reader D3PDReader, the sample manager SampleHandler and the job manager EventLoop.

The code was built using the following software tutorial TWiki pages as references: SoftwareTutorialAnalyzingD3PDsInROOT, RootCore and EventLoop.

2.1 - Setting up Root

We normally run on the fermi machines at LIP (our home institute), but the code also runs on lxplus machines at CERN and on any Linux or Mac OS X machine. Instructions for setting up Root and RootCore at CERN are given in the above-mentioned TWiki pages. Here are the instructions for the fermi machines and for Linux or Mac OS X personal computers:

* Fermi machines at LIP: in a bash shell type the following commands (they are the same as for lxplus):

export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
localSetupROOT

* On a Mac OS X or Linux personal computer:

source (path to...)/root/bin/thisroot.sh 

Note: the code has been tested on a Mac OS X 10.6.8 computer with ROOT 5.34/01 and gcc 4.2.1.

2.2 - Setting up RootCore

Follow these steps to set up the PAT (PhysicsAnalysisTools) system for building packages outside of the Athena environment:

1 - In a bash shell, create a directory in which to put your RootCore installation (e.g. HiggsAnalysis)

Note: don't use your local area at LIP; use for instance /x/calo/calo02/(your user name).

2 - Check out RootCore (look up the latest tag ii-jj-kk in SVN, e.g. RootCore-00-01-48):

svn co svn+ssh://svn.cern.ch/reps/atlasoff/PhysicsAnalysis/D3PDTools/RootCore/tags/RootCore-ii-jj-kk RootCore
or, if your CERN username is different:
svn co svn+ssh://(username)@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/D3PDTools/RootCore/tags/RootCore-ii-jj-kk RootCore
Note: RootCore is independent of the other physics packages (GoodRunsList, etc.); users are expected to use the latest version of RootCore regardless of which release of the physics packages they use.

3 - Let RootCore configure itself

cd RootCore
./configure
cd .. 

This will scan your local system, detect the location of your ROOT installation and configure RootCore accordingly.

4 - Set up your local environment for RootCore:

source RootCore/scripts/setup.sh

Tip: every time you start a new session you need to set up your local environment for Root and RootCore; the simplest way is to write a script with the following lines:
echo '>>> Setting up ROOT...'
source (path to...)/root/bin/thisroot.sh
echo '>>> Setting up RootCore-ii-jj-kk...'
source RootCore/scripts/setup.sh

2.3 - Checking out the basic packages

Since we will rely on the SampleHandler package to manage our samples and on the EventLoop package to spare us from writing our own event loop, there are some packages we need to check out before writing our analysis code. The simplest way to do this is to create a file packages.txt with the following content:

atlasoff/PhysicsAnalysis/D3PDTools/RootCore/tags
atlasoff/PhysicsAnalysis/D3PDTools/RootCoreUtils/tags
atlasoff/PhysicsAnalysis/D3PDTools/SampleHandler/tags
atlasoff/PhysicsAnalysis/D3PDTools/EventLoop/tags
atlasoff/PhysicsAnalysis/D3PDTools/MultiDraw/tags
atlasoff/PhysicsAnalysis/D3PDTools/EventLoopAlgs/tags
atlasoff/PhysicsAnalysis/D3PDTools/EventLoopGrid/tags
atlasoff/PhysicsAnalysis/AnalysisCommon/PATCore/tags

Then in the working directory (e.g. HiggsAnalysis) run the command

$ROOTCOREDIR/scripts/checkout.sh packages.txt

Tip: some LIP users have a CERN user ID that differs from their LIP one. RootCore can handle this if you define the CERN_USER variable (replace the value with your own ID):
export CERN_USER="mdacunha"

RootCore will check out the latest tag from each package in the list. Next you have to ask RootCore to locate all the packages and check their dependencies:

$ROOTCOREDIR/scripts/find_packages.sh

You will have to do this each time you check out another package or another tag. Now you have to compile all of the packages:

$ROOTCOREDIR/scripts/compile.sh

2.4 - Setting up a grid/ganga job

See the related TWiki page: RunningAtGridLIP

3 - D3PDReader

We will use D3PDReader to read in the data. D3PDReader is not a centrally distributed package; instead, it is generated from the data files to be used in the analysis (typically one data file and one MC file), so that the resulting package contains all the variables present in the D3PDs.
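
For orientation, the generated package exposes each D3PD branch through accessor objects of a D3PDReader::Event class; the analysis code in section 4 accesses them through a pointer called event. A minimal sketch, using only variable names that appear in the code snippets further down this page (the exact set of branches depends on the files used to generate the package):

// "event" is a pointer to the generated D3PDReader::Event object (see section 4)
if( !event->eventinfo.isSimulation() ){                // data only
  int run = event->eventinfo.RunNumber();              // run number of the current event
  int lbn = event->eventinfo.lbn();                    // luminosity block number
}
float mu = event->eventinfo.averageIntPerXing();       // average interactions per bunch crossing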

The D3PDReader package is generated using Athena. We will use an lxplus machine at CERN to generate it (this can also be done at LIP). Follow the steps below.

1 - There are already some files in Alberto's area at LIP. Use a fresh shell and copy one MC file:

scp /x/calo/calo02/apalma/datasamples1/mc12_8TeV.161805.Pythia8_AU2CTEQ6L1_WH125_lnubb.merge.NTUP_SMWZ.e1200_s1469_s1470_r3542_r3549_p1328_tid01094745_00/NTUP_SMWZ.01094745._000001.root.1 .
and one data file:
scp /x/calo/calo02/apalma/datasamples1/data12_8TeV.00201138.physics_Muons.merge.NTUP_SMWZ.r4065_p1278_p1328_p1329_tid01120677_00/NTUP_SMWZ.01120677._000021.root.2 .

Note: you can also create a symbolic link to these files if you are using your area at LIP.

2 - Set up Athena:

export AtlasSetup=/afs/cern.ch/atlas/software/dist/AtlasSetup
alias asetup='source $AtlasSetup/scripts/asetup.sh'
asetup AtlasPhysics,17.2.7.5.2,here

3 - Generate the D3PDReader package:

mkdir code
d3pdReadersFromFile.exe -f (PathOfTheMCfile)/NTUP_SMWZ.01094745._000001.root.1  (PathOfTheDatafile)/NTUP_SMWZ.01120677._000021.root.2 -n Event -o ./code
d3pdReaderRootCoreMaker.py -p D3PDReader ./code/*
rm -rf code

4 - Copy the D3PDReader package to the directory where your analysis will run (e.g. HiggsAnalysis):

scp -r D3PDReader (YourUsername)@lnlip01.lip.pt:(PathToYourWorkingDirectory)/

5 - Let RootCore locate the D3PDReader package and compile it:

$ROOTCOREDIR/scripts/find_packages.sh
$ROOTCOREDIR/scripts/compile.sh

Note: every time you add a package it is necessary to repeat the commands
$ROOTCOREDIR/scripts/find_packages.sh
$ROOTCOREDIR/scripts/compile.sh

4 - Analysis code

Now let's add our analysis package. RootCore provides a script that creates a skeleton package (see SoftwareTutorialAnalyzingD3PDsInROOT). Instead of creating a new package, you can check out our package from SVN:

svn co svn+ssh://svn.cern.ch/reps/atlasusr/apalma/VHiggsAnalysis/tags/VHiggsAnalysis-00-00-01 VHiggsAnalysis
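
For reference, the analysis class inside the package is a standard EventLoop algorithm: it implements the methods referred to in the sections below (setupJob, initialize, execute, finalize). A minimal sketch of the header, assuming the class name WHlvbbCode used throughout this page (the real header has many more members, some of which are listed in sections 4.5 and 4.6):

// VHiggsAnalysis/WHlvbbCode.h (sketch)
#include <EventLoop/Algorithm.h>

class WHlvbbCode : public EL::Algorithm
{
public:
  WHlvbbCode ();                                   // constructor: create tools, set default paths
  virtual EL::StatusCode setupJob (EL::Job& job);  // declare output streams for the job
  virtual EL::StatusCode initialize ();            // initialize tools (GRL, pile-up, ...)
  virtual EL::StatusCode execute ();               // called once per event
  virtual EL::StatusCode finalize ();              // write output GRL, configuration files, ...

  ClassDef(WHlvbbCode, 1);                         // needed by ROOT to generate the dictionary
};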

You also need to check out all the physics packages needed for the analysis: add them to the list in the file packages.txt:

atlasoff/DataQuality/GoodRunsLists/tags
atlasoff/PhysicsAnalysis/AnalysisCommon/PileupReweighting/tags
atlasoff/PhysicsAnalysis/JetMissingEtID/JetSelectorTools/tags
atlasoff/Reconstruction/egamma/egammaEvent/tags
atlasoff/Reconstruction/egamma/egammaAnalysis/egammaAnalysisUtils/tags
atlasoff/PhysicsAnalysis/MuonID/MuonIDAnalysis/MuonEfficiencyCorrections/tags
atlasoff/PhysicsAnalysis/MuonID/MuonIDAnalysis/MuonIsolationCorrection/tags
atlasoff/PhysicsAnalysis/MuonID/MuonIDAnalysis/MuonMomentumCorrections/tags
atlasoff/Reconstruction/Jet/ApplyJetCalibration/tags
atlasoff/Reconstruction/Jet/JetUncertainties/tags
atlasoff/Reconstruction/Jet/JetResolution/tags
atlasoff/Reconstruction/MissingETUtility/tags

and run the command:

$ROOTCOREDIR/scripts/checkout.sh packages.txt

Now let RootCore find all the packages and compile them:

$ROOTCOREDIR/scripts/find_packages.sh
$ROOTCOREDIR/scripts/compile.sh

4.1 - Standard structure

The package VHiggsAnalysis has the following structure:

* directory cmt: file Makefile.RootCore, where we add the dependencies on other packages (edit the file and see the variable PACKAGE_DEP)

* directory external: contains files that are provided as input to other packages (e.g., GoodRunsLists, etc.)

* directory Root: source file WHlvbbCode.cxx

* directory run: scripts to run the code (e.g., macro.cpp; a minimal sketch is given after this list)

* directory VHiggsAnalysis: header file WHlvbbCode.h
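
To make the role of the run directory concrete, a run macro typically builds a SampleHandler with the input D3PDs, configures an EventLoop job with our algorithm and submits it with a driver. A minimal sketch, assuming a local input directory and tree name that you will need to adapt (the actual macro.cpp in the package differs in detail):

// run/macro.cpp (sketch)
#include <SampleHandler/SampleHandler.h>
#include <SampleHandler/ToolsDiscovery.h>
#include <EventLoop/Job.h>
#include <EventLoop/DirectDriver.h>
#include "VHiggsAnalysis/WHlvbbCode.h"

void macro ()
{
  // collect the input D3PDs from a local directory (adapt the path)
  SH::SampleHandler sh;
  SH::scanDir (sh, "/x/calo/calo02/apalma/datasamples1");
  sh.setMetaString ("nc_tree", "physics"); // name of the D3PD tree (assumption)

  // configure the job and add our algorithm
  EL::Job job;
  job.sampleHandler (sh);
  job.algsAdd (new WHlvbbCode);

  // run locally; other drivers exist for batch and grid running
  EL::DirectDriver driver;
  driver.submit (job, "submitDir");
}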

4.2 - Paths for configuration and external files

Correct all the paths in the code, in the files run/macro.cpp and Root/WHlvbbCode.cxx (inside the DefaultConfig function).

4.3 - Histograms

4.4 - Skim tree

4.5 - Good Runs Lists

The GoodRunsLists (GRL) package allows you to select only events from runs and luminosity blocks that are considered good for physics. To use this package we need to tell RootCore that we depend on GoodRunsLists. This is done in the cmt/Makefile.RootCore file, through the variable PACKAGE_DEP:

PACKAGE_DEP = EventLoop D3PDReader GoodRunsLists

Next we add to the header file VHiggsAnalysis/WHlvbbCode.h the following includes:

#include <GoodRunsLists/TGoodRunsListReader.h>
#include <GoodRunsLists/TGoodRunsListWriter.h>

and also the definition of the following variables:

Root::TGoodRunsList inGRL, outGRL; // input and output GRL
Root::TGoodRunsListReader *reader; // reads the input GRL xml file 
Root::TGoodRunsListWriter *writer; // writes the output GRL xml file
string infileGRL; // path of the input GRL xml file
bool doGRID;

Now add the following lines to the source file Root/WHlvbbCode.cxx:

(a) in the constructor of the class:

reader = new Root::TGoodRunsListReader ();
infileGRL = "(PathToTheXMLFile)";
reader->SetXMLFile( infileGRL.c_str() );
reader->Interpret();
inGRL = reader->GetMergedGoodRunsList();
doGRID = false; // set to true when running on the grid
  
writer = new Root::TGoodRunsListWriter ();

(b) in the setupJob method:

// create an output stream (for the grid)
EL::OutputStream out ("outFile");
job.outputAdd(out);

(c) in the execute method:

bool passGRL = false;
// apply only to data
if(!event->eventinfo.isSimulation()){
  passGRL = inGRL.HasRunLumiBlock(event->eventinfo.RunNumber(), event->eventinfo.lbn());                                     
  if(passGRL) outGRL.AddRunLumiBlock(event->eventinfo.RunNumber(), event->eventinfo.lbn());
}
if(event->eventinfo.isSimulation()) passGRL = true;

(d) in the finalize method:

if(!event->eventinfo.isSimulation()){
  if(!doGRID){
    // summary content of the xml file
    outGRL.Summary(kTRUE);
      
    writer->SetGoodRunsList(outGRL);
    writer->SetFilename("outputGRL.xml");
    writer->WriteXMLFile();
  } else if(doGRID){
    writer->SetGoodRunsList(outGRL); // needed before asking the writer for the XML string
    TObjString objstring(writer->GetXMLString());
    TFile *flumi = wk()->getOutputFile("outFile");
    TDirectory *lumidir = flumi->mkdir("Lumi");
    lumidir->cd();
    objstring.Write("infolumi"); // write the TObjString with name "infolumi" in the Lumi directory of the output file
  }
}

4.6 - Pile-up reweighting

The PileupReweighting package corrects the pile-up distribution of the MC samples so that MC and data agree. To use this package we need to tell RootCore that we depend on PileupReweighting. This is done in the cmt/Makefile.RootCore file, through the variable PACKAGE_DEP:

PACKAGE_DEP = EventLoop D3PDReader GoodRunsLists PileupReweighting

Next we add to the header file VHiggsAnalysis/WHlvbbCode.h the following includes:

#include <PileupReweighting/TPileupReweighting.h>

and also the definition of the following variables:

Root::TPileupReweighting *my_PileupTool;
double pileupWeight;
string dataRootFilename;
string MCRootFileName;
bool doMCConfFile;
bool applyPileupReweighting; // used below to switch the re-weighting on or off

Now add the following lines to the source file Root/WHlvbbCode.cxx:

(a) in the constructor of the class (e.g., for MC process id 161805 and 2012 data runs):

my_PileupTool = new Root::TPileupReweighting (); 
pileupWeight  = 1.; // (default)
dataRootFilename = "(PathToTheDataRootFile)/VHiggsAnalysis/external/ilumicalc_histograms_None_200841-204668.root";
MCRootFileName   = "(PathToTheMCRootFile)/VHiggsAnalysis/external/MC12_SMEW_prw_v01.root";

Tip: we will use the prw file from the Standard Model ElectroWeak group (MC12_SMEW_prw_v01.root), which contains the configuration ROOT files for several processes. The prw file is located in the external/ directory; it was obtained by checking it out from SVN:
svn co svn+ssh://(YourCernUserID)@svn.cern.ch/reps/atlasphys/Physics/StandardModel/ElectroWeak/Analyses/Winter2013/Common/pileupweight ./

(b) in the initialize method:

if(!my_PileupTool) throw string("No pileup tool configured!");
my_PileupTool->UsePeriodConfig("MC12a"); // specify the necessary period assignment and mu binning
if(!doMCConfFile){
  my_PileupTool->AddConfigFile(MCRootFileName);
  my_PileupTool->AddLumiCalcFile(dataRootFilename);
  my_PileupTool->MergeMCRunNumbers(161805); // just add the process id you want, provided that it is in the dataRootFilename
  // if the overall percentage of unrepresented data is small (on the order of 0.001%):
  my_PileupTool->SetUnrepresentedDataAction(2); 
} 
// my_PileupTool->EnableDebugging(true); // uncomment to enable debugging
int isGood = my_PileupTool->Initialize();
cout << "\nPileup tool Initialize() returned " << isGood << " (0 means proper initialization)\n" << endl;

(c) in the execute method:

/* ------------------------------------------------------------
  Pile-up re-weighting: - apply only to Monte Carlo
                        - three cases:
                        (a) generate configuration files
                        (b) apply re-weighting
                        (c) don't apply re-weighting
------------------------------------------------------------- */
// (a) generate the configuration files
if(event->eventinfo.isSimulation() && doMCConfFile){
  // HFOR application to discard events from being included 
  // in the normalization factor of MC ( xsec * lumi / N )
  // Note: This is only for AlpGen samples
  if( event->top.hfor_type() != 4 ){
    // due to a bug in the D3PD maker we have to recalculate the average number of interactions
    // per bunch crossing
    float AverageInteractions = (event->eventinfo.lbn() == 1 &&
                                 int(event->eventinfo.averageIntPerXing() + 0.5) == 1) ? 0. : event->eventinfo.averageIntPerXing();
    my_PileupTool->Fill(event->eventinfo.RunNumber(), 
                        event->eventinfo.mc_channel_number(), 
                        event->mcevt[0].weight()[0], 
                        AverageInteractions);    
  }
  return EL::StatusCode::SUCCESS; // go to the next method...
}
  
if( !event->eventinfo.isSimulation() && doMCConfFile ) return EL::StatusCode::SUCCESS; // go to the next method...
		
// (b) if we have already the configuration files for pile-up tool
if( event->eventinfo.isSimulation() && !doMCConfFile ) pileupWeight = LumiWeight();
  
// (c) don't apply pile-up re-weighting
if( !event->eventinfo.isSimulation() || !applyPileupReweighting ) pileupWeight = 1.;

(c.1) add the function LumiWeight:

double WHlvbbCode :: LumiWeight () {
  double weight = 1.; 
  if(event->eventinfo.isSimulation() && applyPileupReweighting && !doMCConfFile){
    weight = my_PileupTool->GetCombinedWeight(event->eventinfo.RunNumber(), 
                                              event->eventinfo.mc_channel_number(), 
                                              event->eventinfo.averageIntPerXing() );
  }
  if(!event->eventinfo.isSimulation() || !applyPileupReweighting || doMCConfFile) weight = 1.;  
  return weight;
}
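
The resulting pileupWeight is meant to multiply the other MC event weights whenever an event enters a histogram or the skim tree. A minimal sketch, assuming a hypothetical histogram h_mbb and variable mbb that are not defined on this page:

// fill a histogram with the pile-up weight times the generator weight (MC only)
if( event->eventinfo.isSimulation() )
  h_mbb->Fill( mbb, pileupWeight * event->mcevt[0].weight()[0] );
else
  h_mbb->Fill( mbb );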

(d) in the finalize method:

if(doMCConfFile){
  if(doGRID){
    TFile *f = wk()->getOutputFile("outFile");
    my_PileupTool->WriteToFile(f);
  } else my_PileupTool->WriteToFile("mcConfFile.root");
}

4.7 - Jet selector
