Monte Carlo generators - How To (for MC@NLO and Sherpa)

MC@NLO

The information below on running MC@NLO and generating the matrix-element/4-vector files was gathered from this webpage: Mc@NLO_howto

Installation

To work with MC@NLO, you need a few additional packages: Jimmy, Herwig, and parton distribution functions (LHAPDF). (Access to CERN-provided packages such as CLHEP may also be helpful - I have not run on a standalone machine outside of CERN without AFS, so I cannot say whether it is strictly required.) You can get the Herwig package from the Herwig homepage. Click on "The Herwig source code" to get the latest version (6.5). Download the source code and the corresponding include files 'HERWIG65.INC' and 'herwig6510.inc' into a directory called Herwig. Compile the HERWIG source code with "g77 -c herwig6510.f"; this produces herwig6510.o.

Get the Jimmy package from the Jimmy homepage and download the latest version (4.31). Instructions for installing Jimmy together with HERWIG can be found here. Where the instructions say "include directory", this can either be the include directory inside the unpacked jimmy directory, or the path where you tell it to find the HERWIG headers - though in my experience the latter only works intermittently. What I do instead is copy the two include files (HERWIG65.INC, renamed to herwig65.inc, and herwig6510.inc) into Jimmy's include directory and then install Jimmy. The "make install" step must be done as 'root'; I tried routing it to another directory without 'root' privilege, but that proved problematic.

Get the LHAPDF package from the lhapdf homepage. Download the latest version (we use 5.2.3) into a directory called LHAPDF and unpack it. Then go into lhapdf-5.2.3 and install it with './configure', then 'make', and then, as 'root', 'make install'.

Get the MC@NLO package from the MC@NLO homepage. Note that for ATLAS we mainly use version 3.1, which is not available from the MC@NLO homepage; you can find it here. Untar the contents into a directory called EXECUTABLE; further instructions are exactly as in Mc@NLO_howto. In the EXECUTABLE directory, download and unpack the data.tar.gz file into a directory called PDF. (You get this file by clicking on the 'grid files' link on the MC@NLO homepage.)
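The steps above assume a common working area. A minimal sketch of that layout (BASE is an example path - adjust it to your own area; the tarballs themselves still have to be downloaded by hand as described above):

```shell
# Example working-area layout assumed throughout this guide.
# BASE is an example path - substitute your own.
BASE=${BASE:-$HOME/McAtNlo}
mkdir -p "$BASE/Herwig"           # herwig6510.f + HERWIG include files
mkdir -p "$BASE/Jimmy"            # Jimmy 4.31 sources
mkdir -p "$BASE/LHAPDF"           # lhapdf-5.2.3 unpacked here
mkdir -p "$BASE/EXECUTABLE/PDF"   # MC@NLO 3.1 sources + grid files
```

The directory names match the ones used in the Integration and Event generation steps below.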

Integration

Now, in the EXECUTABLE directory:
cp MCatNLO.Script MCatNLO.Script.integration
cp MCatNLO.inputs MCatNLO.inputs.Integration
and edit the following lines in the latter one -
ECM=14000 #for LHC
HVQMASS = 175 # heavy quark mass
WMASS=80.42
WWIDTH=2.124
ZMASS=91.19
ZWIDTH=2.495
HGGMASS=120 #or your preferred value
HGGWIDTH=0.0049 #or your preferred value

IPROC=-1706    # 1706 is for ttbar, choose yours
PART1=P
PART2=P    #colliding particles

PDFGROUP=CTEQ  #choose yours
PDFSET=56

*PREFIX=TTbar_  #you can choose anything
NEVENTS=10   #enough for the integration step
BASES=ON  #will be OFF for the Event generation step
HWPATH="/home/siva/McAtNlo/Herwig"
HERWIGVER="herwig6510.o"
PDFPATH="/home/siva/McAtNlo/EXECUTABLE/PDF"
LHAPATH="/home/siva/McAtNlo/LHAPDF/lhapdf-5.2.3"

compileNLO  #for our purpose we stick to this

Then edit the file Integration.Inputfile (this file supplies the run-time inputs to MCatNLO; some of them are already set in MCatNLO.inputs.Integration, but you can override them here):

"TTbar_Base"
"TTbar_Prod"
14000 1 1 1 1
-1706
175
0.32 0.32 0.5 1.5 5 0.75  ! u d s c b g
p p
cteq 56
-1
MS
10                   ! number of events
1
4178323            ! random seed - change it when generating the same process multiple times
0.3
5 5                    ! number of iterations allowed - the more the better (normally we use 20 20) - only for integration

Now, to do the integration, first produce the executable by compiling:

./MCatNLO.inputs.Integration
This will create a Linux directory inside the EXECUTABLE directory, and you will find the executable there, named with your *PREFIX setting, i.e. "TTbar_NLO_EXE_THISLIB".

Now copy the PDF set you want to use (for me it is cteq6m):

cp PDF/cteq6m.lbl cteq6m

Then, perform the integration:

./Linux/TTbar_NLO_EXE_THISLIB < Integration.Inputfile

This will take several minutes, depending on the number of iterations. Then save the integration results for the event generation step in a directory called CTEQ6M_BASICS:

mkdir ../CTEQ6M_BASICS
cp TTbar_Base_a_bs.data ../CTEQ6M_BASICS/.
cp TTbar_Base_b_bs.data ../CTEQ6M_BASICS/.
cp TTbar_Base.integrals ../CTEQ6M_BASICS/.
cp cteq6m ../CTEQ6M_BASICS/.

This finishes the Integration step.

Event generation

For event generation, we now do:
cp MCatNLO.inputs.Integration MCatNLO.inputs.Eventgeneration
cp Integration.Inputfile Eventgeneration.Inputfile

Make the following changes to MCatNLO.inputs.Eventgeneration:

BASES=OFF
You can also change NEVENTS=100, but that is optional.

The Eventgeneration.Inputfile is exactly the same as Integration.Inputfile except for the last line, which gives the number of iterations. Set that to:

0 0

Then create the executable and test it in the same way as above. First, remove the Linux directory:

rm -rf Linux
./MCatNLO.inputs.Eventgeneration
This will create the Linux directory with the executable in it. Now run:
./Linux/TTbar_NLO_EXE_THISLIB < Eventgeneration.Inputfile

This runs very quickly. Then copy a few files to the CTEQ6M_BASICS directory:

cp ./Linux/TTbar_NLO_EXE_THISLIB ./../CTEQ6M_BASICS/.
cp ./Eventgeneration.Inputfile ./../CTEQ6M_BASICS/Eventgeneration.Inputfile.BASIC

Now we generate a large number of MEs/4-vec files from the CTEQ6M_BASICS directory.

cd ../CTEQ6M_BASICS

Edit the Eventgeneration.Inputfile.BASIC file, replacing three lines with placeholders:

"TTbar_Base"
XXX_Filename
14000 1 1 1 1
-1706
175
0.32 0.32 0.5 1.5 5 0.75  ! u d s c b g
p p
cteq 56
-1
MS
XXX_Nevents_Per_Job     !number of events
1
XXX_Random_Seed         ! random seed - change it when generating the same process multiple times
0.3
0 0           ! number of iterations allowed - the more the better (normally we use 20 20) - required only for integration
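For a single job, the three XXX placeholders can be filled in with sed. A minimal sketch (the stand-in template below only mimics the placeholder lines - in a real run you substitute into the full Eventgeneration.Inputfile.BASIC; the job name, event count, and seed are example values):

```shell
# Stand-in template carrying the three XXX placeholders (a real run
# uses the full Eventgeneration.Inputfile.BASIC prepared above).
cat > Eventgeneration.Inputfile.BASIC <<'EOF'
"TTbar_Base"
XXX_Filename
XXX_Nevents_Per_Job
XXX_Random_Seed
EOF

JOB=TTbar_001     # example value for XXX_Filename
NEVENTS=5000      # example value for XXX_Nevents_Per_Job
SEED=4178323      # example value for XXX_Random_Seed

# Substitute the placeholders to produce the per-job input file.
sed -e "s/XXX_Filename/${JOB}/" \
    -e "s/XXX_Nevents_Per_Job/${NEVENTS}/" \
    -e "s/XXX_Random_Seed/${SEED}/" \
    Eventgeneration.Inputfile.BASIC > "Eventgeneration.Inputfile.${JOB}"
```

The resulting Eventgeneration.Inputfile.TTbar_001 can then be fed to the executable on stdin, as in the steps above.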

Create an output directory:

cd ..
mkdir MatrixElement
mkdir MatrixElement/InputAndLogfiles

Copy this Python script into the current directory and make the following changes:

MyFilename = "TTbar"    # will be used as XXX_Filename
base_dir     = "/home/siva/McAtNlo/CTEQ6M_BASICS"
MatrixElement_dir = "/home/siva/McAtNlo/MatrixElement"
Nevents = 5000  #Number of events per file  # will be used as XXX_Nevents_per_job
Njobs    = 10      # Number of files 

Make the script executable and run it:

chmod u+x MCatNLO_MatrixElements.py
./MCatNLO_MatrixElements.py

This should produce all the MEs/4-vec files in the MatrixElement directory; you can then use them as input to Athena generation.
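The per-job loop the script performs can be sketched in shell. This is only a sketch under the example settings above (the names, paths, and per-job seed offset are assumptions; the executable-run line is guarded so the sketch does nothing if the executable is absent):

```shell
# Sketch of the per-job loop done by MCatNLO_MatrixElements.py.
# All names and paths are example settings from the text above.
MYFILENAME=TTbar
BASE_DIR=${BASE_DIR:-$HOME/McAtNlo/CTEQ6M_BASICS}
ME_DIR=${ME_DIR:-$HOME/McAtNlo/MatrixElement}
NEVENTS=5000     # events per file
NJOBS=10         # number of files

mkdir -p "$ME_DIR/InputAndLogfiles"
for i in $(seq 1 "$NJOBS"); do
    JOB=${MYFILENAME}_$(printf '%03d' "$i")
    SEED=$((4178323 + i))     # unique seed per job (the offset is an assumption)
    sed -e "s/XXX_Filename/${JOB}/" \
        -e "s/XXX_Nevents_Per_Job/${NEVENTS}/" \
        -e "s/XXX_Random_Seed/${SEED}/" \
        "$BASE_DIR/Eventgeneration.Inputfile.BASIC" \
        > "$ME_DIR/InputAndLogfiles/Eventgeneration.Inputfile.${JOB}"
    # Run the event-generation executable if it is present:
    if [ -x "$BASE_DIR/TTbar_NLO_EXE_THISLIB" ]; then
        ( cd "$BASE_DIR" && ./TTbar_NLO_EXE_THISLIB \
              < "$ME_DIR/InputAndLogfiles/Eventgeneration.Inputfile.${JOB}" )
    fi
done
```

Each job gets its own input file (and hence its own seed), so the same process can be generated many times without repeating events.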

Sherpa

Installation

You can get the Sherpa package from the Sherpa homepage. The link points to Sherpa version 1.0.10, but you can click on Downloads in the left bar and choose the latest version. The page also lists the known bugs, for which patches are provided.

Download the zip file and unpack it. Working with Sherpa is described in the howto-1.0.10.pdf file found on the above webpage. Once you have unpacked the source, first download the patches and then apply them:

for p in *.patch; do patch -p0 < "$p"; done   # better to apply them one at a time - I haven't tried doing all at once

Then to install Sherpa, we just do

./TOOLS/makeinstall -t

Integration

To run Sherpa, you first need all your process files set up. Let's say we want to generate Zmumu+3lightjets events. First create a directory SHERPA-1.0.10/Run/Z3ljetstomumu. It will contain the following files: Shower.dat, Selector.dat, Run.dat, Particle.dat, Model.dat, MI.dat, MICuts.dat, ME.dat, Lund.dat, ISR.dat, Integration.dat, Hadron.dat, Beam.dat, Analysis.dat, Processes.dat, Fragmentation.dat. All of these, zipped in one folder that you can use as an example, can be found here.

The changes to be made are: 1) in Fragmentation.dat, set DECAYPATH=Decaydata/ instead of whatever is in the example file; 2) choose your processes in Processes.dat (the format is fairly self-explanatory); 3) edit Selector.dat (again self-explanatory); 4) edit Run.dat to set the random seed, though it can also be set when running the executable, as shown in the next subsection; 5) edit Shower.dat and ISR.dat as per your requirements.

Further, we need to create directories named Evgen, Result, Process, and Analysis inside the Z3ljetstomumu directory. These are needed mainly for the event generation step, but it is better to create them beforehand:

mkdir Evgen
mkdir Analysis
mkdir Result
mkdir Process

Once this is all ready, we can run the integration step in Sherpa. From the Run directory, do:

./Sherpa PATH=Z3ljetstomumu

On the first run this will crash and ask you to run 'makelibs', which you must do from the Z3ljetstomumu directory:

cd Z3ljetstomumu
./makelibs

Event generation

Once ./makelibs has run successfully, we are ready for event generation. Go back to the Run directory and run the following:
./Sherpa PATH=Z3ljetstomumu RESULT_DIRECTORY=Z3ljetstomumu/Result \
  EVT_FILE_PATH=Z3ljetstomumu/Evgen SHERPA_OUTPUT=Z3ljetstomumu0001 \
  EVENTS=10000 RANDOM_SEED=132 300 FILE_SIZE=6500 OUTPUT_PRECISION=11 \
  ANALYSIS=1 ANALYSIS_OUTPUT=Z3ljetstomumu/Analysis/

This will take several hours depending on the number of events, and will create the MEs/4-vec files in the Evgen directory, which can then be used as input for generating events with Athena.

The interface between Athena and Sherpa is pretty straightforward. For MCatNLO it is not - at least, it did not register in my mind.

-- HalasyaSivaSubramania - 16 Aug 2007

Topic revision: r2 - 2007-08-17 - HalasyaSivaSubramania