Basic Information

Summary of MC15 Monte Carlo samples produced for studies of a dark matter mediator decaying to two quarks. The events are a private production at truth level (DAOD_TRUTH). The versions used for production are:

| Task | Athena Release | Command |
| Generate Events | 19.2.4.4.2,MCProd | Generate_tf.py --ecmEnergy=13000. --maxEvents=1000 --runNumber=${RUNNUMBER} --firstEvent=1 --randomSeed=${SEED} --outputEVNTFile=evgen.root --jobConfig=${DSNAME}.py |
| DAOD_TRUTH1 | 20.1.8.6,AtlasDerivation,gcc48 | Reco_tf.py --inputEVNTFile evgen.root --outputDAODFile events.pool.root --reductionConf TRUTH1 |
| DAOD_TRUTH3 | 20.1.8.6,AtlasDerivation,gcc48 | Reco_tf.py --inputEVNTFile evgen.root --outputDAODFile events.pool.root --reductionConf TRUTH3 |

Different datasets correspond to different combinations of the scanned parameters (masses, couplings, model). The naming scheme is:

MC15.${RUNNUMBER}.MGPy8EG_dm${MODEL}_${PROCESS}_mR${mR}_mDM${mDM}_gSM${gSM}_gDM${gDM}

The run number must agree with the process. The available run number/process combinations are:

| Run Number | Process | Description |
| 99999# | dijet_Np# | Production of a mediator decaying to two jets, in association with # (up to 2) extra jets (i.e. p p > xi > j j #*j). |
| 999980 | dijetgamma | Production of a mediator decaying to two jets, in association with one photon (i.e. p p > xi a, xi > j j). |
| 99997# | dijet_Np# | Production of a mediator decaying to two jets, in association with # (up to 1) extra jets (i.e. p p > xi > j j #*j). |
| 99996# | dijet_Np# | Production of a mediator decaying to two jets (no FSR), in association with # (up to 2) extra jets (i.e. p p > xi #*j, xi > j j). |
| 99995# | dijet_Np# | Production of a mediator decaying to two jets (no FSR), in association with # (up to 1) extra jets (i.e. p p > xi #*j, xi > j j). |
| 99994# | dijet_Np# | Production of a mediator decaying to two jets (no FSR), in association with # extra jets (i.e. p p > xi #*j, xi > j j). No merging is applied. |
| 99993# | dijetZXX | Production of a mediator decaying to two jets (no FSR), in association with a Z decaying into XX. #=1 means X=mu. |

The following models from the Dark Matter Forum are available.

| Model name in DS | Model name in ATLAS | Description |
| A | dmA | Axial mediator |
| V | dmV | Vector mediator |
| S | DMScalarMed_loop | Scalar mediator |
| PS | DMPseudoscalarMed_loop | Pseudo-scalar mediator |

The m/g values are the assumed masses and couplings. Masses are given in GeV. For the couplings, the decimal point is replaced with the letter p (e.g. gSM0p16 means gSM = 0.16). The variables are listed below, followed by a short sketch of how a dataset name is assembled from them:

| Variable | Description |
| mR | Mediator mass |
| mDM | Dark matter mass |
| gSM | Coupling of the mediator to quarks |
| gDM | Coupling of the mediator to dark matter |
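
For illustration, a minimal sketch (not part of the production scripts) of how a dataset name can be assembled from these variables is shown below. The two-decimal formatting of the couplings is an assumption inferred from the examples on this page.

def coupling_tag(value):
    # Format a coupling with two decimals, replacing '.' with 'p' (e.g. 0.16 -> '0p16').
    return ('%.2f' % value).replace('.', 'p')

def dataset_name(runnumber, model, process, mR, mDM, gSM, gDM):
    # Build the MC15.${RUNNUMBER}.MGPy8EG_dm${MODEL}_... name from the parameters above.
    return 'MC15.%d.MGPy8EG_dm%s_%s_mR%d_mDM%d_gSM%s_gDM%s' % (
        runnumber, model, process, mR, mDM, coupling_tag(gSM), coupling_tag(gDM))

# Example point, matching the job option used later on this page:
print(dataset_name(999950, 'A', 'dijet_Np0', 300, 10000, 0.16, 1.50))
# -> MC15.999950.MGPy8EG_dmA_dijet_Np0_mR300_mDM10000_gSM0p16_gDM1p50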

A list of the available filelists is at /afs/cern.ch/user/k/kkrizka/public/mcgen/filelists. For each dataset there are two files: dataset.txt and dataset.config. The first contains a list of ROOT files with the events stored on faxbox. The second contains two values: xsec, the cross-section in mb, and nEvents, the number of accepted events used for normalization.

Sample Normalization

Available datasets are listed below. For normalization, the cross-section and the number of accepted events are shown. The per-event weight needed to normalize to the cross-section is then xsec / Nacc. Note that the number of accepted events is not the same as the number of events written to the file. For more information on the normalization, see the MadGraphMatching#Adding_the_samples twiki. There is still some confusion about the correct way to configure the merging (see the Sept 17 talk), but "Npmax=1 w/o FSR at Tree Level" appears to be the best choice.
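
As an illustration, a minimal sketch of applying this normalization from a dataset.config file is shown below. It assumes the config file stores the two values as simple key = value lines (the exact format may differ), that the cross-section is in mb, and that the target integrated luminosity is given in pb^-1 (1 mb = 1e9 pb).

def read_config(path):
    # Parse key = value lines from a dataset.config file (format assumed for this sketch).
    values = {}
    with open(path) as f:
        for line in f:
            if '=' in line:
                key, value = line.split('=', 1)
                values[key.strip()] = float(value.strip())
    return values

config  = read_config('filelists/dataset.config')  # hypothetical path
xsec_pb = config['xsec'] * 1e9                     # mb -> pb
n_acc   = config['nEvents']                        # accepted events, not events on file

lumi = 3200.0                                      # example: 3.2 fb^-1 expressed in pb^-1
event_weight = lumi * xsec_pb / n_acc              # weight applied to each event
print('per-event weight: %g' % event_weight)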

Running Instructions

All of the scripts to create the job options for different points and run the necessary transforms are available on SVN.
svn co svn+ssh://svn.cern.ch/reps/atlasphys-exo/Physics/Exotic/JDM/DiJetISR/Run2/Code/mcgen/trunk mcgen

Generating Job Options

The Athena flow for generating the samples is handled by the MadGraph control job option files in SVN, named MadGraphControl_MGPy8EG_dijet*.py. They must be configured (i.e. the mass points set) by creating custom job option files. An example would be a file called MC15.999950.MGPy8EG_dmA_dijet_Np0_mR300_mDM10000_gSM0p16_gDM1p50.py with hopefully self-explanatory contents:
model       = 'dmA'      # model name in ATLAS (see the model table above)
mR          = 300        # mediator mass [GeV]
mDM         = 10000      # dark matter mass [GeV]
gSM         = 0.16       # coupling of the mediator to quarks
gDM         = 1.50       # coupling of the mediator to dark matter
widthR      = 3.054824   # mediator width, from the DarkMatterWidthCalculator
merging     = True       # apply the matching/merging of the extra-jet samples

include("MadGraphControl_MGPy8EG_dijet.py")

To create such job option files in bulk for different points, the gencondor.py script is provided. Configure it by editing the following variables at the top of the script.

| Variable | Description |
| mRs | List of resonance masses to generate |
| mDMs | List of DM masses to generate |
| gSMs | List of Z' to quark couplings to generate |
| gDMs | List of Z' to DM couplings to generate |
| models | List of models to loop over |
| FAXBOX | Directory containing already existing samples (for Condor) |

The Cartesian product of these lists is taken to obtain all of the sample points (a sketch of this loop is shown below). When the script is run, the job option files are saved into the MC15 directory.
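
For illustration, a minimal sketch of the Cartesian-product loop is shown below. It is not the actual gencondor.py code; the parameter values are placeholders and only the list names follow the table above.

import itertools

# Hypothetical parameter lists, mirroring the configuration variables above.
models = ['A']              # model names as used in the dataset name (dmA, ...)
mRs    = [300, 400, 500]    # mediator masses [GeV]
mDMs   = [10000]            # dark matter masses [GeV]
gSMs   = [0.16]             # mediator-quark couplings
gDMs   = [1.50]             # mediator-DM couplings

# One sample point per element of the Cartesian product of the five lists.
for model, mR, mDM, gSM, gDM in itertools.product(models, mRs, mDMs, gSMs, gDMs):
    # The real script writes one MC15/MC15.${RUNNUMBER}.*.py job option per point;
    # here the parameters of each point are only printed.
    print(model, mR, mDM, gSM, gDM)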

The MadGraph param card is based on MadGraph_param_card_dmA.dat (or MadGraph_param_card_ScalarDM.dat for the dmS/dmPS models). It is edited by the control job options to contain the correct masses and couplings. The width is set by the DarkMatterWidthCalculator code. The default ATLAS run card is used, but with xqcut=0. To modify other run card options, edit the build_run_card function in the control job option script.
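
As an illustration of the kind of editing the control job options perform, a minimal string-substitution sketch is shown below. The placeholder tokens and file names are assumptions for this sketch and not the actual MadGraphControl implementation.

# Example point (values taken from the job option example above).
mR, mDM, gSM, gDM, widthR = 300, 10000, 0.16, 1.50, 3.054824

# Hypothetical placeholder tokens in a param-card template.
replacements = {
    '%MR%':     str(mR),      # mediator mass
    '%MDM%':    str(mDM),     # dark matter mass
    '%GSM%':    str(gSM),     # mediator-quark coupling
    '%GDM%':    str(gDM),     # mediator-DM coupling
    '%WIDTHR%': str(widthR),  # mediator width from the DarkMatterWidthCalculator
}

with open('MadGraph_param_card_dmA.dat.template') as f:   # assumed template file
    card = f.read()
for token, value in replacements.items():
    card = card.replace(token, value)
with open('param_card.dat', 'w') as f:
    f.write(card)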

Running on Condor

The gencondor.py script also creates a DAGMan file, gencondor.dag, that can be used to create the samples on a Condor system. It runs over all of the sample points that do not already exist, with 10 seeds per point. The FAXBOX variable points to a directory with the existing samples, organized using the ${dsname}/evgen.${seed}.root and ${dsname}/DAOD_${derivation}.${seed}.pool.root naming scheme. If any of these files (evgen/TRUTH1/TRUTH3) is missing, the entire chain is run for that point and seed; a sketch of this check is shown below.
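
A minimal sketch of that completeness check follows. It is illustrative only: the directory layout mirrors the naming scheme above, and the FAXBOX path is an assumption.

import os

FAXBOX = '/path/to/faxbox/MC15'   # assumed local view of the existing-sample area
dsname = 'MC15.999950.MGPy8EG_dmA_dijet_Np0_mR300_mDM10000_gSM0p16_gDM1p50'
seed   = 1

expected = [
    os.path.join(FAXBOX, dsname, 'evgen.%d.root' % seed),
    os.path.join(FAXBOX, dsname, 'DAOD_TRUTH1.%d.pool.root' % seed),
    os.path.join(FAXBOX, dsname, 'DAOD_TRUTH3.%d.pool.root' % seed),
]

# If any output is missing, the whole chain (generation plus both derivations) is rerun.
if not all(os.path.exists(path) for path in expected):
    print('point %s, seed %d needs to be (re)generated' % (dsname, seed))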

The jobs are then submitted to Condor, after setting up a grid certificate, using:

condor_submit_dag gencondor.dag

The Condor jobs are hard-coded to copy the output to root://faxbox.usatlas.org://faxbox2/user/kkrizka/MC15/. This can be changed by editing condor_generate.sh and condor_convert.sh.

After the generation is complete, filelists can be created by running the cleanuplogs.sh and makeFilelists.sh scripts. The first sorts the generation logs fetched by the Condor jobs into the genlogs directory. The second then loops over the files inside genlogs and creates filelists (stored inside filelists) that can be used with xAODAnaHelpers. For the cross-section value of each point, it takes the average over all of the jobs for that point (see the sketch below).
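
For illustration, a minimal sketch of the cross-section averaging is shown below. This is not the actual makeFilelists.sh logic; the per-job values and the treatment of the accepted-event count are assumptions for this sketch.

# Illustrative averaging of per-job cross-sections for one dataset.
per_job_xsec = [1.23e-6, 1.25e-6, 1.22e-6]   # example values in mb, one per Condor job
per_job_nacc = [950, 940, 960]               # example accepted events per job

xsec    = sum(per_job_xsec) / len(per_job_xsec)  # average cross-section for the point
nEvents = sum(per_job_nacc)                      # assumed: total accepted events for normalization

with open('filelists/dataset.config', 'w') as f:
    f.write('xsec = %g\n' % xsec)
    f.write('nEvents = %d\n' % nEvents)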

Running Locally

A test sample can also be created by running locally.

./generate.sh
./convert.sh

Right now, these scripts are very rudimentary. generate.sh needs to be edited to point to the job option of interest, and convert.sh to the desired output derivation. Their job is to set up the correct Athena release and produce one output file per step: generate.sh outputs evgen.root containing the generated events, and convert.sh outputs DAOD.pool.root with the desired derivation.

-- KarolKrizka - 2015-08-17
