MC Truth Level Information on AODs for Athena 11.0.42

We want to select a sample which contains charginos and neutralinos for the later selection-cut analysis in the exclusive trilepton channel. "Exclusive" stands for a veto on hadronic activity.

Using the PdgId for selecting the signal sample

The following was found for a selected sample where the presence of a chargino (ch-o) and a NLn-o (NL = next-to-lightest) in an event was required via a PdgId check. There are three possible scenarios:

  • both the ch-o and NLn-o are produced at a common vertex
  • one of the particles comes from a preceding decay of squarks
  • both particles come from preceding decays of squarks
However, there are issues with the MC info stored on AOD:
  • there are NLn-o's at the end of the decay chain which seem to decay to nothing, i.e. they have an end vertex but no outgoing particles associated with it.
  • ch-o's and NLn-o's can be produced at the end vertex of a squark or a gaugino, but no other particles are produced at the same vertex.
  • ch-o's and NLn-o's are produced at a vertex without a decaying parent and without any particles produced at the same vertex.
  • there are particles with PdgId=0
The first two points could be explained by the production of bare quarks, which are not kept track of until they form meson states. The third issue could be due to the primary vertex information not being stored properly, i.e. one might find another SUSY particle at the beginning of another decay chain which has no parent at its production vertex. The last point needs to be checked.

Cross Sections and Normalisation of SUX SUSY Points

The main question is whether the values given in the table in xxx are the total SUSY Xsec. In fact, Baer and Tata calculate in their book a Xsec for ch-o+NLn-o production of the order of 10 pb for a squark mass of approx. 500 GeV and beta=5. Check with Alan. Right now we are looking into SU3, which has a ~60% BR of NLn-o to staus via NLn-o --> stau + tau --> tau + Ln-o + tau. These taus decay to e/mu with a BR of ~17%, so the leptonic signal fraction is as high as for SU2. Investigate this further...

More information on SUX SUSY Points:

Our analysis of the kinematical properties of the SU2 point:

The SU2 point is in the so-called "Focus Point" region. The total SUSY Xsec is 4.9 pb. We employ the following cuts on MC level:
  • presence of at least one s_chi_plus_1 & s_chi_0_2/3 pair with a common vertex
  • the chargino-neutralino vertex is required not to have any parent particle
The resulting BR is ~31.6% (3155/10000 MC events). We plan to check it with the Herwig generator. Further, we will check whether the signal improves with our cuts when we run on the full set of SUSY processes, since the saving in terms of storage space is only a factor of 3.

Castor

Castor workbook

Copying files from Castor to our local system

log on to t2ui02 (from ppslgen or pplxgenng only) and use:
#!/bin/bash
# exit with an error code if no file reference is given
if [ $# -lt 1 ]; then
    echo "usage: $0 [file reference]"
    exit 1
fi
LOCATION="gsiftp://castorgrid.cern.ch/castor/cern.ch/user/o/obrandt/HLA/"
FILE="$1"
SOURCE="$LOCATION$FILE"
TARGET="file:///userdisk3/brandt/work/atlas/$FILE"
echo "Copying file $SOURCE to $TARGET"
globus-url-copy -vb -dbg $SOURCE $TARGET

Useful Castor locations:

  • Many MC files for the csc are to be found on CERNCAF: /castor/cern.ch/grid/atlas/caf/csc11/
  • For our analysis of the SU2 kinematics we are using: /castor/cern.ch/grid/atlas/caf/csc11/csc11.005402.SU2_jimmy_susy.merge.AOD.v11004206/
  • For the main background (WZ) the following dataset has been used: /castor/cern.ch/grid/atlas/caf/csc11/csc11.005987.WZ_Herwig.recon.AOD.v11004209_tid003082/ totalling 45 files of 1000 events each (Y=2-6, Z arbitrary): csc11.005987.WZ_Herwig.recon.AOD.v11004209_tid003082._0000Y.pool.root.Z --> the same as on the grid w/o the tid003082 stamp, i.e. blessed.

Useful Castor commands

  • to prevent Athena jobs from crashing while waiting for a file, one should check that the files are actually staged:
stager_qry -M /castor/.../file.root
  • move a file:
rfrename /castor/cern.ch/user///MyDirectory/MyOldFileName /castor/cern.ch///MyDirectory/MyNewFileName
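The staging check can be wrapped in a small polling loop, so that an Athena job is only launched once its input file is actually on disk. A minimal sketch, assuming the stager_qry command above; the wait_staged name, the STAGER_QRY override and the commented example path are purely illustrative, not part of any Castor tool:

```shell
# wait_staged FILE: poll the Castor stager until FILE reports STAGED.
# STAGER_QRY can be overridden (e.g. for a dry run); by default it is
# the real "stager_qry -M" query shown above.
wait_staged() {
    local file="$1"
    until ${STAGER_QRY:-stager_qry -M} "$file" | grep -q STAGED; do
        echo "waiting for $file to be staged..."
        sleep 60
    done
    echo "$file staged"
}
# before launching Athena one would call, e.g.:
# wait_staged /castor/cern.ch/grid/atlas/caf/csc11/.../AOD._00001.pool.root
```

The sleep interval is a guess; staging from tape can take much longer, so this belongs at the start of the batch job rather than on the interactive node.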

Submitting jobs to LXBATCH

The command line is:

bsub -q 8nm -o MySU2_01.log MySU2_01.sh
Available queues:
  • 8nm (8 minutes)
  • 1nh (1 hour)
  • 8nh (8 hours)
  • 1nd (1 day)
  • 1nw (1 week)
Check the status:
bjobs
or alternatively, with the jobID as a parameter, for the "long format" with detailed information:
bjobs -l jobID
Kill job with
bkill jobID
Resource requirements are specified with -R, e.g. to require at least 500 MB in /tmp, 200 MB of free memory, 400 MB of swap and 1 GB of pool disk space:
bsub -q 8nm -o MySU2_01.log -R "tmp>500&&mem>200&&swp>400&&pool>1000" MySU2_01.sh
It is advisable to copy Castor files to your execution directory on the batch system first; it is given by the $WORKDIR variable. The space in the /tmp directory of the lxbatch node is limited to ~0.5 GB.
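Putting the batch advice together, a job script such as MySU2_01.sh could follow the pattern below. This is only a sketch: rfcp and athena.py are assumed to be available on the lxbatch node, and all input/output file names and the job-options name are made up for illustration. The template is written out with a here-document so it can be inspected before submission:

```shell
# write a template batch job script; all file names below are examples
cat > MySU2_01.sh <<'EOF'
#!/bin/bash
# work in the node-local directory ($WORKDIR), not in the ~0.5 GB /tmp
cd $WORKDIR
# copy the input AOD from Castor to the execution directory first
rfcp /castor/cern.ch/grid/atlas/caf/csc11/input.AOD.pool.root .
# run the Athena job (job-options name is hypothetical)
athena.py MySU2_topOptions.py > athena.log 2>&1
# copy the results back, e.g. to the user Castor area
rfcp histos.root /castor/cern.ch/user/o/obrandt/HLA/
EOF
chmod +x MySU2_01.sh
```

It would then be submitted as shown above, e.g. bsub -q 8nh -o MySU2_01.log MySU2_01.sh.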

Add your code to CVS at CERN:

These directions are for CERN (lxplus). Adapted from Atlas/DoItYourselfEventViewTutorial.

Warning: Be very careful here. To create a user CVS package called MyPackage, you first need to have the files you want to import. You should have:

  • no CVS directories
  • no backup ("~") files, no cleanup.sh, no i686 directories, etc.
Once you only have .cxx, .h, requirements, job options etc., go to the root of the package and do:
cvs import -m "Creating package"  users/$USER/MyPackage ATLAS MyPackage-00-00-00
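The cleanup step described above can be scripted. A minimal sketch, demonstrated on a scratch directory pkg/ with made-up file names so nothing real is touched; run the equivalent find/rm commands from your actual package root before the import:

```shell
# set up a scratch package with typical leftovers (names are examples)
mkdir -p pkg/CVS pkg/i686
touch pkg/MyAlg.cxx pkg/MyAlg.h pkg/requirements 'pkg/MyAlg.cxx~' pkg/cleanup.sh
cd pkg
# remove editor backup files, stale CVS metadata and build directories
find . -name '*~' -type f -delete
rm -rf CVS i686 cleanup.sh
ls    # only the sources, headers and requirements remain
```

After this the directory contains only what should go into the repository, and the cvs import above can be run safely.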
You can see it in the CVS web browser here:

http://atlas-sw.cern.ch/cgi-bin/viewcvs-atlas.cgi/users/

-- OlegBrandt - 08 Feb 2007

Topic revision: r10 - 2007-03-23 - OlegBrandt
 