


Some bad runs due to a pedestal drift in HCAL HB(14,31,1), which degraded the JetMET data quality.


So, new JSON at /afs/

don't have

NEW today:
<    '140160:1-140160:217',
<    '140180:1-140180:55',
<    '140181:1-140181:13',

No longer present today (but present yesterday):

>    '139779:4-139779:52',
>    '139780:4-139780:75',
>    '139781:4-139781:200',
>    '139783:4-139783:36',
>    '139784:4-139784:71',
>    '139786:4-139786:101',
>    '139788:4-139788:35',
>    '139789:4-139789:112',
>    '139790:4-139790:21',
>    '139965:209-139965:213',
>    '139966:1-139966:51',
>    '139967:1-139967:55',
>    '139968:1-139968:5',
>    '139969:1-139969:78',
>    '139971:1-139971:304',
>    '139972:1-139972:38',
>    '139973:1-139973:118',
>    '139974:1-139974:53',
>    '139975:1-139975:15',
>    '139975:17-139975:54',
>    '140058:111-140058:224',
>    '140059:1-140059:610',
>    '140059:613-140059:1160',
>    '140070:1-140070:11',
>    '140076:1-140076:60',
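The added/removed lists above are just a diff of yesterday's and today's good-run JSON files. A minimal Python sketch (the `load_runs` helper and file names are hypothetical; the dict format is the standard CMS certification JSON, run number mapped to lumi-section ranges):

```python
import json

def load_runs(path):
    """Load a CMS certification JSON: {"run": [[first_lumi, last_lumi], ...], ...}."""
    with open(path) as f:
        return json.load(f)

def diff_runs(old, new):
    """Return (added, removed) run numbers between two certification dicts."""
    added = sorted(set(new) - set(old), key=int)
    removed = sorted(set(old) - set(new), key=int)
    return added, removed

# Toy example mirroring the lists above (lumi ranges abbreviated)
yesterday = {"139779": [[4, 52]], "140058": [[111, 224]]}
today = {"140160": [[1, 217]], "140058": [[111, 224]]}
added, removed = diff_runs(yesterday, today)
```

This only compares run numbers; a full diff would also compare lumi ranges within runs that appear in both files.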

Today DBS says:

/EGMonitor/Run2010A-PromptReco-v4/RECO found 269 results. Run range: 137437-140182.


so run to get: Run2010A-PromptReco-v4_upto140182_ExcHCALbad_139599-140159

For Mu

/Mu/Run2010A-PromptReco-v4/RECO found 269 results. Run range: 137437-140182


so run to get: Mu_Run2010A-PromptReco-v4_upto140182_Exc139599-140159/
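The "_Exc139599-140159" part of the names above means dropping that bad-HCAL run range from the certification JSON before running. A sketch of the filtering, assuming the standard CMS JSON format (`exclude_runs` is a hypothetical helper, not an official tool):

```python
def exclude_runs(good_runs, first_bad, last_bad):
    """Drop all runs in [first_bad, last_bad] from a run -> lumi-ranges dict."""
    return {run: ranges for run, ranges in good_runs.items()
            if not (first_bad <= int(run) <= last_bad)}

# Toy example: 139599 falls in the excluded range, 140160 survives
example = {"139599": [[1, 10]], "140160": [[1, 217]]}
kept = exclude_runs(example, 139599, 140159)
```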



Check CASTOR status:

Twiki page to keep track of the JSON files as they become available


You can find the full list of redigi/rereco samples on these pages:


$ dbsql "find run where dataset=/MinimumBias/Commissioning10-GOODCOLL-May27thSkim_v5/RAW-RECO"

$ dbsql "find file where dataset=/MinimumBias/Commissioning10-GOODCOLL-May27thSkim_v5/RAW-RECO and dataset.status like VALID* and run.number=135149"




Script to pick int. events:

W/Z+Jets madgraph samples:

Baseline W/Z changes:


A good script to hadd the files:

import FWCore.ParameterSet.Config as cms

process = cms.Process("Copy")

process.source = cms.Source("PoolSource",
  skipEvents = cms.untracked.uint32(0),
  fileNames  = cms.untracked.vstring(
    # list the input files here, e.g. 'file:in1.root', 'file:in2.root'
  )
)

process.out = cms.OutputModule("PoolOutputModule",
  fileName = cms.untracked.string('outFile.root')
)

process.outpath = cms.EndPath(process.out)


WP cuts:

Real data info:

process.p = cms.Path(
   # process.hltLevel1GTSeed* #MC->block
   # process.hltPhysicsDeclared* #MC->block
   # ... rest of the sequence
)



  • recoGsfElectrons_gsfElectrons__RECO : the final electrons with all characteristics, made from core electrons.
  • recoGsfElectronCores_gsfElectronCores__RECO : the core electrons are only the association of superclusters and tracks.
  • recoGsfTracks_electronGsfTracks__RECO : input GSF tracks.
  • recoElectronSeeds_electronMergedSeeds__RECO : seeds for GSF tracking.
  • recoSuperClusters_correctedHybridSuperClusters__RECO : input superclusters from barrel.
  • recoSuperClusters_correctedMulti5x5SuperClustersWithPreshower__RECO : input superclusters from endcaps.

To search for files containing some string: grep -R string pathofdirectory


ElectronID :


New physics PDs:


To check list of runs:

To check trigger condition:

Finding data:

Guide L1-Trigger FAQ :

The High-Level Trigger in recent CMSSW releases is structured in two different "menus", i.e. lists of trigger bit definitions. They are often referred to as "lean" menus because they are only meant to contain the main physics triggers which are essential for the start-up phase of LHC, plus some monitoring inclusive triggers, often prescaled and used only to measure efficiency of the main ones, and back-up triggers, to be activated in place of the main ones, if some sub-detector has problems and/or background rates are too high to be sustained. Express stream (ES) bits are special paths with additional requests w.r.t. a normal physics path, used to retain events with particular physics relevance at a very low rate (1-2 Hz). The two menus are named after the instantaneous LHC luminosities where they are expected to be effective: 8E29 and 1E31. General information about them can be found here.


Got the answer to stop the system beep: System --> Preferences --> Sound, select the System Beep tab, and try de-selecting system beep.


A talk at:

hypernews info:

How do I know what trigger table is used for a given run?

Three steps:

1. Go to, click on Run Summary
2. Enter run # (in "CMS RunNumber")
3. When you get the result, click on "HLT Key".


Latest data:

/MinimumBias/Commissioning10-Apr20Skim_GOODCOLL-v1/RAW-RECO (From run 132440 to run 133532) (GlobalTag: GR10_P_V4::All)

/MinimumBias/Commissioning10-GOODCOLL-v9/RAW-RECO (from run 133532) (GlobalTag: GR10_P_V4::All)

Latest MC:

/MinBias/Spring10-START3X_V26A_356ReReco-v1/GEN-SIM-RECO ( Same reco as v8 version of data) (GlobalTag: START3X_V26A::All)

Working in cmslpc:

For Data: /MinimumBias/Commissioning10-Apr20Skim_GOODCOLL-v1/RAW-RECO submit crab jobs 
crab:  544 job(s) can run on 27088744 events.
crab:  Total of 544 jobs created.
crab -submit -c Apr20Skim_GOODCOLL-v1

crab: The CRAB client will not submit more than 500 jobs.
Use the server mode or submit your jobs in smaller groups

crab -submit 500 -c Apr20Skim_GOODCOLL-v1
crab -submit 501-544 -c Apr20Skim_GOODCOLL-v1
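The split into batches above follows from the CRAB client's 500-job limit; the ranges can be generated with a short sketch (`submit_groups` is a hypothetical helper, not part of CRAB):

```python
def submit_groups(n_jobs, max_per_submit=500):
    """Yield 'first-last' job-ID ranges, each no larger than max_per_submit."""
    first = 1
    while first <= n_jobs:
        last = min(first + max_per_submit - 1, n_jobs)
        yield "%d-%d" % (first, last)
        first = last + 1

# 544 jobs -> the two submissions used above
groups = list(submit_groups(544))
```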

For data /MinimumBias/Commissioning10-GOODCOLL-v9/RAW-RECO (from run 133532) (GlobalTag: GR10_P_V4::All) 
(DBS says 2 files are not at cmslpc; T1_US_FNAL: 20402393 events, 1043 files, 6.1 TB, cff plain) and
if I look at the cff, I get:
      '/store/data/Commissioning10/MinimumBias/RAW-RECO/v9/000/134/634/24CA5893-9554-DF11-A2BD-00E08178C163.root', as first file,
but the full dataset should have:

crab:  430 job(s) can run on 20402393 events.

crab -submit -c MB_GOODCOLL-v9/

crab:  Your domain name is only local dataset will be considered
crab:  Jobs:  359,
   skipped because no sites are hosting this data

Run on MC:
 /MinBias/Spring10-START3X_V26A_356ReReco-v1/GEN-SIM-RECO  ( Same reco as v8 version of data) (GlobalTag: START3X_V26A::All)

crab:  Total of 372 jobs created.
crab -submit -c Spring10-START3X_V26A_356ReReco-v1


We want PF2PAT production with my data types; I am documenting it here: ProductionZMuMuAnalysis


On Shift: A news from Pawel De Barbaro: "last 16 hrs cms has been running essentially uninterruped, with LHC delivering squeezed beams and MinBias rates x10 higher that what we have seen so far. after 16hrs we still have > 50% of initial luminosity. scaling rates from un-squeezed beams by x10 and using r133874, r133876, and ongoing r133877, in last 16hrs we have collected >600 inv. microbarns. this means we have practically tripled our int. luminosity, wrt to what cms had collected up to yesterday at 4pm. this puts total int.L above 1 nb-1. and the weekend is not over yet..."

ECAL and HF cleaning algorithms are already part of the 3.5.7 release, so we don't need to do that by hand?


This is a temporary tool to calculate the integrated luminosity of CMS based on a JSON file and here is a related link to hypernews:
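The real number has to come from that tool (which uses the lumi database), but counting the lumi sections a JSON file selects is a quick sanity check. A sketch of the JSON bookkeeping only, not a luminosity calculation:

```python
def count_lumi_sections(good_runs):
    """Count lumi sections selected by a certification dict: run -> [[first, last], ...]."""
    return sum(last - first + 1
               for ranges in good_runs.values()
               for first, last in ranges)

# Toy example in the CMS JSON format (run 139975 from the lists above)
example = {"139975": [[1, 15], [17, 54]]}
n_ls = count_lumi_sections(example)
```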

Spring10: Reprocessing of 7 TeV 31X Samples using 35X and New Production Requests

Status of Spring10 Monte Carlo production


During this weekend CMS recorded almost 0.1 nb^-1 of data. Skims for those runs are at: GOODCOLL (with runs up to 132602; 132605 will be added soon), and the skimming requirements are defined here: skimming for good-collisions

Important to look into:

For latest 2010 data sets, search in DBS using


Adding/committing to CVS from cmslpc:

kserver_init  (will ask username/passwd)
project CMSSW
cvs -d $CVSROOT update -Ad
cvs -d $CVSROOT add filename
cvs -d $CVSROOT commit
cvs -d $CVSROOT tag Version-0000

CRAB on slc5: SetupCrab271#Setup_on_SL5

Topic revision: r38 - 2010-07-17 - LovedeepKaurSaini