Search for resonances decaying to Υ(1S)μ+μ- with 2017 and 2018 data
Analysis team
Introduction
A search for light resonances decaying to Υ(1S)μ+μ- is performed with the CMS 2017 and 2018 pp collision data. The Υ(1S) is reconstructed through its decay to a pair of muons. The other muon pair may come from an off-shell Υ(1S) and has an invariant mass below the Υ(1S) mass. In this analysis, optimized for the high-pileup Run 2 data, a new strategy based on event mixing is chosen to minimize the pileup background and to model the remaining background; the approach is purely data driven and procedurally blind.
Motivation
This process serves as a standard-model candle in a search for a narrow resonance decaying to Υ(1S)μ+μ- in the same final state. A light resonance decaying to a Υ(1S) meson and a pair of charged leptons could be the signature of a tetraquark, a bound state of two b quarks and two b antiquarks.
Analysis Overview
We search for a resonance decaying to a Υ(1S) plus two muons in the mass region between 13 and 28 GeV. Two of the muons are required to be compatible with the decay of a Υ(1S). In other words, our signal is four muons originating from a common reconstructed vertex.
X → Υ(1S) μ3+μ4- → μ1+μ2- μ3+μ4- (with a common vertex)
Regarding the background, there are two main categories: pileup background and physics background. We develop event-mixing techniques to model both for the Run 2 data; the resulting model is intrinsically blind and data driven.
- Purely data driven & procedurally blind: The event-mixing technique has a long history in particle physics. It has been used to model the combinatorial background in two-particle invariant-mass spectra, taking detector acceptance and combinatorics into account. It is also suitable when the expected signal peak sits on top of a broader peaking background. We still need to demonstrate that the event-mixing method is applicable to our four-muon analysis. We use the event-mixing technique (data driven) to model the background trigger by trigger, addressing the pileup background first and then the physics background.
Samples
Data
This analysis is based on pp collision data at a center-of-mass energy of 13 TeV collected in 2017 and 2018 with the CMS detector. The collision datasets for both the primary (MuOnia) and event-mixing (ZeroBias) samples are given in the following table.
Datasets 2017 MuOnia (DAS) | Datasets 2017 ZeroBias (DAS) | Datasets 2018 MuOnia (DAS) | Datasets 2018 ZeroBias (DAS) |
/MuOnia/Run2017B-17Nov2017-v1/AOD | /ZeroBias/Run2017B-17Nov2017-v1/AOD | /MuOnia/Run2018A-17Sep2018-v1/AOD | /ZeroBias/Run2018A-17Sep2018-v1/AOD |
/MuOnia/Run2017C-17Nov2017-v1/AOD | /ZeroBias/Run2017C-17Nov2017-v1/AOD | /MuOnia/Run2018B-17Sep2018-v1/AOD | /ZeroBias/Run2018B-17Sep2018-v1/AOD |
/MuOnia/Run2017D-17Nov2017-v1/AOD | /ZeroBias/Run2017D-17Nov2017-v1/AOD | /MuOnia/Run2018C-17Sep2018-v1/AOD | /ZeroBias/Run2018C-17Sep2018-v1/AOD |
/MuOnia/Run2017E-17Nov2017-v1/AOD | /ZeroBias/Run2017E-17Nov2017-v1/AOD | /MuOnia/Run2018D-17Sep2018-v1/AOD | /ZeroBias/Run2018D-17Sep2018-v1/AOD |
/MuOnia/Run2017F-17Nov2017-v1/AOD | /ZeroBias/Run2017F-17Nov2017-v1/AOD | | |
MC
Signal MC | Light scalar decaying to Υ(1S)μμ | /H0ToUps1SMuMu_m18p5_TuneCUEP8M1_13TeV-pythia8/RunIISummer17DRStdmix-NZSFlatPU28to62_92X_upgrade2017_realistic_v10-v1/AODSIM |
Background MC | Υ+pileup | /UpsilonMuMu_UpsilonPt6_TuneCUEP8M1_13TeV-pythia8-evtgen/RunIISummer17DRStdmix-NZSFlatPU28to62_92X_upgrade2017_realistic_v10-v1/AODSIM |
Background MC | BBbar | /FourMuon_UpsilonInvMassCut_MSEL5_8TeV_pythia6/Summer12DR53X-PU_RD2_START53_V19F_ext1-v1/AODSIM |
Private MC | BBbar, two virtual photons | |
Event Selection
Pre-selection
Pre-select muon candidates:
- must pass the official soft muon ID cuts,
- pT > 2 GeV and |η| < 2.4,
- global or tracker muons.
Pre-select dimuon candidates:
- At least two oppositely charged pre-selected muons,
- |m(μ+μ-) - 9.46 GeV| ≤ 3σ,
- The two muons:
  - must have an inner-track dz difference of less than 25 cm,
  - must fit to a common vertex with vertex probability > 0.05,
- If there is more than one candidate, we choose the one with the highest vertex fit probability,
- We select only one candidate per event, which may not be optimal but is unbiased and simple.
Pre-select four-muon candidates:
- At least two additional oppositely charged pre-selected muons.
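The dimuon pre-selection above can be sketched as follows. This is a minimal illustration, assuming simple dict-based muon and dimuon objects and an assumed 0.1 GeV dimuon mass resolution; the real analysis operates on CMSSW reco objects and a kinematic vertex fit.

```python
M_Y1S = 9.46   # GeV, nominal Y(1S) mass
SIGMA = 0.1    # GeV, assumed dimuon mass resolution (illustrative value)

def preselect_muon(mu):
    """Single-muon pre-selection: soft muon ID, pT > 2 GeV, |eta| < 2.4."""
    return mu["soft_id"] and mu["pt"] > 2.0 and abs(mu["eta"]) < 2.4

def select_dimuon(pairs):
    """Pick the Y(1S) candidate with the highest vertex-fit probability.

    Each pair is a dict with the invariant mass (GeV), the |dz| difference
    of the two inner tracks (cm), and the common-vertex fit probability.
    """
    candidates = [
        p for p in pairs
        if abs(p["mass"] - M_Y1S) <= 3 * SIGMA   # 3-sigma mass window
        and p["dz_diff"] < 25.0                  # dz difference < 25 cm
        and p["vtx_prob"] > 0.05                 # vertex fit probability
    ]
    if not candidates:
        return None
    # If more than one candidate survives, take the highest vertex probability
    return max(candidates, key=lambda p: p["vtx_prob"])
```

Selecting a single candidate per event with the highest vertex probability keeps the choice unbiased with respect to the four-muon mass.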
Signal selection
In order to avoid the pairing complexity and make the mixing method feasible, we first select the best Υ candidate in an event, within 3σ of the Υ(1S) mass and with the highest vertex probability. Afterwards we loop over the remaining muons and require at least one converging 4-muon vertex fit.
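A sketch of this selection logic, under the assumption of dict-like muons carrying a `charge` field and a hypothetical `fit_converges` callable standing in for the kinematic vertex fitter:

```python
from itertools import combinations

def select_signal(muons, best_y, fit_converges):
    """Loop over the muons not used in the best Y candidate and keep every
    opposite-charge pair whose 4-muon vertex fit converges.

    muons:         all pre-selected muons in the event
    best_y:        (mu+, mu-) already chosen as the Y(1S) candidate
    fit_converges: callable taking four muons, True if the fit converges
    """
    rest = [m for m in muons if m not in best_y]
    candidates = []
    for m3, m4 in combinations(rest, 2):
        if m3["charge"] + m4["charge"] != 0:   # require opposite charges
            continue
        four = (*best_y, m3, m4)
        if fit_converges(*four):
            candidates.append(four)
    return candidates
```

Fixing the Υ candidate first means the additional muon pair never competes with the Υ pairing, which is what makes the later event mixing well defined.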
Background Modelling
1. Physics Background:
Our physics background consists of events in which all four muons come from the same vertex, exactly as in the signal, and originate from similar physics processes such as BBbar, two-virtual-photon production, and inclusive Υ production.
Physics Background Modelling Strategy
Simple mixing will not work for the physics background. Instead, we mix the Υ and the muons among the remaining events to model the shape of the four-muon invariant-mass spectrum. We first define a small cone around the original Υ flight direction; when mixing, we only use events with a Υ in this cone: ΔR(Υmix, Υorig) < 0.3 (no constraint on the Υmix pT).
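The cone requirement can be written compactly as follows; the eta/phi fields on the Υ candidates are assumed, and the φ difference must be wrapped into [-π, π]:

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance dR = sqrt(deta^2 + dphi^2), with dphi wrapped to [-pi, pi]."""
    deta = eta1 - eta2
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(deta, dphi)

def accept_for_mixing(y_orig, y_mix, cone=0.3):
    """Accept a mixed-event Y only if it flies close to the original Y direction.

    Note: no requirement is placed on the mixed Y's pT, only on its direction.
    """
    return delta_r(y_orig["eta"], y_orig["phi"],
                   y_mix["eta"], y_mix["phi"]) < cone
```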
The effectiveness of the mixing method is studied using MC samples of the physics background. The four-muon invariant-mass spectrum can be modelled by the mixing method at generator level, where there are no detector acceptance and efficiency effects, and the method is also studied at reconstruction level to check that it remains valid. Using the official 13 TeV signal MC, we verify that a potential signal peak is smeared away after mixing. The mixing method is then applied to data, and the background pdf is extracted with the event-mixing technique.
2. Pileup Background
Four-muon combinations in which muons from multiple collisions are fit to the same reconstructed vertex constitute the pileup background. We use the same procedure as the signal selection, with the difference that muons from a ZeroBias event replace the original muons in the 4-muon vertex fit.
Although we also call it "event mixing", here we mix muons from two datasets (MuOnia and ZeroBias) to model the pileup contribution: the Υ(1S) from the MuOnia dataset and pileup muons from the ZeroBias dataset are mixed.
Event-mixing concept – scenario 1 (most common)
- Υμ → μ+μ- + μ, with one muon from pileup
Mix using ZeroBias events with ≥ 1 muon; vertex fit: Υμ (from MuOnia) + μ (from ZeroBias).
The three muons (Υμ) from the original event are combined with muons from a ZeroBias event. Our study shows this is the main pileup background for the triple-muon trigger.
Event-mixing concept – scenario 2 (less common)
- Υ → μ+μ- + 2μ from pileup
Mix using ZeroBias events with ≥ 2 muons; vertex fit: Υ (from MuOnia) + 2μ (from ZeroBias).
The two muons (Υ) from the original event are combined with muon pairs from a ZeroBias event.
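The two scenarios amount to simple combinatorics, sketched below with muons as opaque labels; in the real analysis each combination is then passed to the 4-muon vertex fit.

```python
from itertools import combinations

def scenario1(y_muons, third_mu, zb_muons):
    """Scenario 1: Y(mu1, mu2) and mu3 from the MuOnia event, mu4 taken
    from the mixed ZeroBias event (requires >= 1 ZeroBias muon)."""
    return [y_muons + (third_mu, zb) for zb in zb_muons]

def scenario2(y_muons, zb_muons):
    """Scenario 2: only the Y(mu1, mu2) from the MuOnia event; mu3 and mu4
    are both taken from the mixed ZeroBias event (requires >= 2 muons)."""
    return [y_muons + pair for pair in combinations(zb_muons, 2)]
```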
To-do List
- Rerun the 2017/2018 datasets
- First of all, the Υ candidate has to be trigger-confirmed.
- For the pileup event mixing using ZeroBias, we need to do trigger confirmation for the muons from ZeroBias (the contribution will still be tiny).
- For the muon ID of the Υ candidates, we will use double-Υ events to compare soft vs. loose ID (S/B), then decide.
- For the actual event mixing, we will also do trigger confirmation, keeping an eye on the mixed background shape (to see whether it changes compared to the case without trigger confirmation).
- Look into the shift in the mixed background shape, to see whether it is still there with trigger confirmation. If it is, we need a better understanding.
- Start to draft analysis note.
Analysis Software
Overall, the analysis is done in three steps:
- Skim, using the Onia2MuMu package.
- TreeMaker, using the FourMuonAna package and the DatasetsMixing package, which reads in two datasets (MuOnia and ZeroBias) as input. The pileup background tree is produced with this package.
- Plotting, using a C++ macro (the physics background modelling is also done here).
1. Producing new skim ntuple for 2017 and 2018
We have two versions of the code to make the ntuple: one for mixing two datasets (the DatasetsMixing package) and a straightforward code to analyse MuOnia alone (the FourMuonAna package).
Set up CMSSW_9_4_2 for the 2017 datasets and CMSSW_10_2_5 for the 2018 datasets, and check out the following package from the git repository.
cmsrel CMSSW_9_4_2   # setup for 2017; for 2018: cmsrel CMSSW_10_2_5
cd CMSSW_9_4_2/src
cmsenv
git clone https://github.com/zhenhu/HeavyFlavorAnalysis
scram b -j24
Before submitting jobs to the Grid, check the relevant information in the config file, such as the global tag and the JSON files for the datasets (e.g. for the 2017 datasets: BPH_SKIM_crab_2017.py), and run the code interactively over a few events to rule out problems unrelated to CRAB.
cd HeavyFlavorAnalysis/Onia2MuMu/test/2017/
cmsRun BPH_SKIM_crab_2017.py
Submitting jobs to the Grid
In order to have the correct environment, source the following setting (valid at most sites):
source /cvmfs/cms.cern.ch/crab3/crab.csh
Execute the following multicrab commands to submit the skim tasks for the MuOnia and ZeroBias datasets:
cd HeavyFlavorAnalysis/Onia2MuMu/test/2017/
voms-proxy-init -voms cms --valid 172:00
./multicrab_2017 --crabCmd=submit           # 2017 MuOnia dataset skim
./multicrab_2017_MinBias --crabCmd=submit   # 2017 ZeroBias dataset skim
cd HeavyFlavorAnalysis/Onia2MuMu/test/2018/
voms-proxy-init -voms cms --valid 172:00
./multicrab_2018ABC --crabCmd=submit           # 2018ABC MuOnia dataset skim
./multicrab_2018D --crabCmd=submit             # 2018D MuOnia dataset skim
./multicrab_2018ABC_MinBias --crabCmd=submit   # 2018ABC ZeroBias dataset skim
./multicrab_2018D_MinBias --crabCmd=submit     # 2018D ZeroBias dataset skim
The following script will loop over all jobs within the directory (crabOutput2017_) and keep resubmitting failed jobs every 30 minutes.
nohup python manageCrabTask.py -l -r -t crabOutput2017_ >& crabOutput2017_.log &
nohup python manageCrabTask.py -l -r -t crabOutput2018ABC_ >& crabOutput2018ABC_.log &
Presentation links
-- CandanDozen - 2019-07-26