|Standard Model Analysis
At a hadron collider, many interesting physics processes can result in final states with two high transverse momentum (Pt),
isolated leptons (electrons and/or muons). These include Standard Model processes, like Drell-Yan, TTbar, WW, and H->WW, as well
as possible New Physics processes such as Z' and SUSY.
We focus on the TTbar di-lepton final state.
We select events with two high Pt isolated leptons, missing transverse energy (MET), and at least 2 high Pt jets.
The top cross-section at the LHC is fairly large, and isolating a top signal in this mode after the application of judicious
selection cuts is not very difficult.
We then count and plot the number of events as a function of jet multiplicity, test the background prediction in the 0
and 1 jet multiplicity bins, and extract the TTbar signal in the >= 2 jets bin.
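The selection and counting described above can be sketched as follows; the thresholds, field names, and event structure are illustrative stand-ins, not the actual analysis cuts:

```python
# Hypothetical sketch of the dilepton TTbar selection; thresholds and the
# event dictionary layout are invented for illustration.

def select_dilepton_event(event, lep_pt_min=20.0, met_min=30.0):
    """Return the jet-multiplicity bin (0, 1, or 2 for >= 2 jets)
    for a passing event, or None if the event fails the selection."""
    leptons = [l for l in event["leptons"]
               if l["pt"] > lep_pt_min and l["isolated"]]
    if len(leptons) < 2:          # require two high-Pt isolated leptons
        return None
    if event["met"] < met_min:    # require missing transverse energy
        return None
    njets = sum(1 for j in event["jets"] if j["pt"] > 30.0)
    # Bins 0 and 1 test the background prediction;
    # the >= 2 jets bin extracts the TTbar signal.
    return min(njets, 2)

def jet_multiplicity_counts(events):
    """Count selected events per jet-multiplicity bin."""
    counts = {0: 0, 1: 0, 2: 0}   # key 2 means ">= 2 jets"
    for ev in events:
        b = select_dilepton_event(ev)
        if b is not None:
            counts[b] += 1
    return counts
```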
The relatively poor missing Et resolution at CMS results in significant backgrounds from Drell-Yan in the ee and mumu channels.
We therefore focus on the electron-muon channel, whose main background is W+jets, where the W decays to a muon and we get
an electron from a jet "faking" the electron signature in our detector. We are developing the technology to extract a prediction
for the W+ jets background from the data. The idea is to measure a jet-to-lepton fake rate from QCD events, and transfer this
to W events.
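The fake-rate transfer described above might look like the following sketch; the binning, rates, and function names are invented, and the f/(1-f) weighting is one common convention for applying a fake rate measured on loose-not-tight objects:

```python
# Illustrative fake-rate method: f(pt), measured in QCD events, is the
# probability that a jet passing loose lepton cuts also passes the tight
# electron selection. The pt binning and rates below are made up.

FAKE_RATE = {(0, 20): 0.05, (20, 40): 0.02, (40, 1000): 0.01}

def fake_rate(pt):
    """Look up the measured fake rate for a given object pt."""
    for (lo, hi), f in FAKE_RATE.items():
        if lo <= pt < hi:
            return f
    return 0.0

def predicted_fakes(loose_not_tight_pts):
    """Weight each loose-not-tight object by f/(1-f) to predict the
    number of fake tight electrons entering the signal region."""
    return sum(fake_rate(pt) / (1.0 - fake_rate(pt))
               for pt in loose_not_tight_pts)
```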
We investigate the prospect for measuring the WW production cross section.
The strategy is to find two isolated leptons (electron or muon) of opposite charge with large missing transverse energy and low hadronic
activity established by a jet veto.
We explore suitable background reduction cuts and develop data-driven methods to estimate backgrounds that we expect not to be
reliably modeled by Monte Carlo. The largest systematic uncertainty is found to come from the top background, which we estimate from
data using the soft muon tagging technique.
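A tag-and-extrapolate estimate along these lines can be sketched as below; the function and the efficiency value are hypothetical, with the tag efficiency assumed to come from simulation or a control region:

```python
# Hedged sketch of estimating the untagged top contamination from the
# soft-muon-tagged event count observed in data. If a fraction eff_tag of
# top events contains a soft-muon b tag, then
#   N_untagged = N_tagged * (1 - eff_tag) / eff_tag.
# eff_tag here is an assumed input, not a measured CMS number.

def top_background_untagged(n_tagged, eff_tag):
    """Infer the untagged top background from the tagged yield in data."""
    if not 0.0 < eff_tag <= 1.0:
        raise ValueError("tag efficiency must be in (0, 1]")
    return n_tagged * (1.0 - eff_tag) / eff_tag
```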
|Beyond the Standard Model Analysis
We are preparing a search for non-SM physics in the V + jets channel, where V is a Z, W, or photon vector boson.
This is a generic signature that could be produced by any massive object
decaying to a V boson,
so we are striving to design the search in a model independent way.
The dominant background is expected to be from higher order QCD contributions to single V boson
production, which is difficult to predict with Monte Carlo (MC).
We developed two methods to predict backgrounds for new physics searches in early data at the LHC with
minimal recourse to simulation.
One key idea used for that prediction is that SM V + jets production has a nearly uniform V
rapidity distribution while decays of a high mass particle would tend to produce V bosons in the central region.
We exploit this fact and plan to use events in the data with forward V's to predict the
background in the central, signal region.
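The forward-to-central extrapolation can be sketched as follows; the region boundary and the transfer ratio are illustrative assumptions, with the ratio taken from a background-dominated sample or from simulation used only for this shape:

```python
# Sketch of the rapidity-extrapolation idea: SM V+jets has a nearly uniform
# V rapidity distribution, so events with forward V's predict the central
# (signal-region) background. The y_central boundary is an assumed value.

def rapidity_ratio(rapidities, y_central=1.0):
    """Central/forward yield ratio from a list of V rapidities."""
    central = sum(1 for y in rapidities if abs(y) < y_central)
    forward = sum(1 for y in rapidities if abs(y) >= y_central)
    return central / forward if forward else float("inf")

def central_background_prediction(n_forward_data, r_central_over_forward):
    """Convert the observed forward yield into a central-region prediction."""
    return n_forward_data * r_central_over_forward
```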
The other method uses the fact that, since the main sources of
artificial MET are the system of jets, the detector, and non-collision effects, we model the
instrumental response to the system of jets in V+jets, and the other effects at high MET, in situ using
multi-jet QCD events.
We are investigating a search in the large missing transverse momentum (MET) plus jets final-state
topology. This has great potential for discovery of new physics involving
dark matter candidates early in LHC running if Standard Model
backgrounds can be understood.
Some Standard Model processes produce the same observable signature. High transverse momentum Z
bosons (decaying into two neutrinos) produced with high Et jets are the main irreducible background,
and understanding this contribution is fundamental for this search beyond the Standard Model.
We studied the strategy to estimate the contribution of invisible Z bosons from data for the
early search with the CMS detector using different control samples.
We investigate the use of photon plus jets and Wmunu plus jets events, which
have much higher statistics than the Z mumu plus jets sample studied previously.
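The control-sample idea amounts to a transfer-factor calculation, sketched below; the ratio R and purity are invented placeholders for a theory-derived Z-to-photon (or Z-to-W) ratio times efficiency corrections:

```python
# Illustrative transfer-factor estimate of the Z(nunu)+jets background from
# a photon+jets control sample: at high boson pT the production differs
# mainly by couplings, so a ratio R = N(Z->nunu)/N(gamma) per event,
# combined with the control-sample purity, converts the photon yield.
# Both R and the purity below are assumed, illustrative numbers.

def zinv_from_photons(n_photon_events, r_z_over_gamma, photon_purity=1.0):
    """Estimate N(Z->nunu) from the purity-corrected photon yield times R."""
    return n_photon_events * photon_purity * r_z_over_gamma
```

The same function shape applies to the Wmunu plus jets control sample, with R replaced by the corresponding W-to-Z ratio.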
We are developing two SUSY analyses based on signatures involving leptons, jets, and missing transverse
momentum. One analysis is a single-lepton study,
while the other searches for events in which the two
leptons have the same sign charge. These analyses
are designed to be simple and generic, focusing on
basic topological and kinematic properties that
typically characterize SUSY signatures.
The same-sign dilepton analysis has many similarities with the single-lepton inclusive study,
but has the benefit that the
charge correlation between the leptons tends
to suppress the TTbar background for the case in which
both of the leptons are produced in W-boson decays.
The CMS trigger system has two main levels: Level 1, implemented in
hardware, and the High Level Trigger (HLT) implemented in
software. The Level 1 trigger uses low-level calorimeter
and muon information.
For muons, the HLT processing is performed in two steps, each of
which achieves an improvement in the momentum resolution
relative to Level 1.
The first step in the HLT muon algorithm (Level 2)
uses muon system information only; this measurement is
an improvement over Level 1 but is still rough.
The second step involves finding tracks in the silicon and linking
them with muon system tracks. This procedure (Level 3)
can achieve a high quality momentum measurement.
The standard muon reconstruction at CMS is based on a combined track-fit through
the inner tracker and the four muon stations.
We have designed and implemented a complementary muon reconstruction
algorithm. The basic idea of the algorithm is to extrapolate inner
detector tracks to the muon detector and look for compatible
hits in the muon chambers.
In CMS, muons reconstructed in this way are called Tracker Muons.
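The matching step can be illustrated with a toy sketch; the straight-line extrapolation below is a trivial stand-in for the real propagation through the magnetic field and detector material, and all geometry and window values are invented:

```python
# Toy sketch of the Tracker Muon idea: extrapolate an inner-detector track
# out to the muon system and flag it as a muon candidate if a chamber hit
# lies within a matching window. Real CMS propagation accounts for the
# magnetic field and material; this straight line is only illustrative.

def extrapolate(track, r_station):
    """Straight-line extrapolation of (x, y, slope_x, slope_y) to radius r."""
    x0, y0, sx, sy = track
    return (x0 + sx * r_station, y0 + sy * r_station)

def is_tracker_muon(track, chamber_hits, r_station=5.0, window=0.5):
    """True if any muon-chamber hit is compatible with the extrapolation."""
    px, py = extrapolate(track, r_station)
    return any(abs(px - hx) < window and abs(py - hy) < window
               for hx, hy in chamber_hits)
```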
The HLT muon algorithm uses the muon system information
to define a region in the silicon tracker/pixel detector;
the algorithm then executes the pattern recognition and track
reconstruction within this region.
The pattern recognition and track finding in the
silicon system must be fast and efficient, and it cannot be
based on reconstructing all of the hits and tracks in the
full detector, as is done in the offline software. For this reason a form of regional
tracking has been developed.
Electron reconstruction and identification require validated information from the tracker,
electromagnetic and hadronic calorimeters.
Converted photons reconstructed in the tracker can be used to obtain a
clean sample of electrons using low luminosity pp collision data
and can help validate and calibrate the response of the electromagnetic calorimeter in
very early data.
Identification (tagging) of b-quarks is vital for top quark measurements as well as
many searches for
new physics. Since b hadrons generally live long enough to decay at a resolvable distance from the
primary vertex, yet not long enough to pass outside of the tracking volume, they are usually tagged by
identifying displaced tracks or by reconstructing the displaced vertex from the hadron
decay. We have
developed tools that will allow us to validate the b-tagging efficiencies and mistag rates in
early data taking.
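A minimal illustration of displaced-track tagging follows; the significance cut, track-count requirement, and input format are assumptions, not the actual CMS tagger definitions:

```python
# Illustrative b tag: a jet is tagged when it contains at least n_required
# tracks whose transverse impact-parameter significance d0/sigma(d0)
# exceeds a threshold. Cut values here are invented.

def is_btagged(track_d0_sig, n_required=2, sig_cut=3.0):
    """track_d0_sig: list of d0/sigma(d0) values for tracks in the jet."""
    n_displaced = sum(1 for s in track_d0_sig if s > sig_cut)
    return n_displaced >= n_required

def mistag_rate(light_jets):
    """Fraction of light-flavor jets (lists of d0 significances) that are
    tagged; validating this rate in data is one goal of the tools above."""
    if not light_jets:
        return 0.0
    return sum(is_btagged(j) for j in light_jets) / len(light_jets)
```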
Fireworks is a CMS event display specialized for the physics-studies
use case. This specialization allows the use of a stylized rather
than 3D-accurate representation when appropriate. Data handling
is greatly simplified by using only reconstructed information and
ideal geometry. Data is presented via graphical and textual views.
Fireworks provides an easy-to-use interface which
allows a physicist to concentrate only on the data in which they are interested.
Physicists can select which events (e.g. require a high energy muon),
what data (e.g. which track list) and which items in a collection
(e.g. only high-pt tracks) to show.
The Physics Analysis Toolkit (PAT) provides the CMS-wide way for doing
analysis, from large-scale skimming down to laptop-based analysis. It
meets the requirements of both user-friendliness and flexibility by
providing an easy, configurable interface to the physics objects within a
cleaned interpretation of the event as a whole.
When the LHC is turned on, an unmatched amount of data will
stream out of the CMS detector. In order to take advantage of the
highly parallelised CMS data processing model and make the data access
easier for the Physics groups, the data will be split into Primary
Datasets, where events are grouped according to similar analysis interests.
The original full set of triggers (about 150) has been reduced to a
core set (about 30-50), still capable of retaining the same
coverage. This makes for a simpler, more robust, and easier to maintain
startup trigger table.
An On-Shell Effective Theory (OSET) is a language for describing the
production and decay of heavy new particles without a Lagrangian --
instead of numerous vertex factors and coupling strengths, OSET models have
only a small number of experimentally resolvable parameters:
particle masses, cross-sections, and branching ratios. This makes them
particularly useful as preliminary characterizations of new-physics
signals anticipated at the LHC.
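The parameterization can be made concrete with a small data structure; the class, particle name, and all numerical values below are invented for illustration:

```python
# A minimal data structure capturing the OSET idea: a heavy new particle is
# described only by its mass, production cross section, and branching
# ratios, with no underlying Lagrangian. All names and numbers are made up.

from dataclasses import dataclass, field

@dataclass
class OSETParticle:
    name: str
    mass_gev: float
    xsec_pb: float                                 # production cross section
    branching: dict = field(default_factory=dict)  # decay mode -> BR

    def rate_pb(self, mode):
        """Cross section times branching ratio for a given decay mode."""
        return self.xsec_pb * self.branching.get(mode, 0.0)

# Example: a hypothetical 600 GeV colored particle with two decay modes.
heavy_x = OSETParticle("X", mass_gev=600.0, xsec_pb=2.0,
                       branching={"q qbar LSP": 0.7, "W q qbar LSP": 0.3})
```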