
Annual Report

Report Period: April 2017 - April 2018

PhD Year: 1   2   3 [X]   4   ≥ 4

PhD Supervisor / Working Group: Prof. Arnulf Quadt, Prof. Vincenzo Cavasinni

Thesis Committee:

1. Prof. Arnulf Quadt

2. Prof. Stan Lai

3. Prof. Vincenzo Cavasinni

by

Antonio De Maria

ademari@phys.uni-goettingen.de


Contents

1 Measurement of the Higgs boson production cross section

1.1 Motivation and Outline

1.2 Event Preselection and Background Estimation

1.3 Fake-Factor method to estimate jets faking hadronic taus

1.4 Signal Region Definition

1.5 Fit Model

2 Code development for xTauFramework and FitBox

3 Teaching

4 Lectures

5 Given Talks

6 Attended Schools and Conferences

7 OTP Shifts

8 Further Activities

9 Outreach


1 PhD Topic: Measurement of the Higgs boson production cross section at 13 TeV in the H → ττ → lep-had decay with the ATLAS detector

1.1 Motivation and Outline

The ATLAS and CMS experiments discovered a Higgs boson in 2012, consistent with the last missing elementary particle in the Standard Model (SM) of electroweak and strong interactions.

Several properties of this boson were measured with 7 and 8 TeV centre-of-mass energy (√s) proton-proton (pp) collision data delivered by the Large Hadron Collider (LHC) in 2011 and 2012, respectively. These measurements have not shown significant deviations from the SM expectations. In particular, the coupling of the Higgs boson to the fermion sector has also been established with the observation of the H → ττ decay mode with a signal significance of 5.5 σ from the combined ATLAS and CMS 7 and 8 TeV datasets.

Using the collision data now available at √s = 13 TeV, the detailed programme of Higgs boson property measurements will be extended to reach a higher precision than the 7 and 8 TeV analyses, owing to the expected increase in data statistics and in the Higgs boson production cross section. The H → ττ channel will continue to play an important role in measurements of the Higgs boson couplings to τ leptons, as well as in measurements of other properties of the Higgs boson, such as its charge-parity (CP) quantum numbers. In the following sections, the H → ττ → lep-had final state is presented, where lep refers to the muon or electron produced in the leptonic τ decay.

1.2 Event Preselection and Background Estimation

Events are selected by requiring exactly one light lepton (electron or muon) and at least one hadronic tau. Among the selected taus, the one with the highest p_T (referred to as the leading τ) is taken as the τ-candidate. The requirement of exactly one light lepton reduces the contamination from Z → ll events. At this stage of the selection, no requirement is placed on the reconstruction quality (ID) of the τ, because of the Fake-Factor background estimation method, which is described in the next section.

After these selections, a trigger selection is applied depending on the flavour of the selected light lepton, and a matching between the trigger and the selected lepton is performed. Currently, the analysis mainly selects events using the lowest unprescaled Single-Lepton Trigger (SLT) for both the 2015 and 2016 datasets. Besides the SLT triggers, a Tau + Lepton Trigger (TLT) can also be included, which is mainly used to select events in the low lepton p_T region. In this case, both the lepton and the τ-candidate are required to match the different legs of the combined trigger. The two trigger regions, SLT and TLT, are selected according to the lepton transverse momentum and are mutually exclusive in order to avoid event double counting, as shown in fig. 1.
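As an illustration, the mutually exclusive SLT/TLT assignment can be sketched as follows; the p_T threshold and all names are illustrative placeholders, not the analysis values:

```python
# Sketch of the mutually exclusive SLT/TLT assignment described above.
# The 26 GeV threshold is an illustrative placeholder, not the analysis value.

def assign_trigger_region(lep_pt, passes_slt, passes_tlt, slt_threshold=26.0):
    """Assign an event to the SLT or TLT region based on the lepton pT (GeV).

    Events above the single-lepton-trigger threshold belong to the SLT
    region; events below it can only be recovered by the combined
    Tau+Lepton trigger. The regions are disjoint by construction,
    so no event is double counted.
    """
    if lep_pt >= slt_threshold:
        return "SLT" if passes_slt else None
    return "TLT" if passes_tlt else None
```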

In addition to these lepton and trigger selection criteria, the following preselection requirements are imposed (a code sketch of these cuts follows the list):

• lepton requirements:

– gradient isolation, required in order to reduce the QCD contamination in the signal regions;

– medium quality reconstruction;

– transverse momentum greater than a threshold that depends on the trigger used;


Figure 1: Trigger scheme for the SLT and TLT selections

• τ requirements:

– medium quality reconstruction;

– |η| < 2.4 (excluding the crack region, 1.37 < |η| < 1.52), |q| = 1, p_T > 30 GeV;

– for MC samples, the truth PDG ID of the tau candidate (τ_id) must satisfy the conditions |τ_id| > 6 and τ_id ≠ 21. These conditions remove jets faking taus, which are modelled by the fake-factor method;

• the charges of the lepton and hadronic τ must have opposite sign;

• application of a b-jet veto, which removes events with at least one b-tagged jet, in order to reduce tt̄ events in the signal region;

• the transverse mass (m_T) between the lepton and the missing transverse energy (E_T^miss) must be less than 70 GeV; this cut reduces the W+jets events in the signal region.
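A minimal sketch of these cuts, assuming a simple dictionary-based event record (all field names are hypothetical, and the trigger-dependent lepton p_T threshold is omitted):

```python
import math

def transverse_mass(lep_pt, lep_phi, met, met_phi):
    """m_T = sqrt(2 * pT(lep) * ETmiss * (1 - cos(dphi))), all in GeV."""
    dphi = abs(lep_phi - met_phi)
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi
    return math.sqrt(2.0 * lep_pt * met * (1.0 - math.cos(dphi)))

def passes_preselection(evt):
    """Apply the lepton, tau and event-level cuts described above."""
    lep, tau = evt["lep"], evt["tau"]
    if not (lep["medium_id"] and lep["gradient_iso"]):
        return False
    if not tau["medium_id"] or tau["pt"] <= 30.0 or abs(tau["charge"]) != 1:
        return False
    abs_eta = abs(tau["eta"])
    if abs_eta >= 2.4 or 1.37 <= abs_eta <= 1.52:     # crack-region veto
        return False
    if lep["charge"] * tau["charge"] >= 0:            # opposite-sign requirement
        return False
    if evt["n_bjets"] > 0:                            # b-jet veto against ttbar
        return False
    mt = transverse_mass(lep["pt"], lep["phi"], evt["met"], evt["met_phi"])
    return mt < 70.0                                  # reduce W+jets
```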

All the backgrounds can be classified into three major categories:

• events with true lepton and τ_had signatures: Z/γ, Diboson, Top

• events where a jet fakes the τ_had signature: QCD jets, W+jets, Diboson

• events where a light charged lepton fakes the τ_had signature: Z → ll + jets

Different control regions (CRs) are built to select a given background by inverting the requirement used to remove that background from the signal region. A summary of the control regions is given in Table 1. The control regions allow the validation of the selections performed to separate signal and backgrounds.

In fig. 2, some data-MC modelling cross-check distributions are shown for various analysis regions (preselection and CRs).


Control region   Cuts
Top enriched     invert b-veto, m_T > 40 GeV
W enriched       m_T > 70 GeV
QCD enriched     invert lepton isolation

Table 1: CR selections
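As an illustration, the inverted-cut logic of Table 1 could be expressed as follows; the event fields are hypothetical, and the function assumes an event that has passed the lepton, trigger and τ requirements but not yet the isolation, b-veto and m_T cuts:

```python
# Sketch of the control-region classification in Table 1; each CR inverts
# the preselection cut that removes the corresponding background.

def control_region(evt, mt):
    """Classify an event into one of the CRs of Table 1, or None."""
    if evt["n_bjets"] > 0 and mt > 40.0:
        return "Top enriched"          # inverted b-jet veto, mT > 40 GeV
    if mt > 70.0:
        return "W enriched"            # inverted mT < 70 GeV cut
    if not evt["lep"]["gradient_iso"]:
        return "QCD enriched"          # inverted lepton isolation
    return None
```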

Figure 2: Modelling cross check distributions in various analysis regions

1.3 Fake-Factor method to estimate jets faking hadronic taus

The background from jets mis-identified as hadronically decaying τ leptons is a dominant background in the τ_lep τ_had final state. It consists mostly of W+jets events, but fake τ candidates also come from multi-jet production and tt̄. This background is determined with a data-driven method called the Fake-Factor method. For the purposes of this method, an anti-τ region is defined from τ candidates which pass all analysis requirements except the medium τ identification. A lower threshold of 0.25 on the τ identification score is applied, since candidates with very low score are dominated by gluon-induced jets and jets arising from pile-up, while in the signal regions jets mainly come from quarks. Since only one τ candidate is selected per event, it is possible to construct, for each signal region, a corresponding anti-τ control region containing the events that pass the full selection except that the τ candidate is an anti-τ.

The estimate of the fake background, both shape and normalisation, in each signal region can then be determined by using the data events in the corresponding anti-τ region and multiplying them by a transfer factor, called the combined fake factor (F), to correct for the different selection efficiency between pass-τ and anti-τ (fig. 3). Events in the anti-τ region not corresponding to fakes from jets are subtracted using simulated event samples:

$$N^{\mathrm{fakes}}_{\mathrm{SR}} = \left( N^{\mathrm{Data}}_{\mathrm{anti-}\tau} - N^{\mathrm{MC,\,no\,jet\to\tau}}_{\mathrm{anti-}\tau} \right) \times F \qquad (1)$$

The combined fake factor for each signal region is binned in τ p_T and number of τ tracks.
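A minimal numerical sketch of Eq. (1); the binning and fake-factor values below are invented purely for illustration:

```python
import numpy as np

# Toy binning and fake-factor values, invented for illustration only.
pt_edges = np.array([30.0, 40.0, 60.0, 100.0, np.inf])     # tau pT bins, GeV
F = {1: np.array([0.20, 0.18, 0.15, 0.12]),                 # 1-prong, per pT bin
     3: np.array([0.10, 0.08, 0.07, 0.05])}                 # 3-prong, per pT bin

def lookup_F(tau_pt, n_tracks):
    """Combined fake factor for a given tau pT and track multiplicity."""
    i = np.searchsorted(pt_edges, tau_pt, side="right") - 1
    return F[n_tracks][i]

def fake_yield(data_evts, mc_no_jet_fake_evts):
    """Eq. (1): weight the anti-tau data by F and subtract the weighted
    non-jet-fake MC contribution."""
    total = sum(lookup_F(e["tau_pt"], e["tau_ntracks"]) for e in data_evts)
    total -= sum(e["weight"] * lookup_F(e["tau_pt"], e["tau_ntracks"])
                 for e in mc_no_jet_fake_evts)
    return total
```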

Figure 3: Fake factor method application

Fake factors depend on the quark/gluon composition of a given sample and are therefore different for each possible background source. The combined fake factor should therefore in principle be constructed as the sum of the individual fake factors for each relevant process, weighted by the expected fractional contribution of that process in the anti-τ region:

$$F = R_{W} F_{W} + R_{\mathrm{Top}} F_{\mathrm{Top}} + R_{\mathrm{QCD}} F_{\mathrm{QCD}} \qquad (2)$$

However, the small background from top-quark production does not play an important role and can be expected to have fake factors reasonably similar to those of the W+jets background, given the preponderance of quark-initiated jets in both. This is supported by the MC predictions at various analysis levels, where the top contribution is usually below 2%. It is therefore assumed that all processes except multi-jet production can be described using the fake factors derived for W+jets events. Thus the combined fake factor is given by

$$F = R_{\mathrm{QCD}} F_{\mathrm{QCD}} + R_{W} F_{W} \qquad (3)$$

The fraction of multi-jet events in each region, R_QCD, is obtained from data as described below. The fractional contribution from W+jets production is given by R_W = 1 − R_QCD. The individual fake factors F_i, i = W, QCD, are obtained in the dedicated W and QCD control regions as the ratio of data events passing the τ identification over those in the anti-τ selection. Contributions from events where the τ is not faked by a jet are subtracted from the data yields using MC simulation:

$$F_{i} = \frac{N^{\mathrm{Data}}_{\mathrm{pass,CR}_i} - N^{\mathrm{MC,\,no\,jet\to\tau}}_{\mathrm{pass,CR}_i}}{N^{\mathrm{Data}}_{\mathrm{fail,CR}_i} - N^{\mathrm{MC,\,no\,jet\to\tau}}_{\mathrm{fail,CR}_i}}, \qquad i = W,\ \mathrm{QCD} \qquad (4)$$

The fraction of QCD multi-jet events in each SR anti-τ region is given by

$$R_{\mathrm{QCD}} = \frac{N^{\mathrm{QCD,Data}}_{\mathrm{anti-}\tau}}{N^{\mathrm{SR,Data}}_{\mathrm{anti-}\tau} - N^{\mathrm{SR,MC\,no\,jet\to\tau}}_{\mathrm{anti-}\tau}} \qquad (5)$$


The number of QCD events in the anti-τ region, N^{QCD,Data}_{anti-τ}, is estimated from data by multiplying the events in the QCD anti-τ CR by a transfer factor, called the isolation factor (I), which accounts for the difference between failing and passing the lepton isolation. Events with a true lepton in the QCD anti-τ CR are subtracted using MC. The isolation factors are calculated (separately for electrons and muons) as the ratio of events passing the lepton isolation requirement over those failing it in a dedicated control region. This control region is defined using exactly the same cuts as the preselection stage, with the only modification that the τ candidate and the lepton are required to have the same charge sign (SSP region). This definition ensures the orthogonality between the regions where the isolation factors are calculated and those where they are subsequently applied. It is also assumed that there is no significant difference in the isolation factors between the preselection stage and the signal regions where they are applied.

Events with true leptons in the SSP region are subtracted using MC simulation. Thus:

$$N^{\mathrm{QCD,Data}}_{\mathrm{anti-}\tau} = \left( N^{\mathrm{QCD\,CR,anti-}\tau}_{\mathrm{Data}} - N^{\mathrm{QCD\,CR,anti-}\tau}_{\mathrm{MC,\,true\,lepton}} \right) \times I \qquad (6)$$

$$I = \frac{N^{\mathrm{data}}_{\mathrm{iso,SSP}} - N^{\mathrm{MC,\,true\,lepton}}_{\mathrm{iso,SSP}}}{N^{\mathrm{data}}_{\mathrm{non\text{-}iso,SSP}} - N^{\mathrm{MC,\,true\,lepton}}_{\mathrm{non\text{-}iso,SSP}}} \qquad (7)$$

R_QCD can then be calculated in each region separately for electrons and muons, and for 1- and 3-prong τ candidates.
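The chain of Eqs. (3)-(7) can be summarised in a short sketch; all yields are plain numbers here, while in the analysis they are binned in τ p_T and track multiplicity, separately for electrons and muons (variable names are illustrative):

```python
def fake_factor(n_data_pass, n_mc_true_pass, n_data_fail, n_mc_true_fail):
    """Eq. (4): F_i from the pass-tau / anti-tau yields in CR_i (i = W, QCD)."""
    return (n_data_pass - n_mc_true_pass) / (n_data_fail - n_mc_true_fail)

def isolation_factor(n_iso_data, n_iso_mc, n_noniso_data, n_noniso_mc):
    """Eq. (7): isolated over non-isolated lepton yields in the SSP region,
    with the true-lepton MC subtracted from both."""
    return (n_iso_data - n_iso_mc) / (n_noniso_data - n_noniso_mc)

def combined_fake_factor(f_w, f_qcd, n_qcd_anti_tau,
                         n_sr_anti_tau_data, n_sr_anti_tau_mc_true):
    """Eqs. (3) and (5): F = R_QCD * F_QCD + (1 - R_QCD) * F_W, where
    n_qcd_anti_tau is the Eq. (6) estimate of QCD events in the anti-tau region."""
    r_qcd = n_qcd_anti_tau / (n_sr_anti_tau_data - n_sr_anti_tau_mc_true)
    return r_qcd * f_qcd + (1.0 - r_qcd) * f_w
```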

1.4 Signal Region Definition

To exploit signal-sensitive event topologies, two inclusive analysis categories are defined in an exclusive way. The VBF category targets events with a Higgs boson produced via vector-boson fusion and is characterised by the presence of two high-p_T jets with a large pseudorapidity separation. Although this category is dominated by VBF events, it also includes smaller contributions from ggF and VH production. The boosted category targets events with a boosted Higgs boson produced via the ggF mechanism; Higgs boson candidates are therefore required to have a large transverse momentum. These inclusive categories are further split into multiple signal regions to improve the sensitivity to the Higgs boson signal.

The VBF inclusive category is defined by the following requirements:

• leading jet p_T > 40 GeV

• sub-leading jet p_T > 30 GeV

• leading and sub-leading jets must be well separated; this is achieved requiring:

– jets ∆η > 3

– jets must be in opposite hemispheres

• visible mass of the jets m_jj > 400 GeV

• the minimum jet η must be less than the light lepton or hadronic τ η, and the maximum jet η must be greater than the light lepton or hadronic τ η; this requirement is usually referred to as centrality

• the |∆η| and ∆R between the lepton and the τ are required to be less than 1.5 and 3.0, respectively


The VBF category is subsequently split into two further categories, called VBF Tight and VBF Loose. The VBF Tight category is defined by applying the following cuts on top of the VBF region definition:

• the mass of the two jets must be greater than 500 GeV.

• the p_T of the sum of the four-momenta of the lepton, the τ and E_T^miss (usually referred to as the Higgs p_T) must be greater than 100 GeV

For the VBF Loose categorisation, the event must pass the preselection and the VBF inclusive category, but fail the VBF Tight selections.

The Boosted inclusive category is defined by the following requirements:

• events should fail VBF inclusive requirements

• the p_T of the Higgs must be greater than 100 GeV

• the |∆η| and ∆R between the lepton and the τ are required to be less than 1.5 and 2.5, respectively

The Boosted category is subsequently split into two further categories, called Boosted High and Boosted Low. The Boosted High category is defined by applying the following cuts on top of the Boosted region definition:

• the p_T of the Higgs must be greater than 140 GeV

• the ∆R between the lepton and the τ is required to be less than 1.5

For the Boosted Low categorisation, the event must pass the preselection and the Boosted inclusive category, but fail the Boosted High selection.
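The full categorisation cascade described above can be sketched as follows; the event fields (e.g. higgs_pt) are illustrative names, not the actual analysis variables:

```python
def centrality(evt):
    """Lepton and tau eta both lie between the two jet etas."""
    lo = min(evt["jet1"]["eta"], evt["jet2"]["eta"])
    hi = max(evt["jet1"]["eta"], evt["jet2"]["eta"])
    return all(lo < x < hi for x in (evt["lep_eta"], evt["tau_eta"]))

def vbf_inclusive(evt):
    """VBF inclusive selection on the two leading jets."""
    j1, j2 = evt["jet1"], evt["jet2"]          # leading / sub-leading jets
    return (j1["pt"] > 40.0 and j2["pt"] > 30.0
            and abs(j1["eta"] - j2["eta"]) > 3.0
            and j1["eta"] * j2["eta"] < 0.0     # opposite hemispheres
            and evt["m_jj"] > 400.0
            and centrality(evt)
            and evt["dEta_lep_tau"] < 1.5 and evt["dR_lep_tau"] < 3.0)

def signal_region(evt):
    """Return the exclusive category of a preselected event, or None."""
    if vbf_inclusive(evt):
        tight = evt["m_jj"] > 500.0 and evt["higgs_pt"] > 100.0
        return "VBF Tight" if tight else "VBF Loose"
    if (evt["higgs_pt"] > 100.0
            and evt["dEta_lep_tau"] < 1.5 and evt["dR_lep_tau"] < 2.5):
        high = evt["higgs_pt"] > 140.0 and evt["dR_lep_tau"] < 1.5
        return "Boosted High" if high else "Boosted Low"
    return None
```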

The invariant mass of the ττ system, calculated using the Missing Mass Calculator (MMC) technique, is shown in fig. 4 for the various signal regions.

Figure 4: MMC distribution in the VBF and Boosted signal regions


1.5 Fit Model

A maximum likelihood fit is performed to extract the parameter of interest (POI):

$$\mu = \frac{\sigma_{H} \times \mathrm{BR}(H \to \tau\tau \to \text{lep-had})}{\left[ \sigma_{H} \times \mathrm{BR}(H \to \tau\tau \to \text{lep-had}) \right]_{\mathrm{SM}}} \qquad (8)$$

The likelihood is maximised on the MMC distributions in all signal regions, also using information from control regions included to constrain the background normalisations. The fitting model scheme is shown in fig. 5.

Figure 5: Fitting model

Currently, two types of fit are performed:

• Asimov fit, where the fit is performed on an Asimov dataset built by replacing the data in both signal and control regions with the MC prediction (see the toy sketch after this list).

• Low-mass fit, where the fit is performed only in the region MMC < 100 GeV, using real data in both control and signal regions.
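The real analysis extracts µ from a binned profile-likelihood fit over the MMC distributions with all nuisance parameters included; the Asimov idea can nevertheless be illustrated on a single counting bin using the asymptotic formula for the median discovery significance (toy numbers, purely illustrative):

```python
import math

# Asimov toy: the "data" is replaced by the exact expectation s + b, and the
# asymptotic median discovery significance is Z_A = sqrt(2*((s+b)*ln(1+s/b) - s)).

def asimov_significance(s, b):
    """Median expected discovery significance for s signal on b background."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

print(asimov_significance(50.0, 1000.0))   # ~1.57 sigma
```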

In Table 2, the significance for an Asimov fit performed only in the lepton-hadron final state is reported.

Fit type/Category   Combined   Boosted   VBF
Asimov              2.82591    1.9333    1.9349

Table 2: Significance for the standalone lep-had fit

After the fit, much time is spent scrutinising the fit results and searching for possible bugs and unexpected effects. To perform this type of task, I was involved in the development and maintenance of a fitting tool called FitBox, which is described in the next section.

2 Code development for xTauFramework and FitBox

Besides the physics analysis, I was also involved in the development and maintenance of two frameworks used by different sub-groups in the HLepton and TauCP groups:


• xTauFramework, a framework developed to produce flat ntuples by performing event selection on DxAOD samples. As a developer, some of my duties are:

– adding new tools provided by the different CP groups and giving feedback in case of errors or anomalous results (e.g. fJVT, HiggsWeightTool, METSignificanceTool, etc.);

– updating the framework to use the latest recommendations provided by the CP groups;

– adding variables and optimising the H → ττ → lep-had specific code according to the feedback provided by the other analysers.

These tasks give me the opportunity to stay in contact both with the different CP groups and with analysis groups such as the LFV analysis (which shares the same ntuples with the SM analysis) or the TauCP group's tag-and-probe analyses.

• FitBox, a tool to perform fit cross checks on a given workspace produced using HistFactory.

FitBox was initially created to provide NLL scans as additional material for the fit cross-checks performed by other tools. Subsequently, it was expanded with all the necessary fit cross-checks to become a fully independent tool:

– significance calculation

– pulls and nuisance parameters ranking plots – pre-post fit fitting variables distributions – NLL scans performed using different methods

FitBox is now regularly used by different analysis groups (SM/BSM H → ττ, BSM H → µµ, TauWG).
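As an illustration of what an NLL scan does, the following toy scans the signal strength of a single-bin Poisson model with invented inputs; FitBox itself runs such scans on full HistFactory workspaces, profiling the nuisance parameters at each scan point:

```python
import numpy as np

# Toy NLL scan: n ~ Pois(mu*s + b), scanning mu and plotting
# 2*(NLL(mu) - NLL_min); the crossing with 1.0 gives the asymptotic
# 68% CL interval.

def nll(mu, n_obs, s, b):
    """Poisson negative log-likelihood, up to a mu-independent constant."""
    lam = mu * s + b
    return lam - n_obs * np.log(lam)

n_obs, s, b = 1060.0, 50.0, 1000.0          # invented inputs
mu_scan = np.linspace(0.0, 3.0, 301)
delta = 2.0 * (nll(mu_scan, n_obs, s, b) - nll(mu_scan, n_obs, s, b).min())

mu_hat = mu_scan[np.argmin(delta)]          # best-fit signal strength
inside = mu_scan[delta < 1.0]               # asymptotic 68% CL interval
print(f"mu_hat = {mu_hat:.2f}, 68% CL ~ [{inside[0]:.2f}, {inside[-1]:.2f}]")
```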

Apart from knowledge of the RooStats/RooFit capabilities and their usage, the development of this tool has given and continues to give me the opportunity to apply various concepts which I learned in the courses I attended during the first PhD year (profile likelihood fit, etc.).

3 Teaching

The GAUSS requirements have already been fulfilled.

4 Lectures

The GAUSS requirements have already been fulfilled.

List of attended seminars:

• Mitarbeiterseminar during Summer semester 2017

• Mitarbeiterseminar during Winter semester 2017/2018

5 Given Talks

List of talks given in local meetings, collaboration meetings, workshops and conferences:


Meeting                                     Type of Talk                               Number of talks
AG Quadt/AG Lai Group Meeting Göttingen     PhD Status report                          2
Terascale Meeting                           Analysis Talk                              1
Tau Performance/Higgs to Leptons Workshop   Analysis Talk                              1
Corfu conference                            Analysis Talk                              1
DPG conference                              Analysis Talk                              1
EB meeting                                  Analysis Review                            1
Higgs Plenary Meeting                       Analysis Review                            2
HLepton Meeting                             Lep-Had studies + general xTauFW updates   10
LepHad Internal meeting                     xTauFW + analysis update                   12
Higgs Group Meeting Göttingen               Analysis updates                           3
xTauFW meeting                              Code development                           2
Pixel week                                  Pixel related studies                      1
Pixel DQ Workshop                           DQMF studies                               1
TauWG meeting                               Tau ID SF meas. (class 3 shift)            20
Seminar in Pisa                             Analysis Talk                              1

6 Attended Schools and Conferences

The H → ττ analysis in the lep-had final state was presented at:

• Corfu 2017, held in Corfu from 3/10/2017 to 9/10/2017. A talk was presented giving an overview of the analysis strategy;

• US ATLAS VBF/VBS Workshop, from 25/10/2017 to 27/10/2017. The talk was presented via Vidyo connection and mainly focused on the importance of the VBF final-state measurement in the context of H → ττ and on possible improvements due to specific VBF triggers;

• ATLAS Tau Performance/Higgs to Leptons Workshop, held in Munich from 23/10/2017 to 27/10/2017. The talk was presented in the "SM H → ττ" session and mainly focused on various analysis improvements, such as the addition of the TLT trigger and the introduction of the E_T^miss significance to reject the Z → ll background;

• Helmholtz Alliance Terascale meeting, held at DESY from 27/11/2017 to 29/11/2017. The talk was presented in the "Higgs" session, giving an overview of the analysis and the fit strategy;

• DPG Spring conference, held in Würzburg from 19/03/2018 to 23/03/2018. The analysis was presented in the "Higgs Boson Zerfall in Fermionen" session, and the talk mainly focused on possible improvements for the analysis to be performed on the full Run 2 dataset.

Moreover, I was selected to join the 2017 CERN-Fermilab HCP Summer School, held at CERN from 28/08/2017 to 6/09/2017. This was a great opportunity to gain a complete picture of physics at hadron colliders, from theoretical predictions to proton-beam generation and detector physics.

Finally, a talk was also given at the ATLAS physics workshop "Physics with 120 fb⁻¹", held at CERN from 4/12/2017 to 8/12/2017, concerning the most recent recommendations and developments provided by the TauWG for release 21.


7 OTP Shifts

Since I was based at CERN during 2017, I had the opportunity to do shifts in the ATLAS control room. Class 1 shifts include:

• ID DCS watcher. This shift is performed in the ATLAS Control Room (ACR); it consists in monitoring the Detector Control System (DCS) status, understanding and localising problems/errors, and reporting them to the various Inner Detector experts in case of emergency;

• Online DQ shifter in the ACR. This shift consists in DQ checks during data taking to spot detector inconsistencies/problems and take immediate action in order to avoid losing data.

Class 3 OTP shifts have been done both in the context of the Pixel Detector and of the TauWG:

• optimisation of the Pixel DQ checks in the Data Quality Monitoring Framework (DQMF); this task was assigned to me given the experience gained during my Qualification Task. The main target is to detect and correct errors/inconsistencies in the DQ checks which can lead to problem underestimation or confusion during Online DQ shifts. The current status of the task was shown both during the last Pixel week, held from 6/11/2017 to 9/11/2017 at CERN, and during the Pixel DQ Mini workshop, held at CERN on 14/02/2018.

• Tau identification efficiency scale-factor measurement; this was the first Run 2 measurement in which the scale factors were provided binned in τ p_T. To perform this measurement, the entire tag-and-probe analysis already performed in the TauWG was revised (background estimation, efficiency estimation through a likelihood fit, etc.). The final recommendations are now available for release 20.7, and the procedure used to obtain them will serve as a milestone for the next measurements.

                                 Class 1 / Class 2   Class 3
Obligation                       12.00               0.25
Done                             15.74               0.67
Difference (Done − Obligation)   +3.74               +0.42

Table 3: Shifts during the period from 01/01/2017 to 31/12/2017


Figure 6: Shifts during the period from 01/01/2017 to 31/12/2017

8 Further Activities

As a further activity, during the past months I supervised Francesco Lucarelli, a master's student from the University of Pisa, who was selected in the programme "Particle Physics to explore the universe" promoted by INFN. Francesco was involved in the study of hadronic τ decays using a tag-and-probe analysis selecting Z → ττ decays in which one τ decays leptonically and the other hadronically. The main focus of his studies is the validation of the new τ substructure reconstruction algorithm, which identifies the individual hadronic τ decay products (mainly charged and neutral pions). This will be a crucial step for studying the Higgs boson CP properties in the H → ττ decay.

9 Outreach

Regarding outreach activities, I joined the IdeenExpo 2017, held in Hannover from 10/06/2017 to 18/06/2017, to show and explain some experiments to the public. This was a really nice experience and it gave me the opportunity to show part of my job to a large audience.
