
Annual Report

Report Period: April 2016 - April 2017

PhD Year: 2

PhD Supervisor / Working Group: Prof. Arnulf Quadt, Prof. Vincenzo Cavasinni

Thesis Committee:

1. Prof. Arnulf Quadt

2. Prof. Stan Lai

3. Prof. Vincenzo Cavasinni

by

Antonio De Maria

ademari@phys.uni-goettingen.de


Contents

1 Measurement of the Higgs boson production cross section
  1.1 Motivation and Outline
  1.2 Events Preselection and Background estimation
  1.3 Fake-Factor method to estimate jet faking hadronic taus
  1.4 Signal Regions definition
  1.5 Missing Mass Calculator retuning for Run2 analysis
  1.6 Fit Model
  1.7 Outlook and Plans for the Next 12 Months
2 Code development for xTauFramework and FitBox
3 Teaching
4 Lectures
5 Given Talks
6 Attended Schools and Conferences
7 OTP Shifts
  7.1 Future OTP activities
8 Further Activities


1 PhD Topic: Measurement of the Higgs boson production cross section at 13 TeV in the H → ττ → lep-had decay with the ATLAS detector

1.1 Motivation and Outline

The ATLAS and CMS experiments discovered a Higgs boson in 2012, consistent with the last missing elementary particle in the Standard Model (SM) of electroweak and strong interactions.

Several properties of this boson were measured with 7 and 8 TeV centre-of-mass energy (√s) proton-proton (pp) collision data delivered by the Large Hadron Collider (LHC) in 2011 and 2012, respectively. These measurements have not shown significant deviations from the SM expectations. In particular, the coupling of the Higgs boson to the fermion sector has also been established with the observation of the H → ττ decay mode, with a signal significance of 5.5 σ from the combined ATLAS and CMS 7 and 8 TeV datasets.

Using the collision data now available at √s = 13 TeV, the detailed program of Higgs boson property measurements will be extended to reach a higher precision than the 7 and 8 TeV analyses, thanks to the expected increase in data statistics and in the Higgs boson production cross section. The H → ττ channel will continue to play an important role in measurements of the Higgs boson couplings to τ leptons, as well as in measurements of other properties of the Higgs boson, such as its charge-parity (CP) quantum numbers. In the following sections, the H → ττ → lep-had final state will be presented, where "lep" refers to the muon or electron produced in the leptonic τ decay.

1.2 Events Preselection and Background estimation

Events are selected to have exactly one light lepton (electron or muon) and at least one hadronic tau. Among the selected taus, the one with the highest pT (referred to as the leading τ) is chosen as the τ-candidate. The requirement of exactly one light lepton reduces the contamination from Z → ll events. At this stage of the selection, there is no requirement on the reconstruction quality (ID) of the τ, because of the Fake-Factor background estimation method, which is described in the next section.

After these selections, a trigger selection is applied depending on the flavour of the selected light lepton, and a matching between the trigger and the selected lepton is performed. Currently, the analysis mainly selects events using the lowest unprescaled Single-Lepton Trigger (SLT) for both the 2015 and 2016 datasets. Besides the SLT triggers, an effort is ongoing to also include a Tau + Lepton Trigger (TLT), which is mainly used to select events in the low lepton pT region.

In this case, both the lepton and the τ-candidate are required to match the different legs of the combined trigger. The two trigger regions, SLT and TLT, are selected according to the lepton transverse momentum and are mutually exclusive in order to avoid double counting of events, as shown in fig. 1.

In addition to these lepton and trigger selection criteria, the following preselection requirements are then imposed:

• lepton requirements:

– gradient isolation, required in order to reduce QCD contamination in the signal regions;

– medium quality reconstruction;

– transverse momentum greater than a threshold that depends on the trigger used;


Figure 1: trigger scheme SLT+TLT

• τ requirements:

– medium quality reconstruction;

– |η| < 2.4 (excluding the crack region, 1.37 < |η| < 1.52), |q| = 1, pT > 20 GeV;

– for MC samples, the truth PDG ID of the tau candidate (τ_id) must satisfy the conditions |τ_id| > 6 and τ_id ≠ 21. These conditions remove jets faking taus, which are modelled by the fake-factor method;

• the charges of the lepton and hadronic τ must have opposite sign;

• application of a b-jet veto. This veto removes events containing at least one b-tagged jet, in order to reduce tt̄ events in the signal region;

• the transverse mass (mT) between the lepton and the missing transverse energy (MET) must be less than 70 GeV. This cut is required in order to reduce W+jets events in the signal region.

All the backgrounds can be classified into three major categories:

• events with true lepton and τ_had signatures: Z/γ*, Dibosons, Top

• events where a jet fakes a τ_had signature: QCD jets, W+jets, Z → ll+jets, Dibosons

• events where a light charged lepton fakes a τ_had signature: Z → ll+jets

Different control regions are built to select a chosen background by inverting the requirements used to remove that background from the signal region. A summary of the different control regions is reported in table 1. The control regions allow the validation of the selections performed to separate signal and backgrounds.


Control region   Cuts
Z → ll           two leptons with same flavour
Top enriched     invert b-veto, mT > 40 GeV
W enriched       mT > 70 GeV
QCD enriched     invert lepton isolation

Table 1: CR selections

1.3 Fake-Factor method to estimate jet faking hadronic taus

The background from jets faking taus is a dominant background for the lep-had channel. While consisting mostly of W+jets events, fake taus also come from QCD, tt̄ and Z+jets events. The rate at which a jet fakes a tau differs depending on whether the jet is quark-initiated or gluon-initiated. The quark-gluon fraction differs between samples and also depends on the selection applied. Therefore, it is important to obtain the fake estimate from regions as close as possible to the signal regions. To estimate the jets-faking-taus background, the Fake-Factor method is currently adopted as the default: keeping all other selections the same, including the trigger selections and the requirements on the tau-candidate, the medium ID requirement is inverted, giving the definition of an anti-τ. It should be noted that a BDT score cut of 0.35 is applied to ensure that the anti-τs are sufficiently signal-like for this method, since at low BDT score the quark-gluon fraction changes significantly.

Events which contain a real tau and a fake lepton are not considered by this method. Furthermore, events where taus are faked by electrons are also not considered, as they are accounted for using a dedicated Z → ee tag-and-probe analysis performed by the TauWG. As a result of this analysis, two strategies to reject electrons faking taus are provided, one based on a likelihood rejection and the other on a BDT discriminant; scale factors and systematics are also available for both methods.

The fake estimate is determined in the following way: the number of anti-τs is taken from data, and events from Monte Carlo backgrounds such as Z → ττ where the anti-tau is not matched to a truth jet are subtracted from it. This is then multiplied by a fake-factor, which is binned in τ transverse momentum and number of tracks. From this, the estimate for each signal region is obtained:

N^SR_fakes = (N^data_anti-τ − N^MC,not j→τ_anti-τ) × FF   (1)

The combined fake-factor for each signal region is constructed as the sum of the individual fake-factors (FF_i) for each relevant process i, weighted by its expected fraction of events in the anti-τ region (R_i):

FF = R_W FF_W + R_Z FF_Z + R_Top FF_Top + R_QCD FF_QCD   (2)

The individual fake-factors (FF_i) are determined in dedicated control regions for each process, separately for each analysis category. The definitions of the control regions are presented in table 1. It may be noted that these control regions are all defined to be very close to the corresponding signal region, inverting only some cuts to preserve the orthogonality with respect to the signal region.

Each control region is then further split into a pass and a fail region, depending on whether the τ candidate passed or failed the medium ID requirement. The signal contribution in the control regions is assumed to be negligible, as is the QCD-multijet contamination in all CRs except the one for QCD. The individual fake-factor FF_i is then obtained in the corresponding CR as the ratio of data events that pass the tau ID requirement over those that fail it. Contributions from processes other than the one in question (denoted as not-i), as well as events where the τ is not faked by a jet (denoted as not j→τ), are in each case subtracted from the data yield:

FF_i = (N^data_pass,CR-i − N^MC,not-i_pass,CR-i − N^MC,not j→τ_pass,CR-i) / (N^data_fail,CR-i − N^MC,not-i_fail,CR-i − N^MC,not j→τ_fail,CR-i)   (3)

The expected fraction of events in the anti-τ region (R_i) is determined from MC, except for R_QCD, which is obtained as R_QCD = 1 − Σ_i R_i:

R_i = N^i,MC,j→τ_fail,SR / (N^data_fail,SR − N^MC,not j→τ_fail,SR)   (4)
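Equations (1)-(4) combine straightforwardly. The following is a minimal numerical sketch of that combination, with invented yields, fractions and fake-factors rather than analysis values:

```python
# Minimal sketch of the fake-factor combination of Eqs. (1)-(4).
# All input numbers below are illustrative, not analysis results.

def individual_ff(n_data_pass, n_mc_noti_pass, n_mc_notjet_pass,
                  n_data_fail, n_mc_noti_fail, n_mc_notjet_fail):
    """Eq. (3): pass/fail ratio in a process-i control region, after
    subtracting other processes and non-jet fakes from the data yields."""
    num = n_data_pass - n_mc_noti_pass - n_mc_notjet_pass
    den = n_data_fail - n_mc_noti_fail - n_mc_notjet_fail
    return num / den

def combined_ff(fractions, ffs):
    """Eq. (2): FF = sum_i R_i * FF_i, where `fractions` holds the MC-derived
    R_i and R_QCD is taken as 1 minus their sum, as in Eq. (4)."""
    r_qcd = 1.0 - sum(fractions.values())
    return sum(fractions[p] * ffs[p] for p in fractions) + r_qcd * ffs["QCD"]

def fake_estimate(n_data_antitau, n_mc_notjet_antitau, ff):
    """Eq. (1): fakes in the signal region from the anti-tau region."""
    return (n_data_antitau - n_mc_notjet_antitau) * ff

# hypothetical per-process fractions R_i and fake-factors FF_i
fractions = {"W": 0.55, "Z": 0.10, "Top": 0.15}          # implies R_QCD = 0.20
ffs = {"W": 0.18, "Z": 0.15, "Top": 0.20, "QCD": 0.12}
ff = combined_ff(fractions, ffs)
n_fakes = fake_estimate(n_data_antitau=5000, n_mc_notjet_antitau=800, ff=ff)
```

In the real analysis the fake-factors are additionally binned in τ pT and track multiplicity; the scalar version above only shows the arithmetic of one bin.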

1.4 Signal Regions definition

To exploit signal-sensitive event topologies, two inclusive analysis categories are defined in an exclusive way. The VBF category targets events with a Higgs boson produced via vector boson fusion and is characterised by the presence of two high-pT jets with a large pseudorapidity separation. Although this category is dominated by VBF events, it also includes smaller contributions from ggF and VH production. The boosted category targets events with a boosted Higgs boson produced by the ggF mechanism. Higgs boson candidates are therefore required to have a large transverse momentum. These inclusive categories are further split into multiple signal regions to improve the sensitivity to Higgs boson production.

The VBF inclusive category is defined by the following requirements:

• leading jet pT > 40 GeV

• sub-leading jet pT > 30 GeV

• leading and sub-leading jets must be well separated; this is achieved by requiring:

  – ∆η between the jets > 3

  – jets in opposite hemispheres

• visible mass of the two jets m_jj > 300 GeV

• the minimum jet η must be less than the light lepton and hadronic τ η, and the maximum jet η must be greater than the light lepton and hadronic τ η; this requirement is usually referred to as centrality

• the MET must be greater than 20 GeV

• the |∆η| and ∆R between the lepton and the τ are required to be less than 1.5 and 3.0, respectively

The VBF category is subsequently split into two further categories, called VBF Tight and VBF Loose. VBF Tight is defined by applying the following cuts on top of the VBF region definition:

• the mass of the two jets must be greater than 500 GeV

• the pT of the sum of the four-momenta of the lepton, τ and MET (usually referred to as the Higgs pT) must be greater than 100 GeV

• the pT of the τ must be greater than 30 GeV

For the VBF Loose categorisation, the event must pass the preselection and the VBF inclusive category, but fail the VBF Tight selections.

The ggH Boosted inclusive category is defined by the following requirements:

• events must fail the VBF inclusive requirements

• the Higgs pT must be greater than 100 GeV

• the MET must be greater than 20 GeV

• the pT of the τ must be greater than 30 GeV

• the |∆η| and ∆R between the lepton and the τ are required to be less than 1.5 and 2.5, respectively

The Boosted category is subsequently split into two further categories, called Boosted High and Boosted Low. Boosted High is defined by applying the following cuts on top of the Boosted region definition:

• the Higgs pT must be greater than 140 GeV

• the ∆R between the lepton and the τ must be less than 1.5

For the Boosted Low categorisation, the event must pass the preselection and the Boosted inclusive category, but fail the Boosted High selection.
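The category definitions above can be summarised in a small, mutually exclusive categorisation function. This is a sketch assuming a flat dict of precomputed event variables; the field names are invented, and the ordering (VBF tried first, boosted only for events failing VBF) follows the text.

```python
# Sketch of the exclusive signal-region categorisation described above.
# Field names are assumptions for this example; all thresholds in GeV.

def passes_vbf_inclusive(ev):
    return (ev["jet1_pt"] > 40.0 and ev["jet2_pt"] > 30.0
            and ev["jets_deta"] > 3.0 and ev["jets_opposite_hemispheres"]
            and ev["mjj"] > 300.0 and ev["lep_tau_central"]
            and ev["met"] > 20.0
            and ev["deta_lep_tau"] < 1.5 and ev["dr_lep_tau"] < 3.0)

def passes_vbf_tight(ev):
    return ev["mjj"] > 500.0 and ev["higgs_pt"] > 100.0 and ev["tau_pt"] > 30.0

def passes_boosted(ev):
    return (ev["higgs_pt"] > 100.0 and ev["met"] > 20.0 and ev["tau_pt"] > 30.0
            and ev["deta_lep_tau"] < 1.5 and ev["dr_lep_tau"] < 2.5)

def categorise(ev):
    """Return the signal region for a preselected event, or None.
    VBF is tried first; boosted implicitly requires failing VBF."""
    if passes_vbf_inclusive(ev):
        return "VBF Tight" if passes_vbf_tight(ev) else "VBF Loose"
    if passes_boosted(ev):
        return ("Boosted High"
                if ev["higgs_pt"] > 140.0 and ev["dr_lep_tau"] < 1.5
                else "Boosted Low")
    return None
```

Because each event is assigned to at most one region, the four signal regions are orthogonal by construction, as required for the combined fit.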

1.5 Missing Mass Calculator retuning for Run2 analysis

A key variable of this analysis is the invariant mass of the taus arising from the Higgs decay. An accurate reconstruction of a resonance mass decaying into a pair of tau leptons is a difficult task because of the presence of multiple undetected neutrinos from the tau decays. The Missing Mass Calculator (MMC) is a sophisticated method to optimise the di-τ invariant mass reconstruction.

It is based on the requirement that the mutual orientations of the neutrinos and the other decay products are consistent with the mass and decay kinematics of a tau lepton. This is achieved by minimizing a likelihood function defined in the kinematically allowed phase-space region. The MMC was already one of the most powerful tools used in SM Higgs to tau tau searches during Run 1 at the LHC. During Run 2, considerable effort is needed to adapt the analysis tools to the new experimental conditions. Among these tools, the MMC needs to be retuned in order to continue to play a key role in the searches for the Higgs boson in di-tau final states. During this year, I was involved in the MMC retuning for the H → ττ → lep-had channel. I had already performed this type of study last year for the H → ττ → had-had channel, and my previous results have been fully confirmed by new, similar studies. Results for the lep-had channel were presented at the bi-weekly Mass Task Force meetings. The retuning procedure has been tested and the final results are comparable with the Run 1 results. As the final outcome of these studies, a new retuned version of the MMC has been released and is currently used in the analysis.
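The MMC itself involves a likelihood scan over the unmeasured neutrino kinematics and is not reproduced here. As a simpler illustration of the same reconstruction problem, the well-known collinear approximation assumes each neutrino is emitted along its visible tau decay products, so the momentum fractions x1, x2 carried by the visible systems can be solved from the two MET components. This is a didactic stand-in, not the MMC algorithm:

```python
import math

def collinear_mass(vis1, vis2, met):
    """Di-tau mass in the collinear approximation.
    vis1, vis2: (px, py, pz, E) of the visible tau decay products, in GeV.
    met: (mex, mey) missing transverse momentum, in GeV.
    Returns None when the approximation breaks down (x outside (0, 1])."""
    p1x, p1y, _, _ = vis1
    p2x, p2y, _, _ = vis2
    mex, mey = met
    # Solve MET = a1 * pTvis1 + a2 * pTvis2 for a_i = (1 - x_i) / x_i
    det = p1x * p2y - p1y * p2x
    if abs(det) < 1e-9:  # back-to-back visible taus: no unique solution
        return None
    a1 = (mex * p2y - mey * p2x) / det
    a2 = (p1x * mey - p1y * mex) / det
    x1, x2 = 1.0 / (1.0 + a1), 1.0 / (1.0 + a2)
    if not (0.0 < x1 <= 1.0 and 0.0 < x2 <= 1.0):
        return None
    # Visible mass from the four-momentum sum, then scaled up
    e = vis1[3] + vis2[3]
    px = vis1[0] + vis2[0]
    py = vis1[1] + vis2[1]
    pz = vis1[2] + vis2[2]
    m_vis = math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))
    return m_vis / math.sqrt(x1 * x2)
```

The collinear solution fails for back-to-back topologies and for unphysical x values, which is part of why a likelihood-based method such as the MMC, with much higher reconstruction efficiency, is preferred in the analysis.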


1.6 Fit Model

A maximum likelihood fit is performed to extract the parameter of interest (POI):

μ = [σ_H × BR(H → ττ → lep-had)] / [σ_H × BR(H → ττ → lep-had)]_SM   (5)

The likelihood is maximized on the MMC distributions in all signal regions, also using information from control regions included to constrain the background normalizations. The fitting model scheme is reported in fig. 2.

Figure 2: Fitting model

Currently, two types of fit are performed:

• Asimov fit, where the fit is performed on an Asimov dataset built by substituting the data in both signal regions and control regions with the MC prediction.

• Hybrid fit, where the fit is performed using data in the control regions and in the sidebands (MMC < 100 GeV and MMC > 150 GeV).
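As a toy illustration of extracting the signal strength μ defined above (not the actual HistFactory workspace, which includes systematics and many regions), a binned Poisson likelihood can be maximised numerically. For a single bin with observed count n, signal s and background b, the maximum sits at μ̂ = (n − b)/s, which the scan reproduces:

```python
import math

# Toy sketch of a binned Poisson maximum-likelihood fit for the signal
# strength mu. The real analysis fits a HistFactory workspace with nuisance
# parameters; all numbers below are invented for illustration.

def nll(mu, n_obs, sig, bkg):
    """Negative log-likelihood for Poisson bins with expectation mu*s + b
    (n-independent constant terms dropped)."""
    total = 0.0
    for n, s, b in zip(n_obs, sig, bkg):
        expected = mu * s + b
        total += expected - n * math.log(expected)
    return total

def fit_mu(n_obs, sig, bkg, lo=0.0, hi=3.0, steps=30000):
    """Crude grid-scan minimisation of the NLL over mu."""
    best_mu, best_nll = lo, float("inf")
    for i in range(steps + 1):
        mu = lo + (hi - lo) * i / steps
        val = nll(mu, n_obs, sig, bkg)
        if val < best_nll:
            best_mu, best_nll = mu, val
    return best_mu

# Single-bin example: with n = 130, s = 25, b = 100 the analytic maximum
# is mu-hat = (n - b) / s = 1.2.
mu_hat = fit_mu(n_obs=[130], sig=[25], bkg=[100])
```

Scanning the NLL around the minimum in this way is also the idea behind the NLL scans produced by FitBox, described below in section 2.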

After the fit, much time is spent scrutinizing the fit results and searching for possible bugs and unexpected effects. To perform these tasks, I was involved in the development and maintenance of a fitting tool called FitBox, which is described in the next section.

1.7 Outlook and Plans for the Next 12 Months

The analysis team aims to unblind the analysis in the next months, using the full 2015+2016 dataset collected by ATLAS at √s = 13 TeV. Various iterations with an Editorial Board are ongoing, and both a CONF note and a paper draft are available. The aim is therefore to push as much as possible to get this publication out, and possibly to continue the work in Higgs CP analyses built on top of the coupling analysis.

2 Code development for xTauFramework and FitBox

Besides the physics analysis, I was also involved in the development and maintenance of two frameworks used by different sub-groups in the HLepton and TauCP groups:


• xTauFramework, a framework developed to produce flat ntuples by performing event selection on DxAOD samples. As a developer, some of my duties are:

– adding new tools provided by the different CP groups, giving feedback in case of errors or anomalous results (e.g. fJVT, EgammaChargeCorrectionTool, etc.);

– updating the framework to use the latest recommendations provided by the CP groups;

– adding new variables;

– optimising and maintaining the H → ττ → lephad specific code according to the feedback provided by the other analysers.

These tasks give me the opportunity to stay in contact both with the different CP groups and with analysis groups such as the LFV analysis (which shares the same ntuples with the SM analysis) and the TauCP group's tag-and-probe analyses.

• FitBox, a tool to perform fit cross-checks on a given workspace produced using HistFactory. FitBox was initially created to provide NLL scans as additional material to the fit cross-checks performed by other tools. Subsequently, the tool was expanded with all the necessary fit cross-checks to make it a fully independent tool:

– significance calculation

– pulls and nuisance parameter ranking plots

– pre- and post-fit distributions of the fitted variables

– morphing template histograms

– NLL scans performed using different methods

FitBox is now regularly used by different groups and analysers. Apart from knowledge of RooStats/RooFit capabilities and their usage, the development of this tool has given me the opportunity to apply various concepts which I learned in the courses I attended in the first PhD year (profile likelihood fit, etc.).

3 Teaching

List of teaching activities:

• Physik IV Übung (Prof. A. Quadt). GAUSS requirements have been fulfilled.

4 Lectures

GAUSS requirements have been already fulfilled.

List of attended seminars:

• Mitarbeiterseminar during Summer semester 2016

• Mitarbeiterseminar during Winter semester 2016/2017


5 Given Talks

List of talks given in local meetings, collaboration meetings, workshops and conferences:

Meeting                                     Type of talk                               Number of talks
AG Quadt/AG Lai Group Meeting Göttingen     PhD status report                          2
ATLAS-D physics meeting                     Analysis talk                              1
Terascale Meeting                           Analysis talk                              1
Tau Performance/Higgs to Leptons Workshop   Analysis talk                              1
DPG conference                              Analysis talk                              1
EB meeting                                  Analysis review                            1
HLepton Meeting                             Lep-Had studies + general xTauFW updates   12
LepHad Internal meeting                     xTauFW channel code update                 6
Higgs Group Meeting Göttingen               Analysis updates                           4
Mass Task Force meeting                     MMC studies                                3
xTauFW meeting                              Code development                           2
Pixel week                                  Pixel related studies                      3
Pixel DQ Workshop                           DQMF studies                               1
Data Quality Meeting                        Data Quality report (class 2 shift)        4
TauWG meeting                               xTauFW report + fit studies                3

6 Attended Schools and Conferences

The H → ττ lep-had final state analysis was presented at:

• ATLAS-D physics meeting 2016, held in Heidelberg from 04/10/2016 to 07/10/2016. The analysis was presented in the "VBF/VBS Signatures" session, emphasizing the role of the VBF signal category in the final fit results;

• ATLAS Tau Performance/Higgs to Leptons Workshop, held in Sheffield from 24/10/2016 to 28/10/2016. The talk was presented in the "SM H → τ τ " session and it was mainly focused on various analysis improvements, like the addition of the TLT trigger, and τ related problems like the Tau Electron Veto rejection;

• 10th Annual Meeting of the Helmholtz Alliance "Physics at the Terascale", held at DESY Hamburg from 21/11/2016 to 23/11/2016. The talk was presented in the "Higgs" session, giving an overview of the analysis strategy;

• DPG Spring conference, held in Münster from 27/03/2017 to 31/03/2017. The analysis was presented in the "Higgs Boson Zerfall in Tau Leptonen" session, and the talk was mainly focused on the various fit cross-checks which need to be performed to make the results more robust.

7 OTP Shifts

OTP shifts have been done in the context of the Pixel Detector:

• class 2 shift: Pixel offline DQ shift. This shift consists of checking the quality of the data taken with the Pixel detector at the calibration-loop stage of data processing, before the bulk processing. The results of the quality checks from this shift feed into the bulk processing, so the shift plays an important role in data reconstruction;


• class 2 shift: Pixel offline DQ expert. Given my experience in the Pixel offline DQ shift, I was also qualified as a Pixel offline DQ expert, whose main role is to supervise the offline DQ shifter, answering his/her questions about problems during the shift. Moreover, the Pixel offline DQ expert signs off runs for the bulk processing, performing additional checks with respect to the ones performed by the offline shifter;

• class 3 shift: study of the effects of increasing Pixel detector masking on egamma performance. The aim of this project is to check the impact of different layer mask combinations on electron reconstruction. The project is done under the supervision of Yosuke Takubo and in close collaboration with Daiki Yamaguchi. The analysis strategy was discussed and fixed together, giving me the possibility to experiment with new ideas and suggest further developments. The major steps of this project are:

– applying different Pixel detector masks in the Athena reconstruction algorithm. This makes it possible to produce modified xAOD samples which can be analysed to extract the results of the masking procedure;

– analysing the samples produced in the previous step. To do this, an xAOD analysis code has been developed and tested.

This class 3 shift has now ended, and new class 3 shifts have already been assigned to me; these are discussed in the next sub-section.

7.1 Future OTP activities

The following OTP activities/tasks are scheduled for the next months:

• class 1 shift: ID DCS watcher. This shift is performed in the ATLAS Control Room (ACR); it consists of monitoring the Detector Control System (DCS) status, understanding and localising problems/errors, and reporting them to the various Inner Detector experts in case of emergency. No specific training is required to perform this shift;

• class 1 shift: Online DQ shifter in the ACR. This shift consists of DQ checks during data taking to spot detector inconsistencies/problems and take immediate action in order to avoid losing data. To perform this task, three steps are needed:

– attend a dedicated training (already booked for May 2017);

– pass a test;

– book at least 2 shadow shifts (already booked) in the ACR, in order to follow detector operation and pick up practical skills from already qualified shifters;

• class 3 shift: optimisation of Pixel DQ checks in the Data Quality Monitoring Framework (DQMF); this task was assigned to me given the experience gained during my Qualification Task. The main target will be to detect and correct errors/inconsistencies in the DQ checks which can lead to problems being underestimated, or to confusion, during the Online DQ shift. The current status of the task was shown during the last Pixel week, held from 20/03/2017 to 23/03/2017 at CERN;

• class 3 shift: revision of the Online Histogram Presenter (OHP) for the Online ID DQ shift. OHP is one of the monitoring tools used in the ACR to display histograms useful for detecting data-taking related problems. Past experience showed that shifters are often confused by missing alarms from histograms and by the large number of displayed histograms. The task will consist mainly of cleaning up the display, adding missing histograms and propagating alarms from DQMF. The current status of the task was shown during the last Pixel week, held from 20/03/2017 to 23/03/2017 at CERN.

                                 Class 1 / Class 2   Class 3
Obligation                       11.24               0.20
Done                             15.98               0.26
Difference (Done − Obligation)   +4.74               +0.06

Table 2: Previous Year, from 01/04/2016 to 01/01/2017 (Finished)

Figure 3: Previous Year, from 01/04/2016 to 01/01/2017 (Finished)


                                 Class 1 / Class 2   Class 3
Obligation                       11.24               0.20
Done                             3.93
Difference (Done − Obligation)

Table 3: Present Year (Ongoing)

Figure 4: Present Year (Ongoing)

8 Further Activities

As further activities, during this year I was also involved in:

• collaboration with Paul Krug Costantin on Higgs CP studies for his bachelor thesis. He investigated the CP nature of the Higgs boson based on the spin correlation between the taus produced in the Higgs decay. In this case the Higgs boson can be parametrized by a scalar/pseudo-scalar Higgs mixing angle Φ_τ, which can be determined in H → τ+τ− decays with subsequent τ-lepton decays to charged prongs. In order to define an observable φ which is sensitive to Φ_τ, two reconstruction methods can be used: the ρ-decay plane method for τ± → ρ± and the impact parameter method for all other major tau decays. Paul was able to re-implement both methods in his own code and perform useful validation checks at truth level with respect to the results currently used by the H → ττ CP group;

• supervising Brendan Marsch, an undergraduate student from the University of Missouri, who spent roughly 3 months in Göttingen during the past summer as part of the DAAD-RISE program. Below is the abstract of the report he wrote at the conclusion of his stay:

A multivariate analysis is presented for the study of the vector boson fusion (VBF) Higgs boson decaying to a pair of tau leptons. While the VBF production mechanism of the Higgs is roughly an order of magnitude lower in cross section than the dominant gluon-gluon fusion mechanism, it is shown that VBF produces a distinctive signature that is well suited for detection by multivariate analyses. A number of discriminant variables are explored, in addition to a direct comparison of different machine learning toolkits. Ultimately, a statistical significance of 7.9 is achieved for detection of the VBF Higgs boson in this truth-level study.

