
2.3 The KLOE experiment

2.3.4 The data acquisition

The main goal of the KLOE Data Acquisition (DAQ) system [93] was to collect data at a maximum rate of 50 Mb/s from the ∼ 13000 channels of the DC, the ∼ 5000 ADC and TDC channels of the EMC, and from the trigger system.²
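The quoted bandwidth and the event rate given in footnote 2 can be checked against each other with a one-line calculation (assuming the 5 kb average event size is expressed in kilobits, matching the Mb/s bandwidth):

```python
# Consistency check of the DAQ bandwidth figures quoted in the text.
bandwidth_bps = 50e6       # maximum DAQ throughput: 50 Mb/s
event_size_bits = 5e3      # average event size: 5 kb (assumed kilobits)
rate_hz = bandwidth_bps / event_size_bits
print(rate_hz)             # 10000.0, i.e. the 10 kHz quoted in footnote 2
```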

The DAQ system is required to be flexible, and the integrity of the events has to be checked continuously online, with a dead time that is constant and independent of the event topology.

The KLOE Data Acquisition system is based on two levels of high-speed data concentration for buffering data coming from the Front-End Electronics (FEE) connected to the detector and an online farm of CPU’s for recording events, as shown in Fig. 2.16.

The first level (L1) of the DAQ is arranged in 10 chains (4 dedicated to the acquisition of the EMC, 4 to the DC and 2 to the trigger). These chains are composed of up to six VME crates, each one hosting 16 slave boards and a Read Out Controller (ROCK) collecting information from the FEE via the AUX-bus, a custom protocol developed specifically for the KLOE DAQ. All the ROCKs in a chain are connected to a controller manager (ROCKM) through a custom fast bus (C-bus).
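The L1 hierarchy described above can be summarized with a minimal data-structure sketch. The class and field names are illustrative and not taken from the KLOE software; the number of crates per chain (up to six in the real system) is fixed at six here for simplicity:

```python
from dataclasses import dataclass, field

@dataclass
class Crate:
    slave_boards: int = 16              # FEE boards, read out via the AUX-bus
    # each crate hosts a ROCK (represented implicitly here)

@dataclass
class Chain:
    subsystem: str                      # "EMC", "DC" or "trigger"
    crates: list = field(default_factory=list)
    # all ROCKs of a chain report to one ROCKM over the C-bus

def make_chain(subsystem, n_crates=6):
    return Chain(subsystem, [Crate() for _ in range(n_crates)])

# 10 chains in total: 4 for the EMC, 4 for the DC and 2 for the trigger
chains = ([make_chain("EMC") for _ in range(4)]
          + [make_chain("DC") for _ in range(4)]
          + [make_chain("trigger") for _ in range(2)])

print(len(chains))                      # 10
```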

In the second level (L2), the ROCKMs build pieces of events produced by the FEE and tagged by a trigger number (sub-events), which are acquired in streams by VME processors equipped with FDDI interfaces. Two software processes running asynchronously on the processors, the Collector and the Sender, manage the read-out activity: the Collector accesses the ROCKM memory and pushes data frames belonging to different triggers into a FIFO-structured shared memory (circular buffer), while the Sender retrieves a given amount of sub-events from the queue and transmits them to the online farm via a fast FDDI connection.

² Given an average event size of 5 kb, this throughput corresponds to an event rate of 10 kHz, made up of φ decays, Bhabha scattering and cosmic-ray events.

Figure 2.16 The architecture of the Data Acquisition system.
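The Collector/Sender hand-off is a classic producer and consumer pair around a FIFO buffer. The sketch below illustrates the pattern with a Python queue standing in for the shared-memory circular buffer and a list standing in for the FDDI transmission; all names and the batch size are illustrative, not taken from the KLOE software:

```python
import queue
import threading

BATCH = 4                            # sub-events sent per transfer (assumed)
buffer = queue.Queue(maxsize=64)     # FIFO stand-in for the circular buffer
sent = []                            # stand-in for the FDDI link to the farm

def collector(n_triggers):
    """Push one data frame per trigger into the shared FIFO."""
    for trig in range(n_triggers):
        buffer.put({"trigger": trig, "data": b"frame"})
    buffer.put(None)                 # end-of-run marker (illustrative)

def sender():
    """Drain the FIFO in batches and 'transmit' each batch."""
    batch = []
    while True:
        item = buffer.get()
        if item is None:
            break
        batch.append(item)
        if len(batch) == BATCH:
            sent.append(batch)
            batch = []
    if batch:                        # flush the last, partial batch
        sent.append(batch)

t1 = threading.Thread(target=collector, args=(10,))
t2 = threading.Thread(target=sender)
t1.start(); t2.start(); t1.join(); t2.join()
print(len(sent))                     # 3 batches: 4 + 4 + 2 sub-events
```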

A Data Flow Control (DFC) process provides the addresses of the online farm CPUs, guaranteeing that all sub-events with the same trigger number are sent to the same online-farm CPU. The Receiver process is in charge of catching these sub-events and putting them in a circular buffer; the Builder process then merges them together to build a whole event in the YBOS format [94].
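The key invariant here, that equal trigger numbers always reach the same farm node, can be illustrated with a toy routing and event-building loop. All names are hypothetical, and a simple modulo mapping replaces whatever addressing scheme the real DFC used:

```python
N_FARM_CPUS = 4

def dfc_route(trigger_number):
    """Pick a farm CPU deterministically, so equal triggers map alike."""
    return trigger_number % N_FARM_CPUS

# sub-events (trigger number, detector, payload) from the L2 processors
sub_events = [(7, "EMC", "e"), (7, "DC", "d"), (8, "EMC", "e"), (7, "TRG", "t")]

# Receiver side: per-CPU buffers keyed by trigger number
farm = {cpu: {} for cpu in range(N_FARM_CPUS)}
for trig, det, payload in sub_events:
    farm[dfc_route(trig)].setdefault(trig, []).append((det, payload))

# Builder: merge all sub-events of one trigger into a complete event
events = {trig: parts for cpu in farm.values() for trig, parts in cpu.items()}
print(sorted(events))           # [7, 8]
print(len(events[7]))           # 3 sub-events merged for trigger 7
```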

Events are saved on tape and on disk by the Recorder process. Another process, the Spy-Daemon, reads the formatted events and writes them to a spy buffer, on which various monitoring and calibration tasks are performed at a third level (L3): for example, Bhabha and γγ events are written to the l3bha buffer, while cosmic-ray events are written to the l3cos buffer.
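The Spy-Daemon dispatch amounts to copying each formatted event into the spy buffer matching its category. A minimal sketch, in which the event representation and the classification rule are purely illustrative:

```python
# Per-category spy buffers, named after the L3 streams mentioned in the text
spy_buffers = {"l3bha": [], "l3cos": []}

def dispatch(event):
    """Copy an event into the matching spy buffer, if any."""
    if event["type"] in ("bhabha", "gammagamma"):
        spy_buffers["l3bha"].append(event)
    elif event["type"] == "cosmic":
        spy_buffers["l3cos"].append(event)
    # all other events are simply not spied on

for ev in [{"type": "bhabha"}, {"type": "cosmic"},
           {"type": "phi"}, {"type": "gammagamma"}]:
    dispatch(ev)

print(len(spy_buffers["l3bha"]), len(spy_buffers["l3cos"]))   # 2 1
```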

The uniformity and stability of the acquired data and of the detector performance are continuously controlled by a number of dedicated processes. Among the procedures implemented, Trgmon exploits the pattern of the information acquired from the trigger chains to quickly check the luminosity, the data and background rates and other relevant quantities; histogram servers and the event display also use the shared-memory mechanism to fetch data.

Some procedures are performed run by run, such as the measurement of the beam energies (carried out by the Trkmon process using Bhabha events) and the monitoring of the general quantities of the experiment through dedicated histogram browsers, while other jobs, such as the drift chamber and calorimeter energy/time calibrations, are performed periodically.

3 Reconstruction and event classification

This chapter is dedicated to a general description of the official KLOE reconstruction and event classification. After a short overview of the data-taking conditions and of the composition of the data set collected so far (Sect. 3.1), the data reconstruction procedure is described in its relevant steps (Sect. 3.2). The algorithms adopted for the event classification are then summarized in Sect. 3.3, with particular care to the case of φ → K+K− events. Finally, some relevant features of the official KLOE Monte Carlo are discussed in Sect. 3.4.

3.1 KLOE data taking

KLOE started data acquisition in 1999 and concluded its first run in September 2002. During these years the data taking was uninterrupted, except for a short period in which DAΦNE operated for the DEAR experiment and for several stops due to machine studies and improvements.

In the first period the instantaneous luminosity was about two orders of magnitude lower than the value projected for DAΦNE. Unfortunately, the contamination from various backgrounds was also very high: during the years 2000-2001 the average trigger rate was 2.5 kHz, of which only ∼ 250 Hz derived from e+e− collisions (φ and Bhabha events), the remainder being due to cosmic rays and machine background.

Year   Integrated luminosity
1999   4 pb−1
2000   20 pb−1
2001   170 pb−1
2002   300 pb−1

Table 3.1 Integrated luminosity during the years 1999-2002 of KLOE data taking.

Some development studies performed on DAΦNE during the running period brought the peak luminosity up to ∼ 5 · 10³¹ cm⁻²s⁻¹ in 2001 and ∼ 8 · 10³¹ cm⁻²s⁻¹ in 2002. As a result, the integrated luminosity has been continuously increasing in recent years, as can be seen from Tab. 3.1, allowing a total of ∼ 500 pb−1 to be collected. The plot in Fig. 3.1 shows the increase in integrated luminosity during each year as a function of time.
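The quoted total can be checked against the yearly values of Tab. 3.1 with a trivial sum:

```python
# Yearly integrated luminosities from Tab. 3.1, in pb−1
lumi_pb = {1999: 4, 2000: 20, 2001: 170, 2002: 300}
total = sum(lumi_pb.values())
print(total)    # 494 pb−1, consistent with the quoted total of ∼ 500 pb−1
```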

The background conditions also improved steadily in 2002 (by more than a factor of 4 with respect to the end of 2001). Further improvements in the luminosity are expected by the end of 2003 with the insertion of a new interaction region.

Fig. 3.2 shows an example of the output of the online monitoring for a normal day of KLOE data taking. The middle plot shows the luminosity as a function of time during the day: at each point, the luminosity estimate carries a relative statistical error of 4%. A higher accuracy (of the order of 1%) is reached by fully reconstructing the selected Bhabha events.

Figure 3.1 Integrated luminosity in pb−1 versus the number of days of data taking for the years 2000, 2001 and 2002.