Publication catalog - books



IFAE 2006: Incontri di Fisica delle Alte Energie Italian Meeting on High Energy Physics

Guido Montagna ; Oreste Nicrosini ; Valerio Vercesi (eds.)

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Not available.

Availability

Institution detected: Not detected
Publication year: 2007
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-88-470-0529-7

Electronic ISBN

978-88-470-0530-3

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Italia 2007

Subject coverage

Table of contents

Gamma-ray Astronomy with full coverage experiments

Paola Salvini

Full coverage experiments in gamma astronomy are detectors observing most of the shower particles by means of a nearly continuous active area: in this way one may obtain a lower energy threshold than that typical of the EAS sampling arrays (which detect only a small percentage of the particles reaching the ground). Full coverage detectors typically cover an energy range from a few hundred GeV up to tens of TeV, partially overlapping the usual range of the Cherenkov technique. The main topics of interest in this Very High Energy (VHE) range are the study of supernova remnants, the search for Active Galactic Nuclei (AGNs) and the high energy emission from gamma-ray bursts (GRBs).

- Parallel Session: Neutrinos and Cosmic Rays (E. Lisi and L. Patrizii, conveners) | Pp. 333-336

The Liquid Xenon calorimeter of the MEG experiment

Fabrizio Cei

In the Standard Model (SM) of electroweak interactions the Lepton Flavour Violating (LFV) processes are completely forbidden; however, almost all SM extensions predict processes which do not conserve lepton flavour. In particular, the supersymmetric models predict branching ratios for LFV reactions at the level of 10⁻¹¹, which should be experimentally observable. The µ → eγ process is one of the golden channels for the LFV observation. The present limit for the branching ratio BR(µ → eγ)/BR(µ → eνν̄) is 1.2 × 10⁻¹¹, and the aim of the MEG experiment at the Paul Scherrer Institute (PSI) is to improve this limit by a factor ∼ 100. This challenging goal requires refined and innovative detection techniques in order to reach very high resolution in the measurement of photon and positron energy, momentum and timing. One of the key elements is the Liquid Xenon (LXe) e.m. calorimeter, the main subject of this paper.
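As a back-of-the-envelope check (illustrative, not from the abstract itself), the factor-∼100 improvement over the quoted limit corresponds to a target sensitivity of roughly 10⁻¹³:

```python
# Illustrative arithmetic only: MEG sensitivity goal implied by the abstract.
current_limit = 1.2e-11        # present BR(mu -> e gamma) limit
improvement_factor = 100       # improvement targeted by MEG

target_sensitivity = current_limit / improvement_factor
print(f"MEG target sensitivity: ~{target_sensitivity:.1e}")
```

The result, ∼10⁻¹³, is the branching-ratio scale the experiment's resolutions must make accessible.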

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 339-343

The Silicon Vertex Trigger Upgrade at CDF

Alberto Annovi

Motivations, design, performance and upgrade of the CDF Silicon Vertex Trigger are presented. The system provides CDF with a powerful tool for online tracking with offline quality, in order to enhance the reach on B-physics and on large-p_T physics coupled to b quarks.

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 345-348

Monolithic Active Pixel Sensors in a 130 nm Triple Well CMOS Technology

V. Re; C. Andreoli; M. Manghisoni; E. Pozzati; L. Ratti; V. Speziali; G. Traversi; S. Bettarini; G. Calderini; R. Cenci; F. Forti; M. Giorgi; F. Morsani; N. Neri; G. Rizzo

The attention of several groups in the particle physics community has been drawn to monolithic active pixel sensors (MAPS) in CMOS technology as promising candidates for charged-particle tracking at the future high luminosity colliders. Their working principle, based on the diffusion of minority carriers in a lightly doped, thin epitaxial layer, might be exploited in the fabrication of highly granular, lightweight detectors possibly satisfying the resolution constraints set by next generation experiments at the International Linear Collider and at the Super B Factory.

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 349-352

The external scanning proton microprobe in Florence: set-up and examples of applications

Lorenzo Giuntini

During 2003, at the LABEC laboratory (labec@fi.infn.it) in Firenze, Italy, a new 3 MV tandem accelerator was installed, mainly for Accelerator Mass Spectrometry (AMS) and Ion Beam Analysis (IBA) measurements. Our system is equipped with a cesium sputtering ion source for AMS and two sources for IBA applications, i.e. a duoplasmatron and a second cesium sputtering source; the duoplasmatron is usually preferred for microbeam applications because, working with this source, it is possible to deliver a more intense beam on target. The ion beam is focused by two electrostatic lenses, EL1 and EL2, and mass/energy analyzed by a ∼ 90° dipole magnet; the beam waist lies some 0.4 m after the exit port of this magnet. Here a remotely controlled aperture is installed to allow beam intensity regulation, together with a beam profile monitor (BPM1) to control the shape, dimensions and intensity of the beam as it is transmitted through the slits; it is also possible to measure the beam current after the slits with a Faraday cup (FC1). The beam is then focused by the electrostatic lenses EL3 and EL4 just at the entrance of the accelerator, where BPM2 allows monitoring of proper beam shaping and FC2 is used to measure the current. The beam is finally focused in the high voltage terminal, inside the stripping canal; since the accelerating column has a strong focusing action, a pre-acceleration electrode is installed at the entrance of the column, so that, by increasing the speed of the ions, the particles are made less sensitive to the column focusing.

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 353-356

Infrastructure of the ATLAS Event Filter

Andrea Negri

The ATLAS detector is a high energy physics experiment designed to exploit the full physics potential provided by the Large Hadron Collider (LHC), which will provide collisions at a centre-of-mass energy of 14 TeV and a luminosity of 10³⁴ cm⁻² s⁻¹. The corresponding 40 MHz bunch crossing rate, the huge number of detector channels (∼ 10⁸) and a storage data flux of about 300 MB/s outline the challenge of the Trigger and Data Acquisition (TDAQ) system: select every second the most interesting hundred events out of millions. The TDAQ system is organized in three different trigger levels (Fig. 1). The first one (LVL1), realized in hardware by custom electronics, reduces the data rate from the 40 MHz collision rate to about 75 kHz. The High Level Triggers (LVL2 and Event Filter), implemented on two different commodity component farms, provide a further reduction factor of ∼ 10³. The LVL2 operates on the full granularity data inside Regions of Interest (RoIs) identified by LVL1 and reduces the event rate down to ∼ 2 kHz. For events accepted by LVL2, the full event data is assembled by the Event Builder system (EB) and shipped to the EF, which performs the last selection (a rejection by a factor ∼ 10) with a latency of the order of a second.
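The cascade of rates described above can be laid out stage by stage; this is an illustrative sketch, where the ∼200 Hz output rate is an assumption consistent with "the most interesting hundred events" per second:

```python
# Illustrative rate chain for the ATLAS trigger levels quoted in the abstract.
# Stage names and rates follow the text; the final output rate is assumed.
stages = [
    ("bunch crossing", 40e6),   # 40 MHz LHC collision rate
    ("LVL1 accept",    75e3),   # custom hardware, ~75 kHz
    ("LVL2 accept",    2e3),    # Region-of-Interest processing, ~2 kHz
    ("EF accept",      200),    # full-event selection, assumed ~200 Hz
]
for (name_in, r_in), (name_out, r_out) in zip(stages, stages[1:]):
    print(f"{name_in:>15} -> {name_out:<12} rejection ~ {r_in / r_out:,.0f}x")
```

Multiplying the per-stage rejections reproduces the overall reduction from millions of crossings to a few hundred stored events per second.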

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 357-360

The CMS High-Level Trigger

Pietro Govoni

At the LHC, the Large Hadron Collider that will start operating next year at CERN, proton beams will collide with energies of 7 TeV each. The beams will be composed of several bunches, which will collide at the interaction points with a frequency of about 40 MHz. The CMS experiment (Compact Muon Solenoid), which is one of the two multi-purpose experiments that will take data, will have to cope with such a high interaction rate and reduce it to the order of 100 Hz, which is the constraint set by the persistency on tape. This rate reduction by a factor of 𝒪(10⁵) on single events of 2 MB size will be achieved in two separate stages: the level-one trigger (L1), implemented by means of hardware devices, and the high-level trigger (HLT), implemented in software on a farm of commercial PCs.
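A quick sanity check of the CMS numbers above (illustrative arithmetic only, assuming the quoted 40 MHz input, 100 Hz output and 2 MB events):

```python
# Illustrative check of the reduction factor and tape bandwidth implied
# by the CMS trigger figures quoted in the abstract.
input_rate_hz = 40e6     # bunch-crossing frequency
output_rate_hz = 100     # rate allowed by persistency on tape
event_size_mb = 2        # single-event size

reduction = input_rate_hz / output_rate_hz       # overall rejection
storage_mb_per_s = output_rate_hz * event_size_mb
print(f"overall reduction factor: {reduction:.0e}")      # order 10^5
print(f"tape throughput: {storage_mb_per_s:.0f} MB/s")
```

The 4 × 10⁵ rejection is what the abstract denotes as a factor of 𝒪(10⁵), and the resulting ∼200 MB/s is the bandwidth the tape system must sustain.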

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 361-364

WLCG Service Challenges and Tiered architecture in the LHC era

Daniele Bonacorsi; Tiziana Ferrari

The implementation of the computing infrastructure for the LHC experiments requires gradual and accurate verification of the performance of hardware resources (CPU, storage, network), of the middleware services provided by Grid projects and of the LHC experiment-specific software applications. The realization of the Computing Models of the LHC experiments needs high reliability of resources and services, and the demonstration of their functionality, robustness and effective usability in addressing real experiment use-cases. The data management and workload management sectors of the Computing Models of the LHC experiments are tested both in experiment-specific “Data Challenges” and in more general “Service Challenges” of increasing complexity, the latter being driven by the Worldwide LHC Computing Grid (WLCG). The objective is to improve the quality and reliability of the distributed Grid services to address the computing needs of the LHC era.

- Parallel Session: Detectors and New Technologies (A. Cardini, M. Michelotto and V. Rosso, conveners) | Pp. 365-368