The physics program of a high-luminosity asymmetric B Factory at SLAC

1989 ◽  
Author(s):  
A. Eisner ◽  
M. Mandelkern ◽  
R. Morrison ◽  
M. Witherell ◽  
P. Burchat ◽  
...  

2001 ◽  
Vol 16 (supp01a) ◽  
pp. 425-427
Author(s):  
Wendy Taylor

In the spring of 2001, the upgraded Fermilab Tevatron will begin its collider physics run with pp̄ collisions at √s = 2 TeV, where it is expected to deliver an integrated luminosity of 2 fb⁻¹ in the first two years. The DØ detector is undergoing an extensive upgrade in order to take full advantage of the high-luminosity running conditions. The upgraded detector's new silicon vertex detector, fiber tracker, and lepton trigger capabilities make a rich B physics program possible at DØ. This paper describes the prospects for several DØ B physics measurements, including CP violation in B⁰ → J/ψ KS decays, Bs mixing, and the Λb lifetime.
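As a back-of-the-envelope illustration of what 2 fb⁻¹ implies (not a calculation from the paper), the sketch below converts integrated luminosity into an expected event count via N = L·σ. The b-b̄ production cross section used is an assumed order-of-magnitude value for the Tevatron, not a number quoted in the abstract.

```python
# Illustrative sketch: expected b-bbar event yield from integrated
# luminosity, N = L_int * sigma. The cross section is an assumed
# order-of-magnitude value for p-pbar collisions at sqrt(s) ~ 2 TeV.

MB_TO_INV_FB = 1e12          # 1 mb = 1e12 fb

lumi_fb = 2.0                # integrated luminosity from the abstract, fb^-1
sigma_bb_mb = 0.1            # assumed b-bbar cross section, ~0.1 mb (illustrative)

n_bb = lumi_fb * sigma_bb_mb * MB_TO_INV_FB
print(f"Expected b-bbar events before trigger/selection: {n_bb:.2e}")
```

Even with trigger and selection efficiencies folded in, yields of this magnitude are what make the hadron-collider B physics program competitive.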


2018 ◽  
Vol 192 ◽  
pp. 00032 ◽  
Author(s):  
Rosamaria Venditti

The High-Luminosity Large Hadron Collider (HL-LHC) is a major upgrade of the LHC, expected to deliver an integrated luminosity of up to 3000 fb⁻¹ over one decade. The very high instantaneous luminosity will lead to about 200 proton-proton collisions per bunch crossing (pileup) superimposed on each event of interest, creating extremely challenging experimental conditions. The scientific goals of the HL-LHC physics program include precise measurement of the properties of the recently discovered standard model Higgs boson and searches for physics beyond the standard model (heavy vector bosons, SUSY, dark matter, and exotic long-lived signatures, to name a few). In this contribution we present the strategy of the CMS experiment to investigate the feasibility of such searches and quantify the gain in sensitivity in the HL-LHC scenario.
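The ~200-pileup figure can be cross-checked with the standard relation ⟨μ⟩ = L·σ_inel / f_crossing. The sketch below uses assumed but typical HL-LHC parameter values (luminosity scenarios, inelastic cross section, number of colliding bunches); none of these numbers are taken from the contribution itself.

```python
# Rough cross-check of the ~200 pileup figure:
# mean pileup mu = L_inst * sigma_inel / f_crossing.
# All parameter values below are assumed, typical HL-LHC numbers.

SIGMA_INEL_CM2 = 80e-27      # assumed inelastic pp cross section, ~80 mb
N_BUNCHES = 2748             # assumed number of colliding bunch pairs
F_REV_HZ = 11245.0           # LHC revolution frequency

def mean_pileup(lumi_cm2s: float) -> float:
    """Average number of pp interactions per bunch crossing."""
    f_crossing = N_BUNCHES * F_REV_HZ
    return lumi_cm2s * SIGMA_INEL_CM2 / f_crossing

for lumi in (5e34, 7.5e34):  # baseline and ultimate HL-LHC scenarios
    print(f"L = {lumi:.1e} cm^-2 s^-1 -> <mu> = {mean_pileup(lumi):.0f}")
```

With these assumptions the baseline scenario gives ⟨μ⟩ ≈ 130 and the ultimate scenario approaches the 200 interactions per crossing quoted above.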


2006 ◽  
Author(s):  
J. Albert ◽  
S. Bettarini ◽  
M. Biagini ◽  
G. Bonneaud ◽  
Y. Cai ◽  
...  

2019 ◽  
Vol 214 ◽  
pp. 03055
Author(s):  
David Lange ◽  
Kenneth Bloom ◽  
Tommaso Boccali ◽  
Oliver Gutsche ◽  
Eric Vaandering

Numerous extrapolations of the computing resources needed for the high-luminosity program each indicate that substantial changes are required if the desired HL-LHC physics program is to be supported within current computing budgets. The drivers include large increases in event complexity (leading to longer processing times and larger analysis data) and the 5-10-fold increase in trigger rates needed for the HL-LHC program. The CMS experiment has recently undertaken an effort to merge the ideas behind its short-term and long-term resource models in order to make extrapolations to future needs easier and more reliable. Near-term resource estimates depend on numerous parameters: LHC uptime and beam intensities; detector and online trigger performance; software performance; analysis data requirements; data access, management, and retention policies; site characteristics; and network performance. Longer-term modeling is affected by the same characteristics, but with much larger uncertainties that must be considered to identify the most interesting handles for increasing the "physics per computing dollar" of the HL-LHC. In this presentation, we discuss the current status of long-term modeling of the CMS computing resource needs for the HL-LHC, with emphasis on extrapolation techniques, uncertainty quantification, and model results. We illustrate potential ways that high-luminosity CMS could accomplish its desired physics program within today's computing budgets.
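To make the extrapolation-with-uncertainties idea concrete, here is a toy model (not the CMS resource model) that propagates assumed ranges on trigger rate, per-event processing time, and accelerator uptime through a simple CPU-need estimate via Monte Carlo sampling. Every parameter range and the helper function are hypothetical placeholders, not CMS estimates.

```python
# Toy sketch: annual CPU need = event rate * time-per-event, with
# Monte Carlo propagation of parameter uncertainties of the kind
# the abstract lists. All numbers are hypothetical placeholders.

import random

def cpu_need_khs06(trigger_khz, sec_per_event, uptime_frac, seconds_per_year=3.15e7):
    """Annual CPU need expressed as a continuous kHS06 requirement."""
    events = trigger_khz * 1e3 * uptime_frac * seconds_per_year
    return events * sec_per_event / seconds_per_year / 1e3

# Sample assumed parameter ranges (all illustrative).
samples = []
for _ in range(10_000):
    trigger_khz   = random.uniform(5.0, 7.5)    # trigger output rate, kHz
    sec_per_event = random.uniform(50, 150)     # reco time, HS06-s/event
    uptime_frac   = random.uniform(0.3, 0.5)    # fraction of live data-taking
    samples.append(cpu_need_khs06(trigger_khz, sec_per_event, uptime_frac))

samples.sort()
lo, med, hi = (samples[int(q * len(samples))] for q in (0.16, 0.50, 0.84))
print(f"CPU need: {med:.0f} kHS06 (68% interval {lo:.0f}-{hi:.0f})")
```

The point of such a model is less the central value than the width of the interval: it shows which parameters dominate the uncertainty and are therefore the most interesting handles for improving physics per computing dollar.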

