Reuse of excavated molasse based on geological subsurface modelling for the planned 100 km particle accelerator tunnel at CERN near Geneva, Switzerland

2020 ◽  
Vol 165 (12) ◽  
pp. 631-638
Author(s):  
Maximilian Haas

Abstract: CERN (Conseil Européen pour la Recherche Nucléaire, also known as the European Laboratory for Particle Physics) is a world-leading international research institution in the field of high-energy and particle physics. Research into the fundamental building blocks of the universe and their interactions has produced groundbreaking insights over recent decades, culminating in the experimental discovery of the Higgs boson in July 2012. To deepen these findings and address still-unanswered questions about the origin and workings of the universe, an international community of more than 150 institutes worldwide has initiated a study at CERN for a research programme based on a new, more powerful particle accelerator infrastructure. The Future Circular Collider (FCC) study covers the required underground tunnels, caverns and shafts as well as the associated surface structures. The infrastructure is designed to operate in conjunction with the existing accelerators at CERN (e.g., PSB, PS, SPS, LHC). Since 2014, the project has carried out first technical feasibility studies in a wide range of areas, among them the geology and construction of the tunnel, which extends over roughly 100 km through the Molasse Basin, partly in western Switzerland and partly in France; according to the current planning status, the FCC could begin operation around the year 2040. In this context, a geological subsurface model is essential to ensure the safe construction of underground infrastructure and to match the construction method to the geology. Alongside the geological model, a decisive factor is the reusability of the excavated molasse material, with a volume of about 9 million m3, from a technical as well as a legal, socio-political and socio-economic perspective. This article aims to give an insight into these two feasibility studies of the FCC project and to present approaches to the geological, petrophysical, geotechnical and mineralogical-chemical analyses that serve to answer the reuse question and will subsequently feed into the geological subsurface model.
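To give a sense of scale, the 9 million m3 of excavated material quoted above can be converted into tonnage. This is a back-of-envelope sketch only: the bulk density of ~2.4 t/m3 is an assumed typical value for molasse sandstone/marl, not a figure from the study.

```python
# Rough tonnage estimate for the excavated molasse.
# Volume is from the abstract; density is an assumption.

EXCAVATED_VOLUME_M3 = 9_000_000      # from the FCC feasibility study
ASSUMED_DENSITY_T_PER_M3 = 2.4       # assumed typical molasse bulk density

def excavated_mass_tonnes(volume_m3: float, density_t_per_m3: float) -> float:
    """Mass in tonnes = volume [m^3] x bulk density [t/m^3]."""
    return volume_m3 * density_t_per_m3

mass = excavated_mass_tonnes(EXCAVATED_VOLUME_M3, ASSUMED_DENSITY_T_PER_M3)
print(f"~{mass / 1e6:.1f} million tonnes of excavated material")
```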

Author(s):  
Alexandros Ioannidis-Pantopikos ◽  
Donat Agosti

In the landscape of general-purpose repositories, Zenodo was built at the European Laboratory for Particle Physics' (CERN) data center to facilitate the sharing and preservation of the long tail of research across all disciplines and scientific domains. Despite Zenodo's long tradition of making research artifacts FAIR (Findable, Accessible, Interoperable, and Reusable), challenges remain in applying these principles effectively when serving the needs of specific research domains. Plazi's biodiversity taxonomic literature processing pipeline liberates data from publications and makes it FAIR via extensive metadata, the minting of a DataCite Digital Object Identifier (DOI), a licence, and both human- and machine-readable output provided by Zenodo, accessible via the Biodiversity Literature Repository community at Zenodo. The deposits (e.g., taxonomic treatments, figures) are an example of how local networks of information can be formally linked to explicit resources in the broader context of other platforms such as GBIF (Global Biodiversity Information Facility). In the context of biodiversity taxonomic literature data workflows, a general-purpose repository's traditional submission approach is not enough to preserve rich metadata and to capture highly interlinked objects, such as taxonomic treatments and digital specimens. As a prerequisite to serving these use cases and ensuring that the artifacts remain FAIR, Zenodo introduced the concept of custom metadata, which allows submissions such as figures or taxonomic treatments (see, for example, the treatment of Eurygyrus peloponnesius) to be enhanced with custom keywords based on terms from common biodiversity vocabularies like Darwin Core and Audubon Core, with an explicit link to the respective vocabulary term.
The aforementioned pipelines and features are designed to be served first and foremost through public Representational State Transfer Application Programming Interfaces (REST APIs) and open web technologies like webhooks. This approach allows researchers and platforms to integrate existing and new automated workflows with Zenodo and thus empowers research communities to create self-sustained cross-platform ecosystems. The BiCIKL project (Biodiversity Community Integrated Knowledge Library) exemplifies how repositories and tools can become building blocks for broader adoption of the FAIR principles. Starting with the literature processing pipeline above, the concepts behind the resulting FAIR data will be explained, with a focus on the custom metadata used to enhance the deposits.
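The REST-based submission workflow described above can be sketched as follows. This is a minimal illustration of preparing a deposition payload that carries biodiversity custom metadata; the `custom` field shape, the `taxonomictreatment` publication subtype, and the `biosyslit` community identifier are assumptions to be checked against the live Zenodo API documentation, and the authentication token and actual upload step are omitted.

```python
# Sketch: building a Zenodo deposition payload with Darwin Core
# custom keywords. Field names marked "assumed" are illustrative.

import json

deposition = {
    "metadata": {
        "title": "Taxonomic treatment of Eurygyrus peloponnesius",
        "upload_type": "publication",
        "publication_type": "taxonomictreatment",       # assumed subtype name
        "communities": [{"identifier": "biosyslit"}],   # assumed BLR community id
        # Custom keywords drawn from the Darwin Core vocabulary:
        "custom": {
            "dwc:scientificName": ["Eurygyrus peloponnesius"],
            "dwc:kingdom": ["Animalia"],
        },
    }
}

# A real submission would POST this JSON to the depositions endpoint,
# e.g. https://zenodo.org/api/deposit/depositions, with an access token.
payload = json.dumps(deposition)
print(payload[:60], "...")
```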


2019 ◽  
Vol 206 ◽  
pp. 08001
Author(s):  
Tadeusz Lesiak

A future giant electron-positron collider, operating at the energy frontier, is a natural proposal for pushing particle physics into a new regime of precise measurements, in particular in the sectors of electroweak observables and Higgs boson parameters. Four such accelerator projects, two linear (ILC and CLIC) and two circular (FCC and CEPC), are currently at various stages of development. In view of the update of the European strategy for particle physics and of important decisions expected from Japan, China and the USA, the next few years will be critical for decisions about the construction of such colliders. The paper concisely reviews the relevant aspects and challenges of the proposed accelerators and detectors, along with the presumed schedules of construction and operation. The motivation and the very attractive physics programme of new e+e− colliders, spanning in particular the Higgs, electroweak and neutrino sectors, together with prospects for searches for New Physics, are discussed as well.


Symmetry ◽  
2020 ◽  
Vol 12 (5) ◽  
pp. 700
Author(s):  
Marina Prvan ◽  
Arijana Burazin Mišura ◽  
Zoltan Gecse ◽  
Julije Ožegović

This paper deals with the problem of packing polyhex clusters in a regular hexagonal container. It is a common problem in many applications with various cluster shapes used, but the symmetric polyhex is the most useful in engineering due to its geometrical properties. Hence, we concentrate on mathematical modeling in one such application, where the "bee" tetrahex is chosen for the new Compact Muon Solenoid (CMS) design upgrade, CMS being one of the four detectors used in the Large Hadron Collider (LHC) experiment at the European Laboratory for Particle Physics (CERN). We start from the existing hexagonal containers with hexagonal cells packed inside and uniform clustering applied. We compare the center-aligned (CA) and vertex-aligned (VA) models, analyzing the cluster rotations that increase packing efficiency. We formally describe the geometrical properties of the clustering approaches and show that, with uniform clustering, cluster sharing is inevitable at the container border. In addition, we propose a new vertex-aligned model that decreases the number of shared clusters in the uniform scenario, but with a smaller number of clusters contained inside the container. We also describe a non-uniform tetrahex cluster packing scheme for the proposed container model. With the proposed cluster packing solution, all clusters are contained inside the container region. Since cluster sharing is completely avoided at the container border, the maximal packing efficiency is obtained compared to the existing models.
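The counting argument behind border sharing can be illustrated numerically. A regular hexagonal container of side n holds the centered hexagonal number of unit cells, H(n) = 3n(n−1) + 1, which is always odd and hence never divisible by 4, so a uniform tiling by 4-cell tetrahex clusters can never cover the container exactly. This is a schematic sketch of that observation, not the paper's CA/VA models.

```python
# Cells in a hexagonal grid of side n: the centered hexagonal number.
# Since 3n(n-1) is always even, H(n) is always odd, so 4-cell ("bee")
# tetrahex clusters cannot tile the container without sharing cells
# across the border or leaving cells uncovered.

def hex_cells(n: int) -> int:
    """Centered hexagonal number H(n) = 3n(n-1) + 1."""
    return 3 * n * (n - 1) + 1

for n in (3, 5, 10):
    cells = hex_cells(n)
    full_clusters, leftover = divmod(cells, 4)
    print(f"side {n}: {cells} cells -> {full_clusters} full tetrahex "
          f"clusters, {leftover} cells left over")
```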


2014 ◽  
Vol 03 (02) ◽  
pp. 23-24
Author(s):  

A team of physicists from Hong Kong has now formally joined one of the most prestigious physics experiments in the world. Following a unanimous vote of approval today by its Collaboration Board, ATLAS has admitted the Hong Kong team as a member. The ATLAS Collaboration operates one of the largest particle detectors in the world, located at the Large Hadron Collider (LHC), the world's highest-energy particle accelerator, at CERN, Switzerland. In 2012, the ATLAS team, along with the CMS Collaboration, co-discovered the Higgs boson, the so-called 'God Particle'. The gigantic but sensitive and precise ATLAS detector, together with the unprecedentedly high collision energy and luminosity of the LHC, makes it possible to search for fundamentally new physics, such as dark matter, hidden extra dimensions, and supersymmetry, a proposed symmetry among elementary particles. The LHC is currently undergoing an upgrade, targeting a substantial increase in beam energy and intensity in a year's time. It is widely expected that the discovery of the Higgs boson is only the beginning of an era of new breakthroughs in fundamental physics. All these exciting opportunities are now open to scientists and students from Hong Kong.


2020 ◽  
Vol 18 ◽  
pp. 110-142
Author(s):  
Abdeljalil Habjia

In the context of particle physics, within the ATLAS and CMS experiments at the Large Hadron Collider (LHC), this work discusses the discovery of a particle compatible with the Higgs boson through the combination of several decay channels, with a mass of about 125.5 GeV. With increased statistics, i.e., the full data set collected by the ATLAS and CMS experiments at the LHC (√s = 7 TeV and √s = 8 TeV), the particle is also discovered individually in the channel h → γγ, with observed significances of 5.2σ and 4.7σ, respectively. The analysis is dedicated to the measurement of the mass m_h and of the signal strength μ, defined as the ratio of σ(pp → h) × Br(h → X) to its Standard Model prediction, where X = WW*, ZZ*, γγ, gg, ff. The combined results in the h → γγ channel gave the measurements m_h = 125.36 ± 0.37 GeV and μ = 1.17 ± 0.3, and a constraint on the width Γ(h) of the Higgs decay of 4.07 MeV at 95% CL. The spin study rejects the spin-2 hypothesis at 99% CL. Odd parity (the spin-parity 0− state) is excluded at more than 98% CL. Within the theoretical and experimental uncertainties accessible at the time of the analysis, all results (channels showing the excess with respect to the background-only hypothesis, measured mass and signal strength, couplings, quantum numbers (JPC), production modes, total and differential cross sections) are compatible with the Standard Model Higgs boson at 95% CL. Although the Standard Model is one of the theories that have enjoyed the greatest number of successes to date, it is imperfect. The inability of this model to describe certain phenomena suggests that it is only an approximation of a more general theory. Models beyond the Standard Model, such as 2HDM, MSSM or NMSSM, can compensate for some of its limitations and postulate the existence of additional Higgs bosons.
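The signal-strength definition quoted above reduces, in a simple counting experiment, to the background-subtracted event yield divided by the Standard Model expectation for the signal. The sketch below illustrates only that arithmetic; all event counts are invented for illustration and are not the experiments' numbers.

```python
# Signal strength mu = sigma(pp -> h) x Br(h -> X), normalized to the
# Standard Model prediction. In a counting experiment this becomes
# (observed - background) / SM-expected signal yield.

def signal_strength(n_obs: float, n_bkg: float, n_sig_sm: float) -> float:
    """mu = (observed events - expected background) / SM-expected signal."""
    return (n_obs - n_bkg) / n_sig_sm

# Hypothetical diphoton counts in a mass window around 125 GeV:
mu = signal_strength(n_obs=470.0, n_bkg=353.0, n_sig_sm=100.0)
print(f"mu = {mu:.2f}")
```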


Author(s):  
Jean Zinn-Justin

Chapter 12 describes the main steps in the construction of the electroweak component of the Standard Model of particle physics. The classical Abelian Landau–Ginzburg–Higgs mechanism is recalled, first introduced in the macroscopic description of a superconductor in a magnetic field. It is based on a combination of spontaneous symmetry breaking and gauge invariance. It can be generalized to non-Abelian gauge theories, quantized and renormalized. The recent discovery of the predicted Higgs boson provided the latest confirmation of the model's validity. Some aspects of the Higgs model and its renormalization group (RG) properties are illustrated by simplified models: a self-interacting Higgs model with the triviality issue, and the Gross–Neveu–Yukawa model with discrete chiral symmetry, which illustrates spontaneous fermion mass generation and possible RG flows.
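The Abelian Landau–Ginzburg–Higgs mechanism recalled above can be summarized by its standard textbook Lagrangian (generic notation, not necessarily the chapter's):

```latex
\mathcal{L} \;=\; -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}
\;+\; |D_\mu \varphi|^2
\;-\; \lambda \left( \varphi^{*}\varphi - \tfrac{v^2}{2} \right)^{2},
\qquad D_\mu \varphi = (\partial_\mu - i e A_\mu)\,\varphi .
```

When the complex scalar acquires the vacuum expectation value ⟨φ⟩ = v/√2, the gauge field absorbs the phase of φ and becomes massive with m_A = e v, while the remaining real scalar (the Higgs mode) has mass m_H = v√(2λ); this interplay of spontaneous symmetry breaking and gauge invariance is exactly the combination the chapter describes.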


2006 ◽  
Vol 37 (1) ◽  
pp. 67-80 ◽  
Author(s):  
Pierre Bonnal ◽  
Jurgen De Jonghe ◽  
John Ferguson

The Large Hadron Collider (LHC) is under construction at CERN, the European Laboratory for Particle Physics, near Geneva, Switzerland. In 2003, a new earned value management (EVM) system was introduced to improve transparency in LHC project reporting, to allow a clearer distinction between cost deviations from the baseline due to overruns and those due to resulting delays, and to provide the project management team with a more reactive project management information system for better decision-making. EVM has become a de facto standard for the follow-up of cost and schedule, and several commercial packages are offered for implementing an EVM system. But because none of these packages fulfilled CERN's requirements, its executive management decided to proceed with an in-house development. In this paper, an overview is provided of what CERN considers to be good requirements for an EVM system suited to large-scale projects, with emphasis on its deliverable-oriented, collaborative and lean management dimensions. In conclusion, we discuss some of our positive and negative experiences, so that those who would like to develop or implement similar enterprise-wide project control systems can be more aware of common pitfalls.
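The distinction the paper draws between cost overruns and delay-induced cost differences falls out of the standard EVM quantities. The sketch below uses the textbook definitions (planned value, earned value, actual cost, and the derived variances and indices), not CERN's in-house data model; the work-package figures are hypothetical.

```python
# Standard earned value management (EVM) metrics:
#   PV: budgeted cost of work scheduled, EV: budgeted cost of work
#   performed, AC: actual cost of work performed.
#   CV = EV - AC isolates cost overruns; SV = EV - PV isolates
#   schedule slippage, which is how the two effects are separated.

from dataclasses import dataclass

@dataclass
class EvmSnapshot:
    pv: float  # planned value
    ev: float  # earned value
    ac: float  # actual cost

    @property
    def cost_variance(self) -> float:
        return self.ev - self.ac          # negative -> over budget

    @property
    def schedule_variance(self) -> float:
        return self.ev - self.pv          # negative -> behind schedule

    @property
    def cpi(self) -> float:
        return self.ev / self.ac          # cost performance index

    @property
    def spi(self) -> float:
        return self.ev / self.pv          # schedule performance index

# Hypothetical work package, figures in MCHF:
wp = EvmSnapshot(pv=120.0, ev=100.0, ac=110.0)
print(f"CV={wp.cost_variance:+.1f}  SV={wp.schedule_variance:+.1f}  "
      f"CPI={wp.cpi:.2f}  SPI={wp.spi:.2f}")
```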


Corpora ◽  
2018 ◽  
Vol 13 (2) ◽  
pp. 169-203 ◽  
Author(s):  
Ersilia Incelli

This study explores the scientific popularisation process and how science knowledge is recontextualised and rewritten in the transfer from one context or genre to another. It does this through a case study of the discovery of the Higgs boson, a new physics particle commonly known as the God Particle, focussing on the meta-discursive strategies that emerged from the texts after corpus-assisted analysis. Extensive use was made of exemplification and generalisation through analogies and metaphors, and through ideational content representing epistemic uncertainty in the newspaper discourse. Prominence is given to science popularisation in the British press, because online newspapers reach a wide non-expert public; the aim is to offer an analysis of how the discursive perspective of complex science news (particle physics) is conveyed to the general public, and to allow a systematic investigation into 'how' a scientific event is constructed and made newsworthy. Two further corpora, consisting of texts from the scientific journal Physics Letters B and from online media blogs, were also compiled for contrastive purposes. In this way, prominent lexico-semantic textual properties are identified in the main corpus (containing newspapers) through standard corpus linguistic techniques, in particular through key semantic domain annotation, leading to more insight into how complex science is linguistically constructed and conveyed to a lay audience.
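The "standard corpus linguistic techniques" mentioned above typically rest on a keyness statistic such as the Dunning log-likelihood, which compares a word's (or semantic tag's) frequency in a study corpus against a reference corpus. The sketch below implements that standard formula; the frequencies are invented and are not from the study's corpora.

```python
# Dunning log-likelihood keyness: 2 * sum(O * ln(O/E)) over the two
# corpora, where E is the frequency expected if the word were equally
# distributed across both. High LL -> the word is "key" in corpus A.

import math

def log_likelihood(freq_a: int, size_a: int, freq_b: int, size_b: int) -> float:
    """Keyness of a word with freq_a in corpus A (size_a tokens)
    versus freq_b in corpus B (size_b tokens)."""
    total = freq_a + freq_b
    expected_a = size_a * total / (size_a + size_b)
    expected_b = size_b * total / (size_a + size_b)
    ll = 0.0
    for observed, expected in ((freq_a, expected_a), (freq_b, expected_b)):
        if observed > 0:
            ll += observed * math.log(observed / expected)
    return 2 * ll

# "boson" in a hypothetical newspaper corpus vs. a reference corpus:
print(f"LL = {log_likelihood(150, 1_000_000, 20, 2_000_000):.1f}")
```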

