L0TP+: the Upgrade of the NA62 Level-0 Trigger Processor

2020 ◽  
Vol 245 ◽  
pp. 01017
Author(s):  
Roberto Ammendola ◽  
Andrea Biagioni ◽  
Andrea Ciardiello ◽  
Paolo Cretaro ◽  
Ottorino Frezza ◽  
...  

The L0TP+ initiative is aimed at the upgrade of the FPGA-based Level-0 Trigger Processor (L0TP) of the NA62 experiment at CERN for the post-LS2 data taking, which is expected to happen at 100% of design beam intensity, corresponding to about 3.3 × 10¹² protons per pulse on the beryllium target used to produce the kaon beam. Although tests performed at the end of 2018 showed that the L0TP system is substantially robust even at full beam intensity, several reasons motivate such an upgrade: i) avoiding FPGA platform obsolescence, ii) making room for improvements in the firmware design by leveraging a more capable FPGA device, iii) adding new functionalities, and iv) supporting the 4× beam intensity increase foreseen in future experiment upgrades. We singled out the Xilinx Virtex UltraScale+ VCU118 development board as the ideal platform for the project. Seamless integration of L0TP+ into the current NA62 TDAQ system and exact matching of L0TP functionalities represent the main requirements and focus of the project; nevertheless, the final design will include additional features, such as a PCIe RDMA engine to enable processing on CPU and GPU accelerators, and the partial reconfiguration of trigger firmware starting from a high-level language description (C/C++). The latter capability is enabled by modern High Level Synthesis (HLS) tools, but to what extent this methodology can be applied to perform complex tasks in the L0 trigger, with its stringent latency requirements and the limits imposed by single-FPGA resources, is currently being investigated. As a test case for this scenario we considered the online reconstruction of the RICH detector rings on an HLS-generated module, using a dedicated primitives data stream with PM hit IDs. In addition, the chosen platform supports the wide I/O capabilities of the Virtex UltraScale+ FPGA, allowing for straightforward integration of primitive streams from additional sub-detectors in order to improve the performance of the trigger.
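At its core, the RICH ring reconstruction mentioned as the HLS test case is a circle fit to photomultiplier hit positions. Purely as an illustration of the underlying algorithm (the actual module is HLS-generated C/C++ firmware; the function and variable names here are invented), a minimal least-squares (Kasa) circle fit might look like:

```python
import math

def fit_ring(hits):
    """Least-squares (Kasa) circle fit: solve the linear problem for
    x^2 + y^2 + D*x + E*y + F = 0, then recover centre and radius."""
    # Normal equations (A^T A) p = A^T b, with rows [x, y, 1], b = -(x^2 + y^2)
    ata = [[0.0] * 3 for _ in range(3)]
    atb = [0.0] * 3
    for x, y in hits:
        row = (x, y, 1.0)
        rhs = -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                ata[i][j] += row[i] * row[j]
            atb[i] += row[i] * rhs
    # Gauss-Jordan elimination with partial pivoting on the 3x3 system
    m = [ata[i] + [atb[i]] for i in range(3)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(3):
            if r != c:
                f = m[r][c] / m[c][c]
                m[r] = [a - f * b for a, b in zip(m[r], m[c])]
    d, e, f = (m[i][3] / m[i][i] for i in range(3))
    cx, cy = -d / 2.0, -e / 2.0
    return cx, cy, math.sqrt(cx * cx + cy * cy - f)
```

The fit is linear in the parameters D, E, F, which is one reason algebraic circle fits are attractive for low-latency hardware, compared with iterative geometric fits.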

2014 ◽  
Vol 7 (2) ◽  
pp. 197-211
Author(s):  
James Crossley

Using the 400th anniversary of the King James Bible as a test case, this article illustrates some of the important ways in which the Bible is understood and consumed and how it has continued to survive in an age of neoliberalism and postmodernity. It is clear that instant recognition of the Bible-as-artefact, multiple repackaging and pithy biblical phrases, combined with a popular nationalism, provide distinctive strands of this understanding and survival. It is also clear that the KJV is seen as a key part of a proud English cultural heritage and tied in with traditions of democracy and tolerance, despite having next to nothing to do with either. Anything potentially problematic for Western liberal discourse (e.g. calling outsiders “dogs,” smashing babies’ heads against rocks, Hades-fire for the rich, killing heretics, using the Bible to convert and colonize, etc.) is effectively removed, or even encouraged to be removed, from such discussions of the KJV and the Bible in the public arena. In other words, this is a decaffeinated Bible that has been colonized by, and has adapted to, Western liberal capitalism.


Author(s):  
LE THANH HA ◽  
HOANG PHUONG DUNG ◽  
PHAM HONG CHUONG ◽  
TO TRUNG THANH

This paper investigates the effects of global economic sanctions (GESs) on global bank linkages (GBLs) using 4,032 pairs of 66 countries during the 2001–2013 period. We use the structural gravity model combined with the rich Global Sanctions Data Base introduced by Felbermayr et al. [(2020). The global sanctions data base. European Economic Review, 129, 1–23]. Our empirical results show a negative association between GESs and GBLs. The differential effects of GESs on GBLs are conditional on the sanction type. Furthermore, the consequences of global sanctions become more severe for countries featuring higher information asymmetries, captured either by a high level of world uncertainty, the occurrence of crises and shocks, or by a weak institutional system. Our results are robust and reliable when we use an alternative measure of bank connections and when controlling for the potential endogeneity of global sanctions.
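The mechanics of a gravity specification can be illustrated with a toy log-linear version: bilateral bank linkages regressed on (log) country sizes, distance, and a sanction dummy. This is only a sketch with invented variable names — the paper's actual estimation uses the GSDB and, as is standard in this literature, richer controls — but it shows the structure:

```python
import math

def ols(X, y):
    """Solve the normal equations (X'X) b = X'y by Gauss-Jordan elimination."""
    k = len(X[0])
    m = [[sum(r[i] * r[j] for r in X) for j in range(k)] +
         [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(k):
            if r != c:
                f = m[r][c] / m[c][c]
                m[r] = [a - f * b for a, b in zip(m[r], m[c])]
    return [m[i][k] / m[i][i] for i in range(k)]

def gravity_design(rows):
    """rows: tuples (gdp_i, gdp_j, distance, sanction_dummy, bank_linkage).
    Builds a log-linear gravity design matrix and log outcome vector."""
    X = [[1.0, math.log(gi), math.log(gj), math.log(d), s]
         for gi, gj, d, s, _ in rows]
    y = [math.log(link) for *_, link in rows]
    return X, y
```

In gravity settings, PPML estimation is often preferred over log-linear OLS because it accommodates zero flows and heteroskedasticity; the sketch above keeps the simplest form for clarity.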


2018 ◽  
Vol 4 (1) ◽  
Author(s):  
Versanudin Hekmatyar ◽  
Fentiny Nugroho

Abstract: The objective of this study is to describe the pattern of land tenure and the forms of livelihood diversification in a rural area. Using a qualitative approach, data were collected and presented descriptively. The results are as follows. First, land is an important production factor, alongside capital and labor. Land in Kedungprimpen village is still closely linked to the livelihoods of its inhabitants. The high level of dependence of the population on agricultural land is also closely related to the local community's view that underlies the social differentiation into rich, ample, and poor. Second, this fact further encourages households, in dealing with crises, to undertake a series of livelihood activities to meet their basic needs. The selection of diversified forms of livelihood is mainly based on rational reasons related to the types of resources that can be optimized. Generally, livelihood diversification in Kedungprimpen village spans the agricultural and non-agricultural sectors. The agricultural sector includes land cultivation, sharecropping, rent, mortgage, and the labor system. The non-agricultural sector includes trade, handicraft production, stockbreeding, and carpentry.
Keywords: pattern of land tenure, land tenure, land diversification, peasant


2020 ◽  
Vol 26 (2) ◽  
Author(s):  
B.D. Varpe

Phylloplane and endophytic fungi are considered among the rich sources of novel biologically active compounds with high structural variation on the leaf surface. The plant leaf surface is a diverse terrestrial ecosystem that includes filamentous fungi. This study aims at the isolation and enumeration of the phylloplane and endophytic fungal diversity of Sapindus mukorossi. Fourteen fungal species from 9 genera of phylloplane and endophytic fungi were isolated from Sapindus mukorossi. Cladosporium herbarum, Penicillium expansum, Fusarium oxysporum, Fusarium sp., Alternaria alternata, Colletotrichum orbiculare, Torula herbarum, Epicoccum nigrum and Candida sp. were identified as phylloplane fungi. Aspergillus niger, A. flavus, Epicoccum nigrum, Penicillium digitatum and Penicillium sp. were identified as endophytic fungi.


Author(s):  
JUAN CARLOS ESTEVA ◽  
ROBERT G. REYNOLDS

The goal of the Partial Metrics Project is the automatic acquisition of planning knowledge from target code modules in a program library. In the current prototype the system is given a target code module written in Ada as input, and the result is a sequence of generalized transformations that can be used to design a class of related modules. This is accomplished by embedding techniques from Artificial Intelligence into the traditional structure of a compiler. The compiler performs compilation in reverse, starting with detailed code and producing an abstract description of it. The principal task facing the compiler is to find a decomposition of the target code into a collection of syntactic components that are nearly decomposable. Here, nearly decomposable corresponds to the need for each code segment to be nearly independent syntactically from the others. The most independent segments are then the target of the code generalization process. This process can be described as a form of chunking and is implemented here in terms of explanation-based learning. The problem of producing nearly decomposable code components becomes difficult when the target code module is not well structured. The task facing users of the system is to identify, from a library of modules, well-structured code modules that are suitable for input to the system. In this paper we describe the use of inductive learning techniques, namely variations on Quinlan's ID3 system, that are capable of producing a decision tree that can be used to conceptually distinguish between well- and poorly-structured code. In order to accomplish that task, a set of high-level concepts used by software engineers to characterize structurally understandable code was identified. Next, each of these concepts was operationalized in terms of code complexity metrics that can be easily calculated during the compilation process.
These metrics are related to various aspects of the program structure, including its coupling, cohesion, data structure, control structure, and documentation. Each candidate module was then described in terms of a collection of such metrics. Using a training set of positive and negative examples of well-structured modules, each described in terms of the selected metrics, a decision tree was produced and used to recognize other well-structured modules in terms of their metric properties. This approach was applied to modules from existing software libraries in a variety of domains such as database, editor, graphics, window, data processing, FFT and computer vision software. The results achieved by the system were then benchmarked against the performance of experienced programmers in terms of recognizing well-structured code. In a test case involving 120 modules, the system was able to discriminate between poorly and well-structured code 99% of the time, as compared to an 80% average for the 52 programmers sampled. The results suggest that such an inductive system can serve as a practical mechanism for effectively identifying reusable code modules in terms of their structural properties.
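The ID3-style classification described above can be sketched in a few lines. The boolean metric features below (coupling, cohesion, documentation flags) are hypothetical stand-ins for the paper's operationalized complexity metrics, not its actual feature set:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def id3(examples, attrs):
    """ID3: examples are (feature_dict, label) pairs; returns either a label
    or a node (attribute, {value: subtree})."""
    labels = [lab for _, lab in examples]
    if len(set(labels)) == 1:
        return labels[0]
    if not attrs:
        return Counter(labels).most_common(1)[0][0]
    def gain(a):  # information gain of splitting on attribute a
        rem = 0.0
        for v in set(f[a] for f, _ in examples):
            sub = [lab for f, lab in examples if f[a] == v]
            rem += len(sub) / len(examples) * entropy(sub)
        return entropy(labels) - rem
    best = max(attrs, key=gain)
    branches = {}
    for v in set(f[best] for f, _ in examples):
        sub = [(f, lab) for f, lab in examples if f[best] == v]
        branches[v] = id3(sub, [a for a in attrs if a != best])
    return (best, branches)

def classify(tree, feats):
    """Walk the tree until a leaf label is reached."""
    while isinstance(tree, tuple):
        attr, branches = tree
        tree = branches[feats[attr]]
    return tree
```

At each node ID3 picks the attribute with the highest information gain (largest reduction in label entropy), mirroring how the paper's decision tree separates well-structured from poorly structured modules on metric values.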


2017 ◽  
Vol 12 (12) ◽  
pp. P12017-P12017 ◽  
Author(s):  
D. Aisa ◽  
G. Anzivino ◽  
M. Barbanera ◽  
M. Bizzarri ◽  
A. Bizzeti ◽  
...  

Author(s):  
Grant W. Koroll ◽  
Dennis M. Bilinsky ◽  
Randall S. Swartz ◽  
Jeff W. Harding ◽  
Michael J. Rhodes ◽  
...  

Whiteshell Laboratories (WL) is a Nuclear Research and Test Establishment near Winnipeg, Canada, operated by AECL since the early 1960s and now under decommissioning. WL occupies approximately 4400 hectares of land and employed more than 1000 staff up to the late 1990s, when the closure decision was made. Nuclear facilities at WL included a research reactor, hot cell facilities and radiochemical laboratories. Programs carried out at the WL site included high-level nuclear fuel waste management research, reactor safety research, nuclear materials research, accelerator technology, biophysics, and industrial radiation applications. In preparation for decommissioning, a comprehensive environmental assessment was successfully completed [1] and the Canadian Nuclear Safety Commission issued a six-year decommissioning licence for WL starting in 2003 — the first decommissioning licence issued for a Nuclear Research and Test Establishment in Canada. This paper describes the progress in this first six-year licence period. A significant development in 2006 was the establishment of the Nuclear Legacy Liabilities Program (NLLP) by the Government of Canada, to safely and cost-effectively reduce, and eventually eliminate, the nuclear legacy liabilities and associated risks, using sound waste management and environmental principles. The NLLP endorsed an accelerated approach to WL decommissioning, which meant advancing the full decommissioning of buildings and facilities that had originally been planned to be decontaminated and prepared for storage-with-surveillance. The NLLP also endorsed the construction of enabling facilities — facilities that employ modern waste handling and storage technology on a scale needed for full decommissioning of the large radiochemical laboratories and other nuclear facilities. The decommissioning work and the design and construction of enabling facilities are fully underway.
Several redundant non-nuclear buildings have been removed, and redundant nuclear facilities are being decontaminated and prepared for demolition. Along with decommissioning of redundant structures, site utilities are being decommissioned and reconfigured to reduce site operating costs. New waste handling and waste clearance facilities have been commissioned, and a large shielded modular above-ground storage (SMAGS) structure is in final design in preparation for construction in 2010. The eventual goal is full decommissioning of all facilities and infrastructure and removal of stored wastes from the site.


Author(s):  
Lihui Wang ◽  
Weiming Shen ◽  
Xiaoqian Li ◽  
Sherman Lang

The objective of this research is to develop a methodology and framework for distributed shop floor planning, real-time monitoring, and remote device control supported by intelligent sensors. An intelligent sensor serves runtime data from the bottom up to facilitate high-level decision-making. It ensures that correct decisions are made in a timely manner, compared with the best estimations of engineers. As an adaptive system, the framework improves the flexibility and dynamism of shop floor operations and provides seamless integration among process planning, resource scheduling, job execution, process monitoring, and device control. This paper presents the principles of the methodology, details of the architecture design, module interactions, and information flow, and a proof-of-concept prototype implementation.


2020 ◽  
Vol 10 (4) ◽  
pp. 1377 ◽  
Author(s):  
Mattia Previtali ◽  
Raffaella Brumana ◽  
Chiara Stanga ◽  
Fabrizio Banfi

In recent years, many efforts have been invested in cultural heritage digitization: surveying, modelling, diagnostic analysis and historic data collection. Nowadays, this effort is in many cases finalized towards historical building information modelling (HBIM). However, the architecture, engineering, construction and facility management (AEC-FM) domain is very fragmented, and many experts operating with different data types and models are involved in HBIM projects. This prevents effective communication and sharing of results not only among different professionals but also among different projects. Semantic web tools may significantly contribute to facilitating the sharing, connection and integration of data provided in different domains and projects. The paper describes this aspect, specifically focusing on managing the information and models acquired in the case of vaulted systems. Information is collected within a semantic-based hub platform to perform cross-correlation. This functionality allows the reconstruction of the rich history of construction techniques and skilled workers across Europe. To this purpose an ontology-based vaults database has been developed and an example of its implementation is presented. This database makes use of a set of ontologies to effectively combine data and information from multiple heterogeneous sources. The defined ontologies provide a high-level schema of a data source and a vocabulary for user queries.
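In practice such a database is built on semantic-web standards (RDF triples queried with SPARQL). As a deliberately simplified sketch with invented vault and project identifiers, the cross-correlation idea reduces to pattern matching and joining over triples:

```python
def match(triples, pattern):
    """Return one variable binding per triple matching the pattern.
    Pattern terms starting with '?' are variables; others must match exactly.
    (A toy matcher: repeated variables within one pattern are not unified.)"""
    out = []
    for t in triples:
        if all(p.startswith("?") or p == v for p, v in zip(pattern, t)):
            out.append({p: v for p, v in zip(pattern, t) if p.startswith("?")})
    return out

# Hypothetical data: vaults, their construction techniques, and projects.
triples = [
    ("vault:A", "hasTechnique", "tech:lunette"),
    ("vault:A", "partOfProject", "proj:Milan"),
    ("vault:B", "hasTechnique", "tech:lunette"),
    ("vault:B", "partOfProject", "proj:Lyon"),
    ("vault:C", "hasTechnique", "tech:barrel"),
]

def same_technique(triples):
    """Join two patterns on the technique: cross-correlate vaults that share
    a construction technique, possibly across different projects."""
    pairs = []
    for a in match(triples, ("?v1", "hasTechnique", "?t")):
        for b in match(triples, ("?v2", "hasTechnique", "?t2")):
            if a["?t"] == b["?t2"] and a["?v1"] < b["?v2"]:
                pairs.append((a["?v1"], b["?v2"], a["?t"]))
    return pairs
```

A real implementation would express the same join as a SPARQL basic graph pattern against an RDF store, with the ontologies constraining which predicates and classes are valid.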


2020 ◽  
Vol 17 (3) ◽  
pp. 172988142092160
Author(s):  
Vinayak Jagtap ◽  
Shlok Agarwal ◽  
Ameya Wagh ◽  
Michael Gennert

Humanoid robotics is a complex and highly diverse field. Humanoid robots may have dozens of sensors and actuators that together realize complicated behaviors. Adding to the complexity is that each type of humanoid has unique application program interfaces, thus software written for one humanoid does not easily transport to others. This article introduces the transportable open-source application program interface and user interface for generic humanoids, a set of application program interfaces that simplifies the programming and operation of diverse humanoid robots. These application program interfaces allow for quick implementation of complex tasks and high-level controllers. Transportable open-source application program interface and user interface for generic humanoids has been developed for, and tested on, Boston Dynamics’ Atlas V5 and NASA’s Valkyrie R5 robots. It has proved successful for experiments on both robots in simulation and hardware, demonstrating the seamless integration of manipulation, perception, and task planning. To encourage the rapid adoption of transportable open-source application program interface and user interface for generic humanoids for education and research, the software is available as Docker images, which enable quick setup of multiuser simulation environments.

