Measurement of hadron form factors at BESIII

2018 ◽  
Vol 192 ◽  
pp. 00023
Author(s):  
Christoph Florian Redmer

The BESIII experiment, operated at the BEPCII e+e- collider in Beijing, has acquired large data sets at center-of-mass energies between 2.0 GeV and 4.6 GeV. One of the key aspects of the physics program of the BESIII collaboration is to test the understanding of QCD at intermediate energies. Using different experimental techniques, the collaboration measures form factors of hadrons, among them the pion form factor, an important input to the (g - 2)μ puzzle, and the electromagnetic form factors of nucleons and hyperons in the time-like regime. An overview of recent results and some ongoing studies at BESIII is provided.
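For orientation, the connection between these measurements and (g - 2)μ can be summarized by two standard relations, quoted here as general background rather than from the proceedings themselves: the Born cross section for e+e- → π+π- is governed by the pion form factor, and the leading-order hadronic vacuum-polarization contribution to the muon anomaly is a dispersion integral over the bare hadronic cross section,

$$
\sigma(e^+e^-\to\pi^+\pi^-)=\frac{\pi\alpha^2}{3s}\,\beta_\pi^3\,\bigl|F_\pi(s)\bigr|^2,
\qquad \beta_\pi=\sqrt{1-4m_\pi^2/s},
$$
$$
a_\mu^{\mathrm{HVP,LO}}=\frac{1}{4\pi^3}\int_{s_{\mathrm{thr}}}^{\infty}\mathrm{d}s\,K(s)\,\sigma^{0}_{\mathrm{had}}(s),
$$

where K(s) is the known QED kernel function. The π+π- channel dominates the integral, which is why the pion form factor is such a critical input.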

2019 ◽  
Vol 218 ◽  
pp. 03004
Author(s):  
Christoph Florian Redmer

The two-photon physics program of the BESIII Collaboration is mainly motivated by the need for new measurements of transition form factors as input to Standard Model calculations of $a_\mu^{HLbL}$, the hadronic light-by-light scattering contribution to the anomalous magnetic moment of the muon. The large data sets acquired at BESIII allow the transition form factors of pseudoscalar mesons to be studied in the region of momentum transfer relevant for $a_\mu^{HLbL}$. In this presentation the status of the respective measurements for π0, η and η′ mesons is discussed, and prospects for studies of multi-meson systems as well as of doubly-virtual transition form factors are given.
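For reference, two standard properties of the singly-virtual pseudoscalar transition form factor bracket the momentum-transfer region probed in such single-tag measurements (general relations, not results of this talk): its normalization at the real-photon point is fixed by the two-photon decay width, and its large-Q² behaviour is constrained by the Brodsky-Lepage limit, quoted here for the π0 (for η and η′ mixing effects enter),

$$
\Gamma(P\to\gamma\gamma)=\frac{\pi\alpha^2}{4}\,m_P^3\,\bigl|F_{P\gamma\gamma}(0,0)\bigr|^2,
\qquad
\lim_{Q^2\to\infty} Q^2\,F_{\pi^0\gamma^*\gamma}(Q^2)=2f_\pi .
$$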


2018 ◽  
Vol 172 ◽  
pp. 03002
Author(s):  
Haiming HU

Measurements of the hadronic form factors of three processes, based on data samples collected with the BESIII detector at the BEPCII collider, are presented. The cross section of e+e- → p p̅ is measured at 12 energies from 2232.4 to 3671.0 MeV, the electromagnetic form factor is deduced, and the ratio |GE/GM| is extracted by fitting the polar angle distribution. Preliminary results on the form factors of e+e- → Λc+ Λ̄c- are also described. The cross section of e+e- → π+π- between effective center-of-mass energies of 600 and 900 MeV is measured with the initial-state-radiation (ISR) return method, using the data set with an integrated luminosity of 2.93 fb-1 taken at the ψ(3770) peak, and the pion form factor is extracted.
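The |GE/GM| extraction quoted above rests on the Born-level formulas for e+e- annihilation into a pair of spin-1/2 baryons of mass m_B (standard expressions, reproduced here for orientation):

$$
\frac{\mathrm{d}\sigma}{\mathrm{d}\cos\theta}\;\propto\;|G_M(s)|^2\,(1+\cos^2\theta)+\frac{4m_B^2}{s}\,|G_E(s)|^2\,\sin^2\theta,
$$
$$
\sigma(s)=\frac{4\pi\alpha^2\beta C}{3s}\left[\,|G_M(s)|^2+\frac{2m_B^2}{s}\,|G_E(s)|^2\right],
\qquad \beta=\sqrt{1-4m_B^2/s},
$$

where C is the Coulomb correction factor for charged baryons. Fitting the cosθ distribution therefore determines |GE/GM|, while the total cross section fixes the effective form factor.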


2020 ◽  
Vol 53 (5) ◽  
pp. 1387-1391
Author(s):  
Matt Thompson

Nanostructure characterization using small-angle scattering is often performed by iteratively fitting a scattering model to experimental data. These scattering models are usually derived in part from the form factors of the expected shapes of the particles. Most small-angle-scattering pattern-fitting software is well equipped with form factor libraries for high-symmetry models, yet there is more limited support for distortions to these ideals that are more typically found in nature. Here, a means of generalizing high-symmetry form factors to these lower-symmetry cases via linear transformations is introduced, significantly expanding the range of form factors available to researchers. These linear transformations are composed of a series of scaling, shear, rotation and inversion operations, enabling particle distortions to be understood in a straightforward and intuitive way. This approach is expected to be especially useful for in situ studies of nanostructure growth where anisotropic structures change continuously and large data sets must be analysed.
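The linear-transformation idea can be made concrete with the Fourier scaling theorem: if a particle shape is mapped by an invertible matrix A (x → Ax), its scattering amplitude becomes F'(q) = |det A| F(Aᵀq). The sketch below is not the paper's code; the function names and the example matrix are illustrative. It applies the theorem to the analytic sphere amplitude to obtain a sheared ellipsoid.

```python
import numpy as np

def sphere_amplitude(q, radius=1.0):
    """Scattering amplitude of a homogeneous sphere, normalized so F(0) = volume."""
    qr = np.linalg.norm(q) * radius
    volume = 4.0 / 3.0 * np.pi * radius**3
    if qr < 1e-8:
        return volume
    return volume * 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3

def transformed_amplitude(q, base_amplitude, A):
    """
    Amplitude of a particle whose shape is the base shape mapped by x -> A x.
    Fourier scaling theorem: F'(q) = |det A| * F(A^T q).
    """
    return abs(np.linalg.det(A)) * base_amplitude(A.T @ q)

# Example: stretch a unit sphere into an ellipsoid with semi-axes (1, 2, 3)
# and add a small shear in the x-y plane.
scale = np.diag([1.0, 2.0, 3.0])
shear = np.array([[1.0, 0.3, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
A = shear @ scale

q = np.array([0.5, 0.2, 0.1])   # scattering vector, arbitrary units
print(transformed_amplitude(q, sphere_amplitude, A))
```

Composing scaling, shear, rotation and inversion matrices in this way gives exactly the intuitive decomposition of particle distortions described in the abstract.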


Author(s):  
Б.А. Абжалова ◽  
А.Е. Шахарова ◽  
B. Abzhalova ◽  
A. Shakharova

The article discusses the key aspects of informatization of the external state audit bodies in the Republic of Kazakhstan, which is assessed as being at a fairly high level. However, the authors note that the analysis of large data sets is not currently feasible because the data are stored in disparate sources and are often of poor quality, inaccurate or obsolete. Given the huge amount of information that must be analysed continuously to ensure fast and accurate decisions, effective state audit cannot exist and develop without a highly efficient management system based on digital technologies. The article analyses the main results of the transformation of state audit over 2015-2019 and identifies the main directions for improving the activities of external state audit bodies through the use of modern digital technologies. The authors also draw conclusions and suggest ways to solve a number of problems in the informatization of state audit bodies, in particular the Accounts Committee of the Republic of Kazakhstan. For further digital transformation of audit activities, it is proposed to increase the efficiency of the existing information system and to create a qualitatively new unified digital transaction environment by integrating the databases of state bodies.


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets that are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10at%Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra offset in energy by 1 eV were recorded and stored at each pixel in the 80x80 spectrum-image (25 Mbytes). An energy range of 39-89 eV (20 channels/eV) is represented. During processing the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
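A minimal NumPy sketch of the two processing routes described above, using synthetic data and hypothetical array names; the actual software is that of reference [1], and the sign of the channel shift depends on the acquisition convention.

```python
import numpy as np

# Hypothetical spectrum-image: 80 x 80 pixels, two 1024-channel EELS spectra
# per pixel, recorded with a 1 eV energy offset (20 channels at 20 channels/eV).
CHANNELS_PER_EV = 20
OFFSET_CHANNELS = 1 * CHANNELS_PER_EV

spectra_a = np.random.poisson(100.0, size=(80, 80, 1024)).astype(float)
spectra_b = np.random.poisson(100.0, size=(80, 80, 1024)).astype(float)

# Difference spectrum: subtracting the two offset acquisitions cancels
# channel-to-channel detector gain artifacts common to both recordings.
difference = spectra_a - spectra_b

# "Normal" spectrum: shift one acquisition back by the known offset and add,
# so both contribute to the same energy axis (edge channels are discarded).
aligned_b = spectra_b[..., OFFSET_CHANNELS:]
aligned_a = spectra_a[..., :-OFFSET_CHANNELS]
normal = aligned_a + aligned_b

print(difference.shape, normal.shape)   # (80, 80, 1024) (80, 80, 1004)
```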


Author(s):  
Thomas W. Shattuck ◽  
James R. Anderson ◽  
Neil W. Tindale ◽  
Peter R. Buseck

Individual particle analysis involves the study of tens of thousands of particles using automated scanning electron microscopy and elemental analysis by energy-dispersive x-ray emission spectroscopy (EDS). EDS produces large data sets that must be analyzed using multivariate statistical techniques. A complete study uses cluster analysis, discriminant analysis, and factor or principal components analysis (PCA). The three techniques are used in the study of particles sampled during the FeLine cruise to the mid-Pacific ocean in the summer of 1990. The mid-Pacific aerosol provides information on long-range particle transport, iron deposition, sea salt ageing, and halogen chemistry.

Aerosol particle data sets suffer from a number of difficulties for pattern recognition using cluster analysis. There is a great disparity in the number of observations per cluster and in the range of the variables in each cluster. The variables are not normally distributed, they are subject to considerable experimental error, and many values are zero because of finite detection limits. Many of the clusters show considerable overlap because of natural variability, agglomeration, and chemical reactivity.
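As an illustration of the kind of multivariate pipeline involved, the sketch below uses synthetic data and scikit-learn routines; it is not the study's actual software. It log-transforms and standardizes zero-inflated element intensities before PCA and clustering, one common way of coping with the skewed distributions and detection-limit zeros mentioned above.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical EDS data: rows are particles, columns are element intensities.
rng = np.random.default_rng(0)
X = rng.gamma(shape=0.5, scale=100.0, size=(10_000, 12))
X[X < 5.0] = 0.0                      # mimic finite detection limits (many zeros)

# log1p compresses the large dynamic range and tolerates the zeros;
# standardization keeps high-abundance elements from dominating the PCA.
Z = StandardScaler().fit_transform(np.log1p(X))

pca = PCA(n_components=4)
scores = pca.fit_transform(Z)

labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))            # cluster sizes are typically very unequal
```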


Author(s):  
Mykhajlo Klymash ◽  
Olena Hordiichuk — Bublivska ◽  
Ihor Tchaikovskyi ◽  
Oksana Urikova

This article investigates the processing of large arrays of information in distributed systems. Singular value decomposition is used to reduce the amount of data processed by eliminating redundancy. Dependences of computational efficiency on the size of the distributed system were obtained using the MPI message-passing protocol and the MapReduce model of node interaction, and the efficiency of each technology was analyzed for different data sizes. Non-distributed systems are inefficient for large volumes of information because of their low computing performance, so it is proposed to use distributed systems together with singular value decomposition, which reduces the amount of information processed. The study of systems based on the MPI protocol and the MapReduce model yielded the dependence of calculation time on the number of processes, which confirms the expediency of distributed computing for large data sets. It was also found that distributed systems using the MapReduce model work much more efficiently than MPI, especially with large amounts of data, whereas MPI performs calculations more efficiently for small amounts of information. As the data sets grow, it is advisable to use the MapReduce model.
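A minimal sketch of the data-reduction step, assuming a truncated singular value decomposition is applied to a data block before it is exchanged between nodes (illustrative NumPy code, not the authors' implementation):

```python
import numpy as np

# Keep only the k largest singular values so that downstream nodes
# exchange and process far less data than the full matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((5_000, 400))   # hypothetical data block

k = 20
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] * s[:k] @ Vt[:k, :]      # rank-k approximation

stored = (U[:, :k].size + k + Vt[:k].size) / A.size
error = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"stored fraction: {stored:.3f}, relative reconstruction error: {error:.3f}")
```

The trade-off is explicit: the stored fraction shrinks with k, at the cost of a larger reconstruction error, which is why the truncation rank has to be chosen against the redundancy actually present in the data.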


2018 ◽  
Vol 2018 (6) ◽  
pp. 38-39
Author(s):  
Austa Parker ◽  
Yan Qu ◽  
David Hokanson ◽  
Jeff Soller ◽  
Eric Dickenson ◽  
...  

Computers ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 47
Author(s):  
Fariha Iffath ◽  
A. S. M. Kayes ◽  
Md. Tahsin Rahman ◽  
Jannatul Ferdows ◽  
Mohammad Shamsul Arefin ◽  
...  

A programming contest generally involves the host presenting a set of logical and mathematical problems to the contestants, who are required to write computer programs capable of solving them. An online judge system is used to automate the judging of the programs submitted by the users; online judges are systems designed for the reliable evaluation of submitted source code. Traditional online judging platforms are not ideally suited for programming labs, as they do not support partial scoring or efficient detection of plagiarized code. In view of this, in this paper we present an online judging framework capable of automatically scoring code by efficiently detecting plagiarized content and the level of accuracy of the code. Our system detects plagiarism by computing fingerprints of programs and comparing the fingerprints instead of the whole files. We use winnowing to select fingerprints among the k-gram hash values of a source code, generated with the Rabin–Karp algorithm. The proposed system is compared with existing online judging platforms to show its superiority in terms of time efficiency, correctness, and feature availability. In addition, we evaluated our system on large data sets and compared its run time with MOSS, a widely used plagiarism detection tool.
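A minimal sketch of the fingerprinting scheme described above (illustrative Python, not the authors' implementation): k-gram hashes are computed with a Rabin–Karp rolling hash, winnowing keeps the rightmost minimum of each window, and two submissions are compared by the overlap of their fingerprint sets.

```python
def kgram_hashes(text: str, k: int, base: int = 257, mod: int = (1 << 31) - 1):
    """Rolling (Rabin-Karp style) hashes of all k-grams of `text`."""
    if len(text) < k:
        return []
    h = 0
    for ch in text[:k]:
        h = (h * base + ord(ch)) % mod
    hashes = [h]
    top = pow(base, k - 1, mod)
    for i in range(k, len(text)):
        h = (h - ord(text[i - k]) * top) % mod   # drop the leftmost character
        h = (h * base + ord(text[i])) % mod      # append the new character
        hashes.append(h)
    return hashes

def winnow(hashes, w: int):
    """Select fingerprints: the rightmost minimum hash from each window of size w."""
    fingerprints = set()
    for i in range(len(hashes) - w + 1):
        window = hashes[i:i + w]
        m = min(window)
        pos = i + len(window) - 1 - window[::-1].index(m)  # rightmost minimum
        fingerprints.add((pos, m))
    return fingerprints

def similarity(src_a: str, src_b: str, k: int = 5, w: int = 4) -> float:
    """Jaccard overlap of the two fingerprint sets (positions ignored)."""
    fa = {h for _, h in winnow(kgram_hashes(src_a, k), w)}
    fb = {h for _, h in winnow(kgram_hashes(src_b, k), w)}
    return len(fa & fb) / max(1, len(fa | fb))

print(similarity("for i in range(10): print(i)",
                 "for j in range(10): print(j)"))
```

Comparing fingerprint sets rather than whole files is what keeps the pairwise comparison cheap even when the number of submissions is large.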

