Computing Performance
Recently Published Documents

TOTAL DOCUMENTS: 238 (FIVE YEARS: 98)
H-INDEX: 11 (FIVE YEARS: 4)

2022
Author(s): David Rutishauser, Gavin Mendeck, Ray Ramadorai, John Prothro, Thadeus Fleming, ...

2021 · Vol 23 (2) · pp. 32-39
Author(s): Iulia Crișan, Florin Alin Sava, Laurențiu Paul Maricuțoiu

Objective: Two experimental studies were conducted to compare the ability of immediate and delayed recall indicators to discriminate between the performances of simulators and of full-effort clinical and nonclinical participants. Methods: Three groups of simulators (uncoached, symptom-coached, and test-coached), one group of community controls, and one group of cognitively impaired patients were assessed with four experimental memory tests in which the immediate and delayed recall tasks were separated by three other tasks. Results: Across both studies, delayed recall classified simulated performances as invalid more accurately than immediate recall did, relative to the performances of bona fide clinical participants. ROC curve results showed sensitivities below 50% for both indicators at specificities of ≥90%. Computing performance curves across recall trials revealed descending trends for all three simulator groups, indicating a suppressed learning effect as a marker of noncredible performance. Among the coaching types, test coaching decreased the differences between simulators and patients. Discussion: The effectiveness of such indicators in clinical evaluations and their vulnerability to information about test-taking strategies are discussed.
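
To make the reported operating point concrete, here is a minimal Python sketch of reading sensitivity off a ROC curve under a 90% specificity floor. It uses scikit-learn for illustration (the study does not state its tooling), and the variable names are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_curve

def sensitivity_at_specificity(y_true, scores, min_specificity=0.90):
    """Best sensitivity (true-positive rate) achievable while keeping
    specificity (1 - false-positive rate) at or above the floor."""
    # scores must be oriented so that higher values indicate the positive
    # class (here: 1 = simulator, 0 = bona fide participant)
    fpr, tpr, _ = roc_curve(y_true, scores)
    ok = (1.0 - fpr) >= min_specificity   # operating points meeting the floor
    return float(tpr[ok].max()) if ok.any() else 0.0
```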


2021
Author(s): Raihan Ur Rasool, Hafiz Farooq Ahmad, Wajid Rafique, Adnan Qayyum, Junaid Qadir

Quantum computing is an emerging field of research that can provide a “quantum leap” in computing performance and thereby enable many exciting new healthcare applications, such as rapid DNA sequencing, drug research and discovery, personalized medicine, molecular simulations, diagnosis assistance, and efficient radiotherapy. In this paper, we provide a taxonomy of the existing literature on quantum healthcare systems and identify the key requirements of quantum computing implementations in the healthcare paradigm. We also provide a thorough exploration of the application areas where quantum computing could transform traditional healthcare systems. Finally, we perform an extensive study of quantum cryptography from the perspective of healthcare systems to identify security vulnerabilities in traditional cryptography systems.


2021
Author(s): Feng Deng, Zhong Su, Rui Wang, Jun Liu, Yanzhi Wang

Most existing infrared imaging systems employ FPGA or FPGA+DSP schemes with numerous peripheral circuits, which leads to complex hardware architectures, limited system versatility, and low computing performance. Simplifying the system structure while improving imaging performance has become an intriguing technical problem worldwide. In this paper, we present a novel real-time infrared imaging system based on Rockchip's RV1108 visual processing SoC (system on chip). Moreover, to address the low contrast and dim details of infrared images with a high dynamic range, an adaptive contrast enhancement method based on a bilateral filter is proposed and implemented on the system. First, the infrared image is divided into a base layer and a detail layer by a bilateral filter; then, the base layer is compressed by an adaptive bi-plateau histogram equalization algorithm; finally, a linear weighting is used to reintegrate the detail layer and obtain an image with enhanced details. The experimental results indicate that, compared with traditional algorithms, our method effectively improves the overall contrast of the image while retaining image details without amplifying noise. For an image of 320×240 pixels, the real-time processing rate of the system is 68 frames/s. The system is characterized by a simplified structure, perceptible image details, and high computing performance.
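
The pipeline in the abstract decomposes the image with a bilateral filter, equalizes the base layer under plateau limits, and re-merges the detail layer with a linear weight. A minimal Python/OpenCV sketch of that structure follows; the fixed plateau thresholds, filter sigmas, and detail gain are illustrative stand-ins for the paper's adaptive values.

```python
import cv2
import numpy as np

def enhance_ir(img, up_plateau=2000, low_plateau=10, detail_gain=2.0):
    """Sketch of the described pipeline: bilateral base/detail split,
    plateau-limited equalization of the base, weighted detail re-merge."""
    img = img.astype(np.float32)
    base = cv2.bilateralFilter(img, 9, 30.0, 9.0)   # edge-preserving smoothing
    detail = img - base                             # small-scale structure

    # Bi-plateau histogram equalization of the base layer: clip bin counts
    # into [low, up] so sparse bins do not vanish from the mapping and
    # dense background bins do not dominate it.
    b = np.clip(base, 0, None).astype(np.uint16)
    hist = np.bincount(b.ravel()).astype(np.float64)
    nz = hist > 0
    hist[nz] = np.clip(hist[nz], low_plateau, up_plateau)
    cdf = np.cumsum(hist)
    lut = cdf / cdf[-1] * 255.0                     # map to 8-bit display range
    base_eq = lut[b]

    out = base_eq + detail_gain * detail            # linear-weighted re-merge
    return np.clip(out, 0, 255).astype(np.uint8)
```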




2021 · Vol 21 (2) · pp. 234-246
Author(s): M.A. Padalko, Yu.A. Shevchenko, ...

An algorithm for the parallel exact calculation of the ground state of the two-dimensional Edwards–Anderson model with free boundary conditions is given. The running time of the algorithm grows exponentially as the side of a square lattice increases; if one side of the lattice is fixed, the running time grows polynomially with the size of the other side. The distribution of spin bonds can be either bimodal or Gaussian, and performance data are given for the bimodal distribution. The method makes it possible to compute systems of up to 40×40 spins, and it may find application in the theory of spin glasses and in the field of quantum computing.
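
The stated scaling (exponential in one fixed side, polynomial in the other) is characteristic of a column-by-column dynamic program over spin configurations. Below is a minimal serial Python sketch of that idea for ±1 bonds; it illustrates the complexity structure only, not the authors' parallel implementation.

```python
import itertools
import random

def ea_ground_state_energy(Jv, Jh, L, W):
    """Exact ground-state energy of an L x W Edwards-Anderson lattice with
    free boundaries, via dynamic programming over column configurations.
    Jv[i][j] couples spins (i, j)-(i+1, j); Jh[i][j] couples (i, j)-(i, j+1).
    Runtime is O(W * 4^L): exponential in the fixed side, linear in the other."""
    configs = list(itertools.product((-1, 1), repeat=L))

    def intra(c, j):        # vertical bonds inside column j
        return -sum(Jv[i][j] * c[i] * c[i + 1] for i in range(L - 1))

    def inter(c, cn, j):    # horizontal bonds between columns j and j+1
        return -sum(Jh[i][j] * c[i] * cn[i] for i in range(L))

    best = {c: intra(c, 0) for c in configs}
    for j in range(W - 1):
        best = {cn: min(best[c] + inter(c, cn, j) for c in configs) + intra(cn, j + 1)
                for cn in configs}
    return min(best.values())

# Usage with random bimodal (+/-1) bonds on a small lattice:
L, W = 4, 6
Jv = [[random.choice((-1, 1)) for _ in range(W)] for _ in range(L - 1)]
Jh = [[random.choice((-1, 1)) for _ in range(W - 1)] for _ in range(L)]
print(ea_ground_state_energy(Jv, Jh, L, W))
```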


Entropy · 2021 · Vol 23 (12) · pp. 1620
Author(s): Airton Borin, Anne Humeau-Heurtier, Luiz Virgílio Silva, Luiz Murta

Multiscale entropy (MSE) analysis is a fundamental approach for assessing the complexity of a time series by estimating its information creation over a range of temporal scales. However, MSE may not be accurate or valid for short time series, which is why previous studies have applied various derivations of the algorithm to short-term data; no study, however, has systematically analyzed and compared their reliability. This study compares the MSE algorithm variations adapted to short time series on both human and rat heart rate variability (HRV) series, using long-term MSE as the reference. The most widely used variations of MSE are studied: composite MSE (CMSE), refined composite MSE (RCMSE), modified MSE (MMSE), and their fuzzy versions. We also analyze the errors in MSE estimation over a range of incorporated fuzzy exponents. The results show that the fuzzy MSE versions present minimal errors, as a function of time series length, compared to the non-fuzzy algorithms. The traditional multiscale entropy algorithm with fuzzy counting (MFE) has accuracy similar to the alternative algorithms, with better computing performance. For the best accuracy, the findings suggest different fuzzy exponents according to the time series length.
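
For orientation, here is a minimal Python/NumPy sketch of the classic (non-fuzzy) MSE recipe that these variants build on: coarse-grain the series by non-overlapping averaging, then compute sample entropy at each scale. The defaults m = 2 and r = 0.15·SD are conventional choices, not values taken from the paper.

```python
import numpy as np

def sample_entropy(x, m, r):
    """SampEn(m, r): -ln of the conditional probability that templates close
    in Chebyshev distance (<= r) for m points remain close for m + 1 points."""
    n = len(x)
    def count(k):
        # Same number of templates for k = m and k = m + 1 (Richman & Moorman).
        t = np.array([x[i:i + k] for i in range(n - m)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)  # O(n^2) memory
        return np.sum(d <= r) - len(t)   # subtract self-matches on the diagonal
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=10, m=2, r=None):
    """Classic MSE: fix r from the original series, coarse-grain by
    non-overlapping means, and evaluate SampEn at every scale."""
    x = np.asarray(x, dtype=float)
    r = 0.15 * x.std() if r is None else r
    curve = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        curve.append(sample_entropy(coarse, m, r))
    return curve
```

The short-series problem is visible in this sketch: at scale tau, only len(x) // tau coarse-grained points remain, so the pairwise match counts become sparse and the SampEn estimate unstable, which is what the composite and fuzzy variants try to mitigate.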


2021 · Vol 5 (9 (113)) · pp. 17-29
Author(s): Andrii Sahun, Vladyslav Khaidurov, Valeriy Lakhno, Ivan Opirskyy, Vitalii Chubaievskyi, ...

This paper analyzes ways to improve the cryptographic strength of the symmetric block cipher RC5. Strengthening the classic RC5 cipher matters because it is included in various open cryptographic libraries and is frequently used in practice. Several methods that theoretically improve the strength of the cryptographic transformations were considered. It was found that, unlike the alternatives (increasing the number of rounds, the key length, or the encryption block size), the use of nonlinear shift functions does not increase the computational complexity of the RC5 algorithm. The result of the study is an analytical model implemented as a MATLAB software application whose interface allows the encryption parameters of the RC5 crypto algorithm to be changed manually. The resulting upgrade of the RC5 crypto algorithm was tested on different sets of input data during encryption and decryption. The modification does not increase the computation time, yet it improves resistance to attacks on the encrypted data by several orders of magnitude (2^10) against differential analysis methods with 14 rounds. For one of the nonlinear functions used, resistance to differential cryptanalysis increased by a factor of 2^12 as early as the eleventh round of encryption. The reliability of the improved cryptosystem is confirmed by the absence of statistical correlation between incoming message blocks and output blocks, and by the absence of collisions in which different input messages could produce the same output bit sequences. The resulting algorithm could be applied in computer systems with low computing performance.
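
For reference, the round structure the paper modifies is the classic RC5 data-dependent rotation. A minimal Python sketch of standard RC5 encryption (word size 32, with the expanded key table S assumed to be precomputed by the usual key schedule) is shown below; the paper's nonlinear shift functions would replace the rotl calls, which this sketch does not attempt to reproduce.

```python
W = 32                # word size in bits (RC5-32)
MASK = (1 << W) - 1

def rotl(x, n):
    """Rotate the W-bit word x left by n (mod W) positions."""
    n %= W
    return ((x << n) | (x >> (W - n))) & MASK

def rc5_encrypt_block(A, B, S, rounds=14):
    """Classic RC5 encryption of one two-word block (Rivest, 1994).
    S is the expanded key table of 2 * (rounds + 1) words, assumed to be
    produced by the standard RC5 key schedule."""
    A = (A + S[0]) & MASK
    B = (B + S[1]) & MASK
    for i in range(1, rounds + 1):
        A = (rotl(A ^ B, B) + S[2 * i]) & MASK        # data-dependent rotation:
        B = (rotl(B ^ A, A) + S[2 * i + 1]) & MASK    # the step targeted for replacement
    return A, B
```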


Informatics · 2021 · Vol 8 (4) · pp. 71
Author(s): János Végh

Today's computing is based on the classic paradigm proposed by John von Neumann three-quarters of a century ago. That paradigm, however, was justified only for (the timing relations of) vacuum tubes. Technological development has invalidated the classic paradigm (but not the model!), leading to catastrophic performance losses in computing systems, from the gate level to large networks, including neuromorphic ones. The model is perfect, but the paradigm is applied outside its range of validity. The classic paradigm is completed here by providing the "procedure" missing from the "First Draft", enabling computing science to handle cases where the transfer time is not negligible compared to the processing time. The paper reviews whether the implemented computing processes can be described using an accurate interpretation of the computing model, and whether the issues experienced in different fields of today's computing can be explained by discarding the inappropriate omissions. Furthermore, it discusses some consequences of improper technological implementations, from shared media to parallelized operation, and suggests ideas on how computing performance could be improved to meet growing societal demands.
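
The quantitative core of the argument is that observable performance is bounded by the ratio of processing time to total (processing plus transfer) time. A toy Python model of that ratio, with made-up timing values purely for illustration and no claim to reproduce the paper's formal apparatus, is sketched below.

```python
def temporal_efficiency(t_processing, t_transfer):
    """Fraction of wall-clock time spent on payload processing once data
    transfer can no longer be neglected (toy model, illustrative only)."""
    return t_processing / (t_processing + t_transfer)

# Made-up numbers: a 1 ns operation whose operands take 4 ns to move
# stays at 20% efficiency no matter how fast the gate itself becomes.
print(temporal_efficiency(1e-9, 4e-9))   # 0.2
```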

