Computing Model
Recently Published Documents





Informatics ◽  
2021 ◽  
Vol 8 (4) ◽  
pp. 71
János Végh

Today’s computing is based on the classic paradigm proposed by John von Neumann three-quarters of a century ago. That paradigm, however, was justified only for (the timing relations of) vacuum tubes. Technological development has invalidated the classic paradigm (but not the model!), leading to catastrophic performance losses in computing systems, from the gate level to large networks, including neuromorphic ones. The model is perfect, but the paradigm is applied outside its range of validity. The classic paradigm is completed here by providing the “procedure” missing from the “First Draft”, enabling computing science to work with cases where the transfer time is not negligible compared with the processing time. The paper reviews whether we can describe the implemented computing processes using an accurate interpretation of the computing model, and whether we can explain the issues experienced in different fields of today’s computing by correcting the improper omissions. Furthermore, it discusses some consequences of improper technological implementations, from shared media to parallelized operation, and suggests ideas on how computing performance could be improved to meet growing societal demands.
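
The abstract's central claim, that efficiency collapses once transfer time is no longer negligible compared with processing time, can be illustrated with a minimal sketch. The two-term model below is an assumption for illustration, not the paper's own formalism:

```python
def temporal_efficiency(processing_time, transfer_time):
    """Fraction of wall-clock time spent on useful processing, in a toy
    two-term model: total time = processing + (non-overlapped) transfer.
    When transfer is negligible -- the vacuum-tube-era assumption -- the
    value approaches 1; when transfer rivals processing, it collapses."""
    return processing_time / (processing_time + transfer_time)

# Classic-paradigm regime: transfer negligible next to processing.
print(temporal_efficiency(1.0, 0.001))   # ~0.999

# Modern regime: transfer comparable to processing.
print(temporal_efficiency(1.0, 1.0))     # 0.5
```

With the same processing time, merely making the transfer term comparable halves the achievable efficiency, which is the qualitative effect the abstract attributes to applying the paradigm outside its range of validity.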

2021 ◽  
Vol 21 (15&16) ◽  
pp. 1296-1306
Seyed Mousavi

Our computers today, from sophisticated servers to small smartphones, operate on the same computing model, which requires running a sequence of discrete instructions specified as an algorithm. This sequential computing paradigm has not yet led to a fast algorithm for an NP-complete problem, despite numerous attempts over the past half-century. Unfortunately, even after the introduction of quantum mechanics to the world of computing, we have still followed a similar sequential paradigm, which has not yet helped us obtain such an algorithm either. Here, a completely different model of computing is proposed: replacing the sequential paradigm of algorithms with the inherent parallelism of physical processes. Using the proposed model, instead of writing algorithms to solve NP-complete problems, we construct physical systems whose equilibrium states correspond to the desired solutions and let them evolve to search for those solutions. The main requirements of the model are identified, and quantum circuits are proposed for its potential implementation.
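
The idea of encoding solutions as equilibrium states can be sketched classically: define an energy function over configurations whose ground states (energy 0) are exactly the solutions, then let the system relax. The toy SAT instance and the deterministic greedy "relaxation" below are illustrative assumptions, not the paper's quantum construction:

```python
# A small SAT instance (NP-complete in general). Each clause is a list of
# (variable_index, is_negated) literals; the energy counts unsatisfied
# clauses, so energy 0 marks an equilibrium/solution state.
CLAUSES = [
    [(0, False), (1, False)],   # x0 or x1
    [(0, True),  (2, False)],   # not x0 or x2
    [(1, True),  (2, True)],    # not x1 or not x2
]

def energy(state):
    """Number of unsatisfied clauses under the given truth assignment."""
    return sum(1 for clause in CLAUSES
               if not any(state[i] != neg for i, neg in clause))

def relax(state):
    """Greedy relaxation: flip whichever bit lowers the energy most,
    until no single flip helps -- a deterministic stand-in for letting
    the physical system evolve toward equilibrium."""
    while True:
        best = min(range(len(state)),
                   key=lambda i: energy(state[:i] + [not state[i]] + state[i+1:]))
        candidate = state[:best] + [not state[best]] + state[best+1:]
        if energy(candidate) >= energy(state):
            return state
        state = candidate

solution = relax([False, False, False])
print(solution, energy(solution))
```

For this tiny instance the descent reaches a ground state directly; the model described in the abstract relies on physical parallelism rather than such sequential flips, which is precisely the distinction it draws.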

2022 ◽  
Vol 54 (8) ◽  
pp. 1-36
Jinglin Zou ◽  
Debiao He ◽  
Sherali Zeadally ◽  
Neeraj Kumar ◽  
Huaqun Wang ◽  

Cloud computing is a network model of on-demand access to shared pools of configurable computing resources. Compared with conventional service architectures, cloud computing introduces new security challenges in secure service management and control, privacy protection, data integrity protection in distributed databases, data backup, and synchronization. Blockchain can be leveraged to address these challenges, partly due to underlying characteristics such as transparency, traceability, decentralization, security, immutability, and automation. We present a comprehensive survey of how blockchain is applied to provide security services in the cloud computing model, and we analyze the research trends of blockchain-related techniques in current cloud computing models. During this review, we also briefly investigate how cloud computing can affect blockchain, especially the performance improvements that cloud computing can provide for blockchain. Our contributions include the following: (i) summarizing the possible architectures and models for integrating blockchain and cloud computing, and the roles of cloud computing in blockchain; (ii) classifying and discussing recent, relevant works based on different blockchain-based security services in the cloud computing model; (iii) briefly investigating what improvements cloud computing can provide for blockchain; (iv) introducing the current development status of the industry and major cloud providers in combining cloud and blockchain; (v) analyzing the main barriers and challenges of integrated blockchain and cloud computing systems; and (vi) providing recommendations for future research and improvement on the integration of blockchain and cloud systems.
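
The data-integrity property the survey attributes to blockchain comes from hash-chaining: each block commits to its predecessor's hash, so tampering with any record invalidates every later link. A minimal sketch (illustrative record names, not from the survey):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents together with its predecessor's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records):
    """Link each record to the hash of the block before it."""
    chain, prev = [], "0" * 64
    for record in records:
        block = {"data": record, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Recompute every link; any edited block breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["backup-manifest-1", "backup-manifest-2"])
print(verify(chain))           # True
chain[0]["data"] = "tampered"  # simulate an integrity violation
print(verify(chain))           # False
```

This immutability is what makes blockchain attractive for the distributed-database integrity and backup scenarios the survey covers; production chains add consensus and signatures on top of the same linking idea.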

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Saurabh Kumar

Purpose: Decision-making in human beings is affected by emotions and sentiments. Affective computing takes this into account, intending to tailor decision support to the emotional states of people. However, representing and classifying emotions is a very challenging task. The study used customized deep learning models to aid the accurate classification of emotions and sentiments.
Design/methodology/approach: The present study presents an affective computing model using both text and image data. Text-based affective computing was conducted on four standard datasets using three customized deep learning models, namely LSTM, GRU and CNN. The study used four variants of deep learning, including the LSTM model, the LSTM model with GloVe embeddings, the bi-directional LSTM model and the LSTM model with an attention layer.
Findings: The results suggest that the proposed method outperforms earlier methods. For image-based affective computing, data were extracted from Instagram, and facial emotion recognition was carried out using three deep learning models, namely CNN, transfer learning with the VGG-19 model and transfer learning with the ResNet-18 model. The results suggest that the proposed methods for both text and image can be used for affective computing and aid decision-making.
Originality/value: Unlike earlier studies, which applied machine learning algorithms to affective computing, the present study uses deep learning.
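
The attention layer mentioned in the approach scores each token's hidden state against a learned query, softmaxes the scores into weights, and pools the states into one sentence vector. The sketch below uses made-up 2-d "hidden states" and a stand-in query vector; it is not the study's trained model:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(token_vectors, query):
    """Dot-product attention pooling: score each token vector against the
    query, normalize the scores into weights, and return the weighted sum
    -- the step an attention layer adds on top of LSTM outputs."""
    scores = [sum(q * t for q, t in zip(query, vec)) for vec in token_vectors]
    weights = softmax(scores)
    pooled = [sum(w * vec[d] for w, vec in zip(weights, token_vectors))
              for d in range(len(query))]
    return pooled, weights

# Toy 2-d hidden states for three tokens (illustrative values only).
hidden = [[0.1, -0.9], [0.2, 0.1], [0.9, 0.8]]
query = [1.0, 1.0]  # stands in for the trained attention parameters
pooled, weights = attention_pool(hidden, query)
print([round(w, 2) for w in weights])
```

The weights sum to 1 and concentrate on the token whose state best matches the query, which is how attention lets a classifier emphasize sentiment-bearing words.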

2021 ◽  
Vol 2021 ◽  
pp. 1-21
Guo Chen ◽  
Zhigui Liu ◽  
Guang Yu ◽  
Jianhong Liang

The multisensor data generalized fusion algorithm is a kind of symbolic computing model with multiple application objects, based on generalized sensor integration; it is the theoretical basis of numerical fusion. This paper aims to comprehensively review generalized fusion algorithms for multisensor data. First, the development and definition of multisensor data fusion are analyzed and a definition of multisensor data generalized fusion is given. Second, the classification of multisensor data fusion is discussed, and the generalized integration structure of multisensors and their data acquisition and representation are given, moving beyond object-oriented research perspectives. Then, the principle and architecture of multisensor data fusion are analyzed, and a generalized multisensor data fusion model based on the JDL model is presented. Finally, according to the multisensor data generalized fusion architecture, related theories and methods are reviewed, a tensor-based generalized fusion algorithm for heterogeneous multisensor data is proposed, and future work is outlined.
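
As a concrete instance of numerical multisensor fusion, the classic inverse-variance rule combines independent readings of one quantity so that less noisy sensors count more. This is a textbook example chosen for illustration, not the paper's tensor-based algorithm:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent sensor readings.
    Each entry is (value, variance); returns (fused_value, fused_variance).
    The fused variance is never larger than the best single sensor's."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total

# Two temperature sensors, the second four times noisier (made-up numbers).
fused, var = fuse([(20.0, 0.5), (22.0, 2.0)])
print(round(fused, 2), round(var, 2))
```

The fused estimate lands closer to the more reliable sensor, and its variance (0.4) is below either input's, which is the basic payoff any fusion architecture, including the JDL-style models the paper reviews, builds on.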

2021 ◽  
Vol 27 (10) ◽  
pp. 542-549
G. Ch. Nabibayova

The article proposes an approach to developing an electronic demographic decision support system using Data Warehouse (DW) and Online Analytical Processing (OLAP) technologies. This makes it possible to conduct high-level demographic research and support decision-makers in the demographic sphere. The article notes that demography is an interdisciplinary field of research and is defined as a complex science. Each branch of demography has many indicators, and a sample list of these indicators is presented. The main characteristics of the DW, which should be taken into account when developing its architecture, are stated; among them are the main defining characteristics of Big Data: volume, velocity, variety, veracity, variability, visualization and value. For more rational and efficient use of a large amount of information, taking into account its constant growth, and to ensure fast query execution, it is proposed to use a Bus of Interconnected Data Marts (DM) as the DW architecture. One advantage of DMs is that they enable distributed, parallel data processing, which allows much faster generation of results; the approach is based on the MapReduce distributed computing model and the Hadoop project. In addition, to use large amounts of data effectively, it is also proposed to apply OLAP operations such as roll-up and drill-down, as well as fuzzy set theory based on the technique of computing with words. The article also shows the practical application of interconnected DMs: an OLAP cube is built on their basis, and OLAP operations provide the ability to view the cube in different slices and to obtain aggregate data.
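
The roll-up operation mentioned above collapses a detailed level of an OLAP cube into a coarser one by aggregating the measure over the dropped dimension. A minimal sketch on made-up demographic data-mart rows (the region/year/births schema is an assumption for illustration):

```python
from collections import defaultdict

# Toy data-mart rows: (region, year, births) -- illustrative values only.
rows = [
    ("North", 2020, 1200), ("North", 2021, 1100),
    ("South", 2020,  900), ("South", 2021,  950),
]

def roll_up(rows, key_index):
    """Aggregate the measure (last column) over one surviving dimension:
    the OLAP roll-up from the (region, year) level to a coarser level."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key_index]] += row[-1]
    return dict(totals)

print(roll_up(rows, 0))  # slice by region
print(roll_up(rows, 1))  # slice by year
```

Drill-down is simply the reverse navigation, back from the aggregated totals to the detailed rows, which is why a cube stores or derives both levels.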

Energies ◽  
2021 ◽  
Vol 14 (19) ◽  
pp. 6389
Tomasz Turek ◽  
Damian Dziembek ◽  
Marcin Hernes

An important trend in today’s economy is reducing the carbon footprint of organizations, businesses and households. Modern technologies, including ICT solutions, contribute to the reduction of CO2 production. The article focuses on the potential of using the cloud computing model in managing a modern, intelligent city. Modern city offices have an extensive IT infrastructure, and as new online services emerge, the server infrastructure grows with them. Server rooms, together with their supporting devices, are characterized by a high demand for electricity, and significant amounts of CO2 are produced along with it. An alternative is the use of cloud computing solutions, which contribute to a significant reduction of the carbon footprint. The paper analyses potential solutions that can be used in city offices, highlighting the benefits and the positive impact on the environment. The empirical research was conducted based on questionnaires received from city offices. The results indicate that city halls contribute significantly to the production of CO2. Moving IT services and solutions, in whole or in part, to cloud computing should be considered one of the important elements of managing a smart, green city.
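
A back-of-the-envelope estimate shows why moving workloads to more efficient data centres cuts emissions: facility energy is IT load times PUE (power usage effectiveness), times a grid emission factor. All numbers below, the load, the PUE values and the 0.7 kg CO2/kWh grid factor, are illustrative assumptions, not figures from the study:

```python
def annual_co2_kg(server_kw, pue, hours=8760, grid_kg_per_kwh=0.7):
    """Rough annual CO2 estimate for an IT installation: IT load (kW)
    times PUE times hours gives facility kWh, times an assumed grid
    emission factor (kg CO2 per kWh -- strongly region-dependent)."""
    return server_kw * pue * hours * grid_kg_per_kwh

on_prem = annual_co2_kg(5.0, pue=2.0)   # assumed small office server room
cloud   = annual_co2_kg(5.0, pue=1.2)   # assumed efficient cloud facility
print(round(on_prem), round(cloud))
```

With these assumptions the same 5 kW of IT load emits roughly 40% less CO2 in the more efficient facility, before counting the cloud's better hardware utilization, which is the mechanism behind the paper's recommendation.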

2021 ◽  
Vol 11 (10) ◽  
pp. 1700-1706
Jing Yang ◽  
Zhixiang Yin ◽  
Zhen Tang ◽  
Xue Pang ◽  
Jianzhong Cui

DNA origami is a highly precise nanometer-scale material based on DNA molecules. In the current study, we present a visual computing model for the minimum spanning tree that combines the advantages of DNA origami, the hybridization chain reaction and nano-gold particles. Nano-gold particles were used to represent vertices, and molecular beacons with fluorescent labels were used as anchor strands, fixed on the origami substrate with staple strands according to the shape of the graph. We then induced the hybridization chain reaction using initiator strands and fuel strands. Finally, the solution was detected using fluorescence. The model provides a visualized calculation of the minimum spanning tree by using the hybridization chain reaction and fluorescence labeling on origami bases, and its effectiveness is demonstrated through case simulation. It also reduces the computational complexity of the problem and improves the readout of solutions.
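
For reference, the problem the DNA model solves is the classic minimum spanning tree, computable electronically with Prim's algorithm. The small weighted graph below is made up for illustration:

```python
import heapq

def prim_mst(n, edges):
    """Prim's algorithm: grow a tree from vertex 0, always adding the
    cheapest edge that reaches a new vertex. edges are (weight, u, v);
    returns the total weight of the minimum spanning tree."""
    adj = {v: [] for v in range(n)}
    for w, u, v in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    visited = {0}
    heap = list(adj[0])
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < n:
        w, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        total += w
        for edge in adj[v]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total

# Four vertices (the nano-gold particles) with illustrative edge weights.
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(prim_mst(4, edges))  # 6
```

The sequential algorithm examines edges one by one, whereas the origami model lets all hybridization reactions proceed in parallel, which is the source of the complexity reduction the abstract claims.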

2021 ◽  
Vol 12 (05) ◽  
pp. 01-09
Jun QIN ◽  
Yanyan SONG

MapReduce is a distributed computing model for processing massive data in cloud computing; it simplifies the writing of distributed parallel programs. Under the fault-tolerance mechanism of the MapReduce programming model, tasks may be allocated to nodes with low reliability, causing tasks to be re-executed and wasting time and resources. This paper proposes a reliability-aware task scheduling strategy with a failure recovery mechanism: it evaluates the trustworthiness of resource nodes in the cloud environment and builds a trustworthiness model. Using the CloudSim simulation platform, the stability of the task scheduling algorithm and the scheduling model is verified.
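
The core of reliability-aware scheduling is preferring high-trust nodes so fewer tasks need re-execution. The greedy sketch below, with made-up node names and trust scores, is a minimal stand-in for the paper's strategy (it assumes tasks never exceed total free slots):

```python
def schedule(tasks, nodes):
    """Assign each task to the free node with the highest trust score.
    nodes maps node name -> (trust, free_slots) and is updated in place;
    a simple greedy stand-in for reliability-aware MapReduce scheduling."""
    assignment = {}
    for task in tasks:
        best = max((name for name, (_, slots) in nodes.items() if slots > 0),
                   key=lambda name: nodes[name][0])
        trust, slots = nodes[best]
        nodes[best] = (trust, slots - 1)
        assignment[task] = best
    return assignment

# Illustrative cluster: trust scores would come from a trustworthiness model.
nodes = {"node-a": (0.95, 1), "node-b": (0.60, 2), "node-c": (0.85, 1)}
print(schedule(["map-1", "map-2", "map-3"], nodes))
```

Low-trust node-b is used only after the more reliable nodes are full; a fuller strategy would also weigh load balance and re-execution cost, which is what the proposed trustworthiness model feeds into.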
