Latency and Reliability Trade-off with Computational Complexity Constraints: OS Decoders and Generalizations

Author(s):  
Hasan Basri Celebi ◽  
Antonios Pitarokoilis ◽  
Mikael Skoglund


Mathematics ◽
2021 ◽  
Vol 9 (9) ◽  
pp. 957
Author(s):  
Branislav Popović ◽  
Lenka Cepova ◽  
Robert Cep ◽  
Marko Janev ◽  
Lidija Krstanović

In this work, we introduce a novel measure of similarity between Gaussian mixture models (GMMs) based on a neighborhood preserving embedding (NPE) of the parameter space, which projects the components of the GMMs, assumed to lie close to a lower-dimensional manifold. In this way, we obtain a transformation from the original high-dimensional parameter space into a much lower-dimensional parameter space. Computing the distance between two GMMs is thereby reduced to calculating, while taking the corresponding weights into account, the distance between two sets of lower-dimensional Euclidean vectors. A much better trade-off between recognition accuracy and computational complexity is achieved in comparison to measures that evaluate distances between Gaussian components in the original parameter space. The proposed measure is therefore considerably more efficient in machine learning tasks that operate on large data sets, where the overall number of Gaussian components is always large. Experiments on both artificial and real-world data show a much better trade-off between recognition accuracy and computational complexity for the proposed measure than for all baseline measures of similarity between GMMs tested in this paper.
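A minimal sketch of the general idea: each component's parameters (mean and flattened covariance) are stacked into one high-dimensional vector, those vectors are projected to a lower-dimensional space, and the distance between two GMMs is computed as a weight-aware distance between the two sets of embedded vectors. The linear PCA-style projection below is only a stand-in for the paper's NPE step, and the weighted nearest-component matching is an illustrative choice, not the paper's exact measure.

```python
import numpy as np

def component_params(means, covs):
    """Stack each Gaussian component's mean and flattened covariance
    into one high-dimensional parameter vector per component."""
    return np.hstack([means, covs.reshape(covs.shape[0], -1)])

def fit_linear_embedding(param_vectors, dim):
    """Illustrative stand-in for the NPE step: a PCA-style linear
    projection of component parameter vectors onto `dim` dimensions."""
    centered = param_vectors - param_vectors.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:dim]                       # projection matrix (dim x D)

def gmm_distance(weights_a, emb_a, weights_b, emb_b):
    """Weighted distance between two sets of embedded component vectors:
    each component is matched to its nearest component in the other GMM,
    and the per-component distances are averaged with the mixture weights."""
    d = 0.0
    for w, x in zip(weights_a, emb_a):
        d += w * min(np.linalg.norm(x - y) for y in emb_b)
    for w, y in zip(weights_b, emb_b):
        d += w * min(np.linalg.norm(y - x) for x in emb_a)
    return 0.5 * d

# toy usage: two 3-component GMMs over 5-dimensional data
rng = np.random.default_rng(0)
means_a, covs_a = rng.normal(size=(3, 5)), np.stack([np.eye(5)] * 3)
means_b, covs_b = rng.normal(size=(3, 5)), np.stack([np.eye(5)] * 3)
params = component_params(np.vstack([means_a, means_b]),
                          np.concatenate([covs_a, covs_b]))
P = fit_linear_embedding(params, dim=4)
emb_a, emb_b = params[:3] @ P.T, params[3:] @ P.T
print(gmm_distance([1 / 3] * 3, emb_a, [1 / 3] * 3, emb_b))
```

Because the embedded vectors are short Euclidean vectors, the set-to-set distance is cheap compared to evaluating component-wise distances in the original parameter space, which is where the accuracy/complexity trade-off claimed in the abstract comes from.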


Author(s):  
YAODONG NI ◽  
QIAONI SHI

In this paper, we study the problem of targeting a set of individuals to trigger a cascade of influence in a social network such that the influence diffuses to all individuals in minimum time, given that the cost of initially influencing each individual is random and that the available budget is limited. We adopt the incremental chance model to characterize the diffusion of influence and propose three stochastic programming models that correspond to three different decision criteria. A modified greedy algorithm is presented to solve the proposed models, which can flexibly trade off solution quality against computational complexity. Experiments on random graphs show that the presented algorithm is effective.
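A minimal sketch of a budgeted greedy heuristic of this kind, under loose assumptions: the diffusion simulator below is a simple independent-cascade stand-in rather than the incremental chance model, the expected time to full coverage is estimated by Monte Carlo, and the `samples` parameter is the knob that trades solution quality against computational cost. All function names and parameters are illustrative.

```python
import random

def simulate_time(graph, seeds, p=0.1, max_steps=50):
    """One independent-cascade run; returns the number of steps until every
    node is influenced (max_steps if full coverage is never reached)."""
    active, frontier = set(seeds), set(seeds)
    for t in range(1, max_steps + 1):
        frontier = {v for u in frontier for v in graph[u]
                    if v not in active and random.random() < p}
        if not frontier:
            return max_steps
        active |= frontier
        if len(active) == len(graph):
            return t
    return max_steps

def expected_time(graph, seeds, samples=100):
    """Monte Carlo estimate of the expected diffusion time."""
    return sum(simulate_time(graph, seeds) for _ in range(samples)) / samples

def greedy_seed_selection(graph, costs, budget, samples=100):
    """Greedily add the affordable node giving the largest reduction in
    estimated diffusion time per unit cost, until no node improves it."""
    seeds, spent = set(), 0.0
    best_time = expected_time(graph, seeds, samples)
    while True:
        best_node, best_gain, best_t = None, 0.0, best_time
        for v in graph:
            if v in seeds or spent + costs[v] > budget:
                continue
            t = expected_time(graph, seeds | {v}, samples)
            gain = (best_time - t) / costs[v]   # time saved per unit cost
            if gain > best_gain:
                best_node, best_gain, best_t = v, gain, t
        if best_node is None:
            return seeds
        seeds.add(best_node)
        spent += costs[best_node]
        best_time = best_t

# toy usage on a small random graph with random costs and a budget of 3
random.seed(1)
nodes = range(20)
graph = {u: [v for v in nodes if v != u and random.random() < 0.2] for u in nodes}
costs = {u: random.uniform(0.5, 1.5) for u in nodes}
print(greedy_seed_selection(graph, costs, budget=3.0))
```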


Energies ◽  
2021 ◽  
Vol 15 (1) ◽  
pp. 66
Author(s):  
Tatiano Busatto ◽  
Sarah K. Rönnberg ◽  
Math H. J. Bollen

Harmonic modeling of low-voltage networks with many devices requires simple but accurate models. This paper investigates the advantages and drawbacks of such models for predicting the current harmonics created by single-phase full-bridge rectifiers. An overview of the methods is given, with the focus limited to harmonic analysis. The error of each method, compared to an accurate numerical simulation model, is quantified in the frequency and time domains for realistic input scenarios, including background voltage distortion and different system impedances. The results of the comparison are used to discuss the applicability of the models depending on the scale of the harmonic study and the required level of detail. It is concluded that all models have their applicability, but also their limitations. From the simplest and fastest model, which does not require a numerical solution, to the most accurate one, which allows discontinuous conduction mode to be included, the trade-off is between accuracy and computational complexity.
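A minimal sketch of the frequency-domain part of such a comparison, assuming two current waveforms sampled over an integer number of fundamental cycles are available: a reference waveform from a detailed numerical model and one from a simplified model (both placeholders here). The per-harmonic error is obtained from the FFT magnitudes; sampling rate, fundamental frequency, and waveforms are all illustrative.

```python
import numpy as np

def harmonic_spectrum(current, f0=50.0, fs=10_000.0, n_harmonics=40):
    """Magnitudes of the first n_harmonics harmonics of a current waveform
    sampled at fs over an integer number of fundamental cycles."""
    n = len(current)
    spectrum = np.abs(np.fft.rfft(current)) * 2.0 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    bins = [int(np.argmin(np.abs(freqs - h * f0))) for h in range(1, n_harmonics + 1)]
    return spectrum[bins]

def per_harmonic_error(i_ref, i_model, **kwargs):
    """Absolute error of each harmonic magnitude of the simplified model
    relative to the reference simulation."""
    return np.abs(harmonic_spectrum(i_model, **kwargs) - harmonic_spectrum(i_ref, **kwargs))

# toy usage: a distorted reference current vs. a cruder approximation
t = np.arange(0, 0.2, 1 / 10_000.0)            # ten 50 Hz cycles
i_ref = np.sin(2 * np.pi * 50 * t) + 0.30 * np.sin(2 * np.pi * 150 * t)
i_model = np.sin(2 * np.pi * 50 * t) + 0.25 * np.sin(2 * np.pi * 150 * t)
print(per_harmonic_error(i_ref, i_model)[:5])   # errors at harmonics 1..5
```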


Author(s):  
K. Ulrich ◽  
W. Seering

Conceptual design is the initial stage of the transformation between functional and structural descriptions of devices. Our primary aim is to develop ideas that will allow computer tools for conceptual design to be built. In this paper, we point out a fundamental trade-off between the expressiveness of design languages and the computational complexity of the resulting design problem. Research in computational conceptual design can therefore be viewed as the problem of defining a design language and then devising ways of controlling the size of the resulting design space. We propose that an effective means of controlling complexity is to use knowledge of existing designs to guide the synthesis of novel designs. We illustrate this concept with a program that designs novel mechanical fasteners from knowledge of existing fasteners. We analyze this experiment and highlight several areas for future work.
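A minimal sketch of the general idea of using existing designs to control the size of the search space: represent designs as feature vectors, retrieve the existing design closest to the target specification, and search only its neighborhood rather than the full design space. The catalogue, feature encoding, and search procedure below are made-up illustrations, not the representation or program described in the paper.

```python
import numpy as np

# Hypothetical catalogue of existing fastener designs, each encoded as a
# feature vector (e.g. head type, thread pitch, length, material code).
EXISTING_DESIGNS = np.array([
    [1.0, 0.8, 12.0, 2.0],
    [0.0, 1.2, 20.0, 1.0],
    [1.0, 1.0, 16.0, 3.0],
])

def retrieve_nearest(spec):
    """Retrieve the existing design closest to the target specification."""
    i = np.argmin(np.linalg.norm(EXISTING_DESIGNS - spec, axis=1))
    return EXISTING_DESIGNS[i]

def synthesize(spec, step=0.5, n_candidates=50, seed=0):
    """Search only a small neighborhood around the retrieved design,
    keeping the candidate closest to the specification."""
    rng = np.random.default_rng(seed)
    base = retrieve_nearest(spec)
    candidates = base + rng.normal(scale=step, size=(n_candidates, base.size))
    return candidates[np.argmin(np.linalg.norm(candidates - spec, axis=1))]

print(synthesize(np.array([1.0, 0.9, 14.0, 2.5])))
```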


2011 ◽  
Vol 186 ◽  
pp. 621-625
Author(s):  
Ming Yang Sun ◽  
Wei Feng Sun ◽  
Xi Dong Liu ◽  
Lei Xue

Recommendation algorithms suffer in quality when the dataset is huge and sparse. Memory-based collaborative filtering addresses sparsity by predicting unrated values, but this increases the computational cost: sparsity and expensive computation are traded off against each other. In this paper, we propose a novel personalized filtering (PF) recommendation algorithm based on collaborative tagging, which weights tags that reveal latent personal interests and constructs a top-N tag set to filter out a downsized, denser dataset. The PF recommendation algorithm can also track changes in personal interests, which previous studies have left unaddressed. Empirical experiments show that the sparsity level of the PF recommendation algorithm is much lower, and that it is more computationally economical than previous algorithms.
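A minimal sketch of the filtering step, under simplifying assumptions: tags are weighted by how often a user applied them (a crude stand-in for the latent-interest weighting), the top-N tags are kept, and only items carrying at least one of those tags are retained for the subsequent collaborative-filtering step. The data structures and weighting scheme are illustrative, not the paper's exact PF algorithm.

```python
from collections import Counter

def top_n_tags(user_tag_events, n=10):
    """Weight each tag by how often the user applied it and keep the top N."""
    counts = Counter(user_tag_events)
    return {tag for tag, _ in counts.most_common(n)}

def filter_items(items_with_tags, tag_set):
    """Keep only items labelled with at least one of the user's top tags,
    producing a smaller, denser dataset for collaborative filtering."""
    return {item for item, tags in items_with_tags.items() if tags & tag_set}

# toy usage with made-up tagging history and item labels
history = ["jazz", "jazz", "vinyl", "jazz", "live", "vinyl"]
items = {"album_a": {"jazz", "live"}, "album_b": {"pop"}, "album_c": {"vinyl"}}
print(filter_items(items, top_n_tags(history, n=2)))   # keeps album_a and album_c
```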

