complexity analysis
Recently Published Documents

TOTAL DOCUMENTS: 1716 (five years: 407)
H-INDEX: 44 (five years: 10)

Author(s):  
Nikita Doikov ◽  
Yurii Nesterov

In this paper, we develop new affine-invariant algorithms for solving composite convex minimization problems with bounded domain. We present a general framework of Contracting-Point methods, which solve at each iteration an auxiliary subproblem restricting the smooth part of the objective function onto a contraction of the initial domain. This framework provides a systematic way of developing optimization methods of different order, endowed with global complexity bounds. We show that, using an appropriate affine-invariant smoothness condition, one iteration of the Contracting-Point method can be implemented by one step of the pure tensor method of degree p ≥ 1. The resulting global rate of convergence in functional residual is then O(1/k^p), where k is the iteration counter. Importantly, all constants in our bounds are affine-invariant. For p = 1, our scheme recovers the well-known Frank–Wolfe algorithm, providing it with a new interpretation from the general perspective of tensor methods. Finally, within our framework, we present an efficient implementation and total complexity analysis of the inexact second-order scheme (p = 2), called the Contracting Newton method. It can be seen as a proper implementation of the trust-region idea. Preliminary numerical results confirm its good practical performance both in the number of iterations and in computational time.
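
Since the p = 1 case recovers the Frank–Wolfe method, a minimal sketch of one Frank–Wolfe iteration over a polytope may help illustrate the contraction idea. The function names, the vertex-based linear minimization oracle, and the standard 2/(k + 2) step size below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def frank_wolfe_step(grad_f, x, vertices, k):
    """One Frank-Wolfe (conditional gradient) iteration over conv(vertices).

    grad_f   : callable returning the gradient of the smooth objective at x
    x        : current iterate inside the feasible polytope
    vertices : array of shape (m, n) whose convex hull is the domain
    k        : iteration counter, used in the standard 2/(k + 2) step size
    """
    g = grad_f(x)
    # Linear minimization oracle: the vertex minimizing <g, v> over the domain
    s = vertices[np.argmin(vertices @ g)]
    # Contract the iterate toward that vertex
    gamma = 2.0 / (k + 2.0)
    return (1.0 - gamma) * x + gamma * s

# Example: minimize ||x - c||^2 over the probability simplex in R^3
c = np.array([0.2, 0.7, 0.1])
vertices = np.eye(3)
x = np.full(3, 1.0 / 3.0)
for k in range(100):
    x = frank_wolfe_step(lambda y: 2.0 * (y - c), x, vertices, k)
```

The higher-order Contracting-Point schemes described in the abstract replace this linear subproblem with a tensor-model subproblem over a contraction of the domain, but the overall iterate update follows the same pattern.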


Sensors ◽  
2022 ◽  
Vol 22 (1) ◽  
pp. 378
Author(s):  
Gustaw Mazurek

Digital Audio Broadcast (DAB) transmitters can be successfully used as illumination sources in Passive Coherent Location (PCL). However, extending the integration time in such a configuration leads to periodic artifacts in the bistatic range/Doppler plots, resulting from the time structure of the DAB signal. In this paper, we propose several signal preprocessing methods (based on symbol removal, substitution by noise, and duplication) that operate at the DAB transmission frame level and improve the correlation properties of the received signal. We also demonstrate that two of these methods allow us to avoid the mentioned artifacts and thus improve the quality of the range/Doppler plots with detection results. We evaluate the performance of the proposed methods using real DAB signals acquired on an experimental PCL platform. We also provide an analysis of the signal-to-noise ratio (SNR) during the detection of a moving target, which shows that the proposed solution based on symbol duplication can offer around 3 dB of gain in SNR. Finally, we carry out a computational complexity analysis showing that the proposed method can be implemented at minimal cost after some optimizations.
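
For context, the range/Doppler plots mentioned above come from the cross-ambiguity function between the reference (direct-path) and surveillance channels. The brute-force sketch below, with assumed function and parameter names, shows where a preprocessed reference signal would enter the processing chain; it illustrates the standard PCL correlation step, not the paper's optimized implementation.

```python
import numpy as np

def range_doppler_map(ref, surv, n_delays):
    """Brute-force cross-ambiguity (bistatic range/Doppler) map.

    ref      : reference channel samples (direct-path DAB signal, possibly
               preprocessed by symbol removal / substitution / duplication)
    surv     : surveillance channel samples at the same sampling rate
    n_delays : number of bistatic range (delay) bins to evaluate
    """
    n = len(ref) - n_delays
    rows = []
    for d in range(n_delays):
        # Delay-shifted product; an FFT over time then gives the Doppler axis
        prod = surv[d:d + n] * np.conj(ref[:n])
        rows.append(np.fft.fftshift(np.fft.fft(prod)))
    return np.abs(np.vstack(rows))  # shape: (n_delays, n)
```

This direct form costs O(n_delays · n log n); practical implementations reduce it with batched or decimated Doppler processing, which is where the complexity optimizations mentioned in the abstract would apply.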


2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Dan Zhang

With the rapid development of mobile internet technology, dynamic data contain large amounts of unstructured data, such as text and multimedia, so analyzing and processing these data is essential for obtaining potentially valuable information. This article first starts from the theoretical study of text complexity analysis and examines the sources of text complexity and its five characteristics, namely dynamism, complexity, concealment, sentiment, and ambiguity, combined with how user needs are expressed in a network environment. Secondly, following the specific process of text mining, namely data collection, data processing, and data visualization, user demand analysis is subdivided into three stages of text complexity acquisition, recognition, and expression, yielding a text complexity analysis based on text mining technology. After that, based on computational linguistics and mathematical-statistical analysis, combined with machine learning and information retrieval technology, text in any format is converted into a content format that can be used for machine learning, and patterns or knowledge are derived from this representation. Then, through a comparison of text mining technologies, combined with the hierarchical structure model of text complexity analysis, a quantitative relationship complexity analysis framework based on text mining technology is proposed, embodied in the use of web crawler technology. Experimental results show that the collected quantitative relationship information can be identified and expressed so as to convert it into product features. Market data and text data can be integrated to help improve model performance, and the use of text data further improves prediction accuracy.
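
The abstract does not specify how text is converted into "a content format that can be used for machine learning"; one common, minimal way to do this is a TF-IDF document-term matrix, sketched below with scikit-learn. The example documents and parameter choices are assumptions for illustration only, not the pipeline used in the article.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy unstructured text collected, e.g., by a web crawler
docs = [
    "User review: screen quality is excellent but battery life is short",
    "User review: battery lasts long, camera is average",
]

# Turn unstructured text into a numeric feature matrix usable by ML models
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english", ngram_range=(1, 2))
X = vectorizer.fit_transform(docs)          # sparse document-term matrix
terms = vectorizer.get_feature_names_out()  # candidate textual features
```

From a representation like X, patterns can be mined or features aligned with product attributes, as the abstract describes at a high level.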


Entropy ◽  
2021 ◽  
Vol 24 (1) ◽  
pp. 26
Author(s):  
Hongjian Xiao ◽  
Danilo P. Mandic

Entropy-based methods have received considerable attention in the quantification of the structural complexity of real-world systems. Among numerous empirical entropy algorithms, conditional entropy-based methods such as sample entropy, which rely on amplitude distance calculations, are quite intuitive to interpret but require excessive data lengths for meaningful evaluation at large scales. To address this issue, we propose the variational embedding multiscale sample entropy (veMSE) method and conclusively demonstrate its ability to operate robustly, even with data several times shorter than the existing conditional entropy-based methods require. The analysis reveals that veMSE also exhibits other desirable properties, such as robustness to variation in the embedding dimension and resilience to noise. For rigor, unlike the existing multivariate methods, the proposed veMSE assigns a different embedding dimension to every data channel, which makes its operation independent of channel permutation. The veMSE is tested on both simulated and real-world signals, and its performance is evaluated against the existing multivariate multiscale sample entropy methods. The proposed veMSE is also shown to exhibit computational advantages over the existing amplitude distance-based entropy methods.
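
As a reference point for the methods veMSE is compared against, the sketch below implements the conventional (single-channel) multiscale sample entropy: coarse-grain the series at each scale and compute sample entropy with a Chebyshev-distance template match. This is the baseline measure, not veMSE, and the tolerance and embedding defaults are common illustrative choices.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """Single-scale sample entropy with tolerance r * std(x).

    Simplified sketch: the template counts for m and m + 1 differ by one,
    which is negligible for long series.
    """
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(dim):
        templates = np.array([x[i:i + dim] for i in range(n - dim)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates, excluding self-matches
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def coarse_grain(x, scale):
    """Non-overlapping averaging used by multiscale entropy methods."""
    n = len(x) // scale
    return np.asarray(x, dtype=float)[:n * scale].reshape(n, scale).mean(axis=1)

def multiscale_sample_entropy(x, scales=range(1, 11), m=2, r=0.15):
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

The data-length problem the abstract addresses is visible here: coarse-graining at scale s leaves only n/s samples for the template matching, which is what veMSE is designed to cope with.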


2021 ◽  
pp. 155005942110640
Author(s):  
Fatih Hilmi Çetin ◽  
Miraç Barış Usta ◽  
Serap Aydın ◽  
Ahmet Sami Güven

Objective: Complexity analysis is a method employed to understand the activity of the brain. The effect of methylphenidate (MPH) treatment on neuro-cortical complexity changes is still unknown. This study aimed to reveal how MPH treatment affects the brain complexity of children with attention deficit hyperactivity disorder (ADHD) using entropy-based quantitative EEG analysis. Three embedding entropy approaches were applied to short segments of both pre- and post-medication EEG series. EEG signals were recorded from 25 boys with combined-type ADHD prior to the administration of MPH and at the end of the first month of treatment. Results: In comparison to Approximate Entropy (ApEn) and Sample Entropy (SampEn), Permutation Entropy (PermEn) provided the most sensitive estimations in investigating the impact of MPH treatment. In detail, a considerable decrease in EEG complexity levels was observed at six cortical regions (F3, F4, P4, T3, T6, O2) at a statistically significant level (p < .05). In addition, PermEn provided the most meaningful associations at the central lobes: (1) the magnitude of EEG complexity levels was moderately related to the severity of ADHD symptoms detected at the pre-treatment stage; (2) the percentage change in the severity of the oppositional symptom cluster was moderately associated with the change in entropy. Conclusion: A significant decrease in entropy levels in the frontal region was detected in boys with combined-type ADHD undergoing MPH treatment in the resting state. The changes in entropy correlated with pre-treatment general ADHD symptom severity and with the severity of the conduct disorder symptom cluster.
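
Since PermEn proved the most sensitive of the three measures, a minimal sketch of how permutation entropy is typically computed from an EEG segment is given below; the order, delay, and normalization choices are common defaults, not necessarily the settings used in this study.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation (ordinal pattern) entropy of a 1-D signal segment."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n_patterns):
        window = x[i:i + order * delay:delay]           # embedded vector
        key = tuple(np.argsort(window, kind="stable"))  # its ordinal pattern
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n_patterns
    h = -np.sum(p * np.log2(p))
    # Normalize by the maximum entropy log2(order!) so values lie in [0, 1]
    return h / np.log2(factorial(order)) if normalize else h
```

Lower values indicate more regular (less complex) ordinal structure, which is the direction of change reported after MPH treatment.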


2021 ◽  
Vol 2021 ◽  
pp. 1-18
Author(s):  
Yuxing Li ◽  
Shangbin Jiao ◽  
Bo Geng ◽  
Xinru Jiang

Dispersion entropy (DE), as a newly proposed entropy, has achieved remarkable results in its applications. In this paper, building on DE and combining it with coarse-grained processing, we introduce the fluctuation and distance information of the signal and propose the refined composite multiscale fluctuation-based reverse dispersion entropy (RCMFRDE). As an emerging complexity analysis method, RCMFRDE is used here for the first time for feature extraction from ship-radiated noise signals, to mitigate the losses caused by the misclassification of ships at sea. Combining it with the K-nearest neighbor (KNN) classifier yields a classification and recognition method, RCMFRDE-KNN. The experimental results indicate that RCMFRDE achieves the highest recognition rate in the single-feature case and up to 100% in the double-feature case, far better than multiscale DE (MDE), multiscale fluctuation-based DE (MFDE), multiscale permutation entropy (MPE), and multiscale reverse dispersion entropy (MRDE); all the experimental results show that the RCMFRDE proposed in this paper improves on the separability of the entropies commonly used in the hydroacoustic domain.
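
For readers unfamiliar with the base measure, the sketch below implements plain single-scale dispersion entropy, the starting point that RCMFRDE refines with fluctuation, reverse, refined-composite, and multiscale extensions. The class count, embedding dimension, and normal-CDF mapping are the usual defaults, assumed here for illustration only.

```python
import numpy as np
from scipy.stats import norm

def dispersion_entropy(x, m=3, c=6, delay=1):
    """Baseline (single-scale) dispersion entropy of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    # Map samples to c classes through the normal CDF fitted to the signal
    y = norm.cdf(x, loc=np.mean(x), scale=np.std(x))
    z = np.digitize(y, np.linspace(0, 1, c + 1)[1:-1]) + 1   # classes 1..c
    # Count the dispersion patterns (embedded class vectors of length m)
    n_patterns = len(z) - (m - 1) * delay
    counts = {}
    for i in range(n_patterns):
        key = tuple(z[i:i + m * delay:delay])
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n_patterns
    return -np.sum(p * np.log(p)) / np.log(c ** m)   # normalized to [0, 1]
```

The multiscale and refined-composite variants apply this computation to coarse-grained versions of the signal and average the pattern probabilities across shifted coarse-graining starts, which is what the acronym in the abstract refers to.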


Author(s):  
Anne-Carole Honfoga ◽  
Michel Dossou ◽  
Veronique Moeyaert

Author(s):  
Riley Badenbroek ◽  
Etienne de Klerk

We develop a short-step interior point method to optimize a linear function over a convex body, assuming that one only knows a membership oracle for this body. The approach is based on a sketch of a universal interior point method using the so-called entropic barrier. It is well known that the gradient and Hessian of the entropic barrier can be approximated by sampling from Boltzmann-Gibbs distributions, and the entropic barrier has been shown to be self-concordant. The analysis of our algorithm uses properties of the entropic barrier, mixing times for hit-and-run random walks, approximation quality guarantees for the mean and covariance of a log-concave distribution, and results on inexact Newton-type methods.
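
To make the sampling step concrete: the quantity from which the entropic barrier's gradient is approximated is the mean of the Boltzmann-Gibbs density proportional to exp(⟨θ, x⟩) over the body, and hit-and-run through the membership oracle is the standard way to draw from it. The sketch below shows one hit-and-run step and a Monte-Carlo mean estimate; the bracket length, bisection count, burn-in, and oracle interface are assumptions for illustration, not the paper's tuned procedure.

```python
import numpy as np

def hit_and_run_step(x, theta, membership, rng, bracket=1e3, bisections=50):
    """One hit-and-run step for the density proportional to exp(<theta, y>)
    restricted to a convex body known only through a membership oracle.

    membership(y) -> bool must return True iff y lies in the body.
    """
    d = rng.standard_normal(len(x))
    d /= np.linalg.norm(d)

    # Locate the ends of the chord {x + t d} inside the body by bisection
    def chord_end(sign):
        lo, hi = 0.0, bracket
        if membership(x + sign * hi * d):
            return sign * hi                 # body extends beyond the bracket
        for _ in range(bisections):
            mid = 0.5 * (lo + hi)
            if membership(x + sign * mid * d):
                lo = mid
            else:
                hi = mid
        return sign * lo

    t_minus, t_plus = chord_end(-1.0), chord_end(+1.0)

    # Sample t on [t_minus, t_plus] from the 1-D density proportional to exp(a t)
    a = float(theta @ d)
    u = rng.uniform()
    if abs(a) < 1e-12:
        t = t_minus + u * (t_plus - t_minus)
    else:
        shift = max(a * t_minus, a * t_plus)  # for numerical stability
        t = (shift + np.log((1 - u) * np.exp(a * t_minus - shift)
                            + u * np.exp(a * t_plus - shift))) / a
    return x + t * d

def estimate_boltzmann_mean(theta, membership, x0, rng, burn_in=500, samples=2000):
    """Monte-Carlo estimate of the mean of the Boltzmann-Gibbs distribution
    restricted to the body; the entropic barrier's gradient is approximated
    from this mean (and its Hessian from the corresponding covariance)."""
    x = np.array(x0, dtype=float)
    for _ in range(burn_in):
        x = hit_and_run_step(x, theta, membership, rng)
    total = np.zeros_like(x)
    for _ in range(samples):
        x = hit_and_run_step(x, theta, membership, rng)
        total += x
    return total / samples
```

The mixing-time and inexact-Newton results mentioned in the abstract govern how many such steps and samples are needed for the short-step method's guarantees to hold.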

