The Optimization Algorithm for CR System Based on Optimal Wavelet Filter

2019 ◽  
Vol 2019 ◽  
pp. 1-6
Author(s):  
Miao Liu ◽  
Zhenxing Sun ◽  
Yan-chang Liu ◽  
Cun Zhao

5G networks are heterogeneous, large-scale networks. Cognitive Radio (CR) technology can be used to select resources based on communication time, communication resources, and communication requirements, thereby improving the performance of the whole communication system. A CR system based on wavelet packets offers better flexibility and bandwidth efficiency. An optimal wavelet packet filter optimization algorithm is proposed in this paper to guarantee the unlicensed user's (un-LU) data transmission rate and optimize system performance. An intelligent search algorithm is used to obtain the optimal wavelet filter. The simulation results show that the new optimal wavelet filter algorithm achieves better Intercarrier Interference (ICI) and bit error rate (BER) performance than three other subcarrier-masking algorithms, without sacrificing any of the un-LU's subcarriers.
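The abstract does not specify the search procedure or the filter quality criterion, so the following is only a minimal sketch of the idea: score candidate wavelet filters by an ICI proxy and keep the best. It assumes the PyWavelets library, a Daubechies candidate family, and stopband energy of the scaling filter as the (illustrative) cost, none of which are confirmed by the paper.

```python
# Hedged sketch: search a candidate wavelet family for the filter with the
# lowest out-of-band (stopband) energy, used here as a stand-in ICI proxy.
import numpy as np
import pywt  # assumed dependency (PyWavelets)

def stopband_energy(filt, n_fft=1024):
    """Fraction of the filter's energy above the half-band edge."""
    H = np.abs(np.fft.rfft(filt, n_fft)) ** 2
    edge = len(H) // 2                      # half-band edge (pi/2)
    return H[edge:].sum() / H.sum()

best_name, best_cost = None, np.inf
for order in range(2, 21):                  # illustrative Daubechies candidates
    name = f"db{order}"
    lo = np.asarray(pywt.Wavelet(name).dec_lo)
    cost = stopband_energy(lo)
    if cost < best_cost:
        best_name, best_cost = name, cost

print(f"selected {best_name} with stopband energy {best_cost:.4e}")
```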

SPE Journal ◽  
2006 ◽  
Vol 11 (01) ◽  
pp. 5-17 ◽  
Author(s):  
Guohua Gao ◽  
Albert C. Reynolds

Summary For large-scale history matching problems, where it is not feasible to compute individual sensitivity coefficients, the limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) algorithm is an efficient optimization algorithm (Zhang and Reynolds, 2002; Zhang, 2002). However, computational experiments reveal that application of the original implementation of LBFGS may encounter the following problems: (1) convergence to a model which gives an unacceptable match of production data; (2) generation of a bad search direction that either leads to false convergence or to a restart with the steepest descent direction, which radically reduces the convergence rate; (3) overshooting and undershooting, i.e., convergence to a vector of model parameters which contains some abnormally high or low values that are physically unreasonable. Overshooting and undershooting can occur even though all history matching problems are formulated in a Bayesian framework with a prior model providing regularization. We show that the rate of convergence and the robustness of the algorithm can be significantly improved by (1) a more robust line search algorithm motivated by the theoretical result that the Wolfe conditions should be satisfied; (2) an application of a data damping procedure at early iterations; or (3) enforcing constraints on the model parameters. Computational experiments also indicate that a simple rescaling of model parameters prior to application of the optimization algorithm can improve the convergence properties of the algorithm, although the scaling procedure used cannot be theoretically validated. Introduction Minimization of a smooth objective function is customarily done using a gradient-based optimization algorithm such as the Gauss-Newton (GN) method or the Levenberg-Marquardt (LM) algorithm. The standard implementations of these algorithms (Tan and Kalogerakis, 1991; Wu et al., 1999; Li et al., 2003), however, require the computation of all sensitivity coefficients in order to formulate the Hessian matrix. We are interested in history matching problems where the number of data to be matched ranges from a few hundred to several thousand and the number of reservoir variables or model parameters to be estimated or simulated ranges from a few hundred to a hundred thousand or more. For the larger problems in this range, the computer resources required to compute all sensitivity coefficients would prohibit the use of the standard Gauss-Newton and Levenberg-Marquardt algorithms. Even for the smallest problems in this range, computation of all sensitivity coefficients may not be feasible, as the resulting GN and LM algorithms may require the equivalent of several hundred simulation runs. The relative computational efficiency of GN, LM, nonlinear conjugate gradient, and quasi-Newton methods has been discussed in some detail by Zhang and Reynolds (2002) and Zhang (2002).
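As a minimal sketch of the Bayesian formulation being minimized, the objective below combines a prior (regularization) term and a data-mismatch term, and is passed to SciPy's L-BFGS-B, whose line search enforces Wolfe-type conditions. The linear forward model and the identity covariances are placeholder assumptions, not the paper's reservoir simulator.

```python
# Hedged sketch: LBFGS on a toy Bayesian history-matching objective
#   O(m) = 1/2 (m - m_pr)^T C_M^-1 (m - m_pr) + 1/2 (g(m) - d)^T C_D^-1 (g(m) - d)
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_m, n_d = 50, 200
G = rng.normal(size=(n_d, n_m))          # stand-in linear forward model g(m) = G m
m_true = rng.normal(size=n_m)
d_obs = G @ m_true + 0.01 * rng.normal(size=n_d)
m_prior = np.zeros(n_m)
inv_CM = np.eye(n_m)                     # prior precision (illustrative)
inv_CD = np.eye(n_d) / 0.01**2           # data precision (illustrative)

def objective(m):
    r_m = m - m_prior
    r_d = G @ m - d_obs
    val = 0.5 * r_m @ inv_CM @ r_m + 0.5 * r_d @ inv_CD @ r_d
    grad = inv_CM @ r_m + G.T @ (inv_CD @ r_d)   # analytic gradient, no full Hessian
    return val, grad

res = minimize(objective, m_prior, jac=True, method="L-BFGS-B",
               options={"maxcor": 30})   # number of limited-memory pairs
print("converged:", res.success, "final objective:", res.fun)
```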


2018 ◽  
Vol 2018 ◽  
pp. 1-6 ◽  
Author(s):  
Liu Miao ◽  
Zhenxing Sun ◽  
Zhang Jie

The intercarrier interference (ICI) problem of cognitive radio (CR) is severe. In this paper, a machine learning algorithm is used to obtain the optimal interference subcarriers of an unlicensed user (un-LU). Masking the optimal interference subcarriers can suppress the ICI of CR. Moreover, a parallel ICI suppression algorithm is designed to improve the calculation speed and meet the practical requirements of CR. Simulation results show that the data transmission rate threshold of the un-LU can be set, the data transmission quality of the un-LU can be ensured, the ICI of a licensed user (LU) is suppressed, and the bit error rate (BER) performance of the LU is improved by implementing the parallel suppression algorithm. The ICI problem of CR is solved well by the new machine learning algorithm: the computing performance of the algorithm is improved by designing a new parallel structure, and the communication performance of CR is enhanced.
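The following is only a minimal sketch of the subcarrier-masking idea described above: rank subcarriers by an interference metric and mask the worst offenders while keeping enough active subcarriers to satisfy the un-LU's rate threshold. The per-subcarrier interference scores and rate figures are random placeholders, not the paper's machine-learning outputs.

```python
# Hedged sketch: greedy masking of the highest-interference subcarriers,
# constrained by a minimum data rate for the unlicensed user (un-LU).
import numpy as np

rng = np.random.default_rng(1)
n_sub = 64
interference = rng.exponential(size=n_sub)   # placeholder ICI per subcarrier
rate_per_subcarrier = 1.0                    # illustrative units
rate_threshold = 48.0                        # assumed un-LU minimum rate

max_masked = n_sub - int(np.ceil(rate_threshold / rate_per_subcarrier))
worst_first = np.argsort(interference)[::-1]
mask = np.zeros(n_sub, dtype=bool)
mask[worst_first[:max_masked]] = True        # mask the top offenders only

print(f"masked {mask.sum()} subcarriers; residual ICI "
      f"{interference[~mask].sum():.3f} vs total {interference.sum():.3f}")
```

A parallel variant would score disjoint blocks of subcarriers concurrently, which is consistent with, but not identical to, the parallel structure the paper describes.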


2020 ◽  
Author(s):  
Anusha Ampavathi ◽  
Vijaya Saradhi T

Big data and its approaches are generally helpful to the healthcare and biomedical sectors for predicting disease. Even for trivial symptoms, it is difficult to meet a doctor at any time in the hospital. Big data can thus provide essential information about diseases on the basis of a patient's symptoms. For many medical organizations, disease prediction is important for making the best feasible healthcare decisions. Conversely, the conventional medical care model offers structured input, which requires more accurate and consistent prediction. This paper develops multi-disease prediction using an improved deep learning approach. Different datasets pertaining to diabetes, hepatitis, lung cancer, liver tumor, heart disease, Parkinson's disease, and Alzheimer's disease are gathered from the benchmark UCI repository for the experiments. The proposed model involves three phases: (a) data normalization, (b) weighted normalized feature extraction, and (c) prediction. Initially, the dataset is normalized in order to bring the attributes' ranges to a common level. Next, weighted feature extraction is performed, in which each attribute value is multiplied by a weight function to emphasize large-scale deviations. The weight function is optimized using a combination of two meta-heuristic algorithms, termed the Jaya Algorithm-based Multi-Verse Optimization (JA-MVO) algorithm. The optimally extracted features are fed to hybrid deep learning models, the Deep Belief Network (DBN) and Recurrent Neural Network (RNN). As a modification to the hybrid deep learning architecture, the weights of both the DBN and RNN are optimized using the same hybrid optimization algorithm. Comparative evaluation of the proposed prediction approach against existing models certifies its effectiveness across various performance measures.
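A minimal sketch of phases (a) and (b) follows: min-max normalization, then multiplication of each attribute by a per-attribute weight. In the paper those weights come from the JA-MVO hybrid; here they are random placeholders, and the data matrix is synthetic.

```python
# Hedged sketch: (a) data normalization and (b) weighted feature extraction.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 100, size=(500, 10))       # placeholder patient records

# (a) Data normalization: scale every attribute into [0, 1].
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# (b) Weighted feature extraction: one weight per attribute (column).
w = rng.uniform(0.5, 1.5, size=X.shape[1])    # stand-in for JA-MVO output
X_weighted = X_norm * w                       # features then passed to DBN/RNN

print(X_weighted.shape, X_weighted.min(), X_weighted.max())
```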


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Xi Huo ◽  
Jing Chen ◽  
Shigui Ruan

Abstract Background The COVID-19 outbreak in Wuhan started in December 2019 and was brought under control by the end of March 2020, with a total of 50,006 confirmed cases, through the implementation of a series of nonpharmaceutical interventions (NPIs), including an unprecedented lockdown of the city. This study analyzes the complete outbreak data from Wuhan, assesses the impact of these public health interventions, and estimates the asymptomatic, undetected, and total cases for the COVID-19 outbreak in Wuhan. Methods By taking different stages of the outbreak into account, we developed a time-dependent compartmental model to describe the dynamics of disease transmission and case detection and reporting. Model coefficients were parameterized using the reported cases and by following key events and escalated control strategies. The model was then calibrated to the complete outbreak data using the Markov chain Monte Carlo (MCMC) method. Finally, we used the model to estimate asymptomatic and undetected cases and approximate the overall antibody prevalence level. Results We found that the transmission rate between Jan 24 and Feb 1, 2020, was twice as large as that before the lockdown on Jan 23, and 67.6% (95% CI [0.584,0.759]) of detectable infections occurred during this period. Based on the reported estimates that around 20% of infections were asymptomatic and their transmission ability was about 70% of symptomatic ones, we estimated that there were about 14,448 asymptomatic and undetected cases (95% CI [12,364,23,254]), which yields an estimate of a total of 64,454 infected cases (95% CI [62,370,73,260]), and the overall antibody prevalence level in the population of Wuhan was 0.745% (95% CI [0.693%,0.814%]) by March 31, 2020. Conclusions We conclude that the control of the COVID-19 outbreak in Wuhan was achieved via the enforcement of a combination of multiple NPIs: the lockdown on Jan 23, the stay-at-home order on Feb 2, the massive isolation of all symptomatic individuals via newly constructed special shelter hospitals on Feb 6, and the large-scale screening process on Feb 18. Our results indicate that the population in Wuhan is far from establishing herd immunity and provide insights for other affected countries and regions in designing control strategies and planning vaccination programs.
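To make the "time-dependent compartmental model" concrete, here is a minimal sketch of an SEIR system whose transmission rate drops piecewise at intervention dates (e.g., the Jan 23 lockdown). The compartment structure, rates, and breakpoints are illustrative assumptions; the paper's model additionally tracks detection and reporting and is fitted by MCMC.

```python
# Hedged sketch: SEIR dynamics with a piecewise-in-time transmission rate.
import numpy as np
from scipy.integrate import solve_ivp

N = 11_000_000                       # approximate population of Wuhan

def beta(t):                         # transmission rate, changes at interventions
    if t < 23:   return 0.6          # before the Jan 23 lockdown (assumed value)
    if t < 32:   return 0.3          # lockdown to stay-at-home order (assumed)
    return 0.1                       # after escalated interventions (assumed)

def seir(t, y):
    S, E, I, R = y
    sigma, gamma = 1 / 5.2, 1 / 7    # illustrative latent/infectious periods
    new_inf = beta(t) * S * I / N
    return [-new_inf, new_inf - sigma * E, sigma * E - gamma * I, gamma * I]

sol = solve_ivp(seir, (0, 100), [N - 40, 0, 40, 0], dense_output=True)
print("peak infectious compartment:", int(sol.y[2].max()))
```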


2021 ◽  
Vol 11 (3) ◽  
pp. 1286 ◽  
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Ali Dehghani ◽  
Om P. Malik ◽  
Ruben Morales-Menendez ◽  
...  

Population-based optimization algorithms inspired by nature are among the most powerful tools for solving optimization problems. These algorithms find a solution to a problem by randomly searching the search space. Their central design idea is derived from various natural phenomena, the behavior and living conditions of living organisms, laws of physics, etc. A new population-based optimization algorithm called the Binary Spring Search Algorithm (BSSA) is introduced to solve optimization problems. BSSA is based on a simulation of Hooke's law for the classical system of weights and springs. In this proposal, the population comprises weights that are connected by unique springs. The mathematical model of the proposed algorithm is presented for obtaining solutions to optimization problems. The results were thoroughly validated on different unimodal and multimodal functions; additionally, the BSSA was compared with high-performance algorithms: binary grasshopper optimization algorithm, binary dragonfly algorithm, binary bat algorithm, binary gravitational search algorithm, binary particle swarm optimization, and binary genetic algorithm. The results show the superiority of the BSSA. The results of the Friedman test corroborate that the BSSA is more competitive.
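A minimal sketch of the Hooke's-law idea in a binary setting follows: agents are pulled toward better agents by spring forces F = k·x, and velocities are mapped through a sigmoid transfer function to bit-flip probabilities. The stiffness rule, constants, and toy objective are illustrative assumptions, not the published BSSA equations.

```python
# Hedged sketch: spring-force position update plus a sigmoid transfer
# function for a binary search space.
import numpy as np

rng = np.random.default_rng(3)
n_agents, n_bits = 20, 16
pos = rng.integers(0, 2, size=(n_agents, n_bits)).astype(float)
vel = np.zeros_like(pos)

def fitness(x):                       # toy objective: maximize the number of ones
    return x.sum(axis=1)

for it in range(50):
    fit = fitness(pos)
    k = (fit - fit.min()) / (np.ptp(fit) + 1e-12)    # stiffness from fitness
    best = pos[fit.argmax()]
    force = k[:, None] * (best - pos)                # F = k * x toward the best
    vel = rng.random(pos.shape) * vel + force
    prob = 1 / (1 + np.exp(-vel))                    # sigmoid transfer function
    pos = (rng.random(pos.shape) < prob).astype(float)

print("best fitness:", int(fitness(pos).max()), "of", n_bits)
```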


2021 ◽  
Vol 11 (10) ◽  
pp. 4382
Author(s):  
Ali Sadeghi ◽  
Sajjad Amiri Doumari ◽  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Pavel Trojovský ◽  
...  

Optimization is the science of finding the best solution among the available solutions, subject to an optimization problem's limitations. Optimization algorithms have been introduced as efficient tools for solving optimization problems. These algorithms are designed based on various natural phenomena, behaviors, the lifestyles of living beings, physical laws, rules of games, etc. In this paper, a new optimization algorithm called the good and bad groups-based optimizer (GBGBO) is introduced to solve various optimization problems. In GBGBO, population members update under the influence of two groups, named the good group and the bad group. The good group consists of a certain number of population members with better fitness values than the other members, and the bad group consists of a number of population members with worse fitness values than the other members of the population. GBGBO is mathematically modeled, and its performance in solving optimization problems was tested on a set of twenty-three different objective functions. In addition, for further analysis, the results obtained from the proposed algorithm were compared with those of eight optimization algorithms: genetic algorithm (GA), particle swarm optimization (PSO), gravitational search algorithm (GSA), teaching-learning-based optimization (TLBO), gray wolf optimizer (GWO), whale optimization algorithm (WOA), tunicate swarm algorithm (TSA), and marine predators algorithm (MPA). The results show that the proposed GBGBO algorithm has a good ability to solve various optimization problems and is more competitive than other similar algorithms.
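The abstract only names the two groups, so the following is a minimal sketch of one plausible update rule: move each member toward the mean of the good group and away from the mean of the bad group, with greedy selection. Group sizes, coefficients, and the sphere objective are illustrative assumptions rather than the published GBGBO equations.

```python
# Hedged sketch: population update driven by good-group and bad-group means.
import numpy as np

rng = np.random.default_rng(4)
n_pop, n_dim, n_group = 30, 10, 5
X = rng.uniform(-10, 10, size=(n_pop, n_dim))

def f(X):                                  # sphere test function (illustrative)
    return (X ** 2).sum(axis=1)

for it in range(200):
    order = np.argsort(f(X))
    good_mean = X[order[:n_group]].mean(axis=0)    # best-fitness members
    bad_mean = X[order[-n_group:]].mean(axis=0)    # worst-fitness members
    r1, r2 = rng.random(X.shape), rng.random(X.shape)
    X_new = X + r1 * (good_mean - X) - r2 * (bad_mean - X)
    better = f(X_new) < f(X)               # greedy selection: keep improvements
    X[better] = X_new[better]

print("best objective value:", f(X).min())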


2021 ◽  
Vol 11 (10) ◽  
pp. 4438
Author(s):  
Satyendra Singh ◽  
Manoj Fozdar ◽  
Hasmat Malik ◽  
Maria del Valle Fernández Moreno ◽  
Fausto Pedro García Márquez

It is expected that large-scale producers of wind energy will become dominant players in the future electricity market. However, wind power output is irregular in nature and subject to numerous fluctuations. Because of these effects on wind power production, producing a detailed bidding strategy is becoming more complicated in the industry. Therefore, in view of these uncertainties, a competitive bidding approach for a pool-based day-ahead energy marketplace is formulated in this paper for traditional generation with wind power utilities. The profit of the generating utility is optimized by a modified gravitational search algorithm, and the Weibull distribution function is employed to represent the stochastic properties of the wind speed profile. The proposed method is investigated on the IEEE-30 and IEEE-57 test systems. The results were compared with those obtained with other optimization methods to validate the approach.
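As a minimal sketch of the stochastic wind model mentioned above, the snippet below draws wind speeds from a Weibull distribution and maps them through a standard piecewise turbine power curve to estimate expected output. The shape/scale parameters and cut-in/rated/cut-out speeds are illustrative assumptions, not the paper's fitted values.

```python
# Hedged sketch: Weibull wind-speed sampling plus a piecewise power curve.
import numpy as np

rng = np.random.default_rng(5)
shape, scale = 2.0, 8.0                    # Weibull k and c (assumed, m/s)
v = scale * rng.weibull(shape, size=10_000)

v_in, v_rated, v_out, p_rated = 3.0, 12.0, 25.0, 2.0   # MW, illustrative

def power(v):
    # Linear ramp between cut-in and rated speed, zero outside [v_in, v_out).
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v - v_in) / (v_rated - v_in), 0.0)
    return np.where((v >= v_rated) & (v < v_out), p_rated, p)

print(f"expected wind power: {power(v).mean():.3f} MW")
```

An expected-power estimate of this kind would feed the profit objective that the modified gravitational search algorithm then maximizes.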


2021 ◽  
Vol 11 (7) ◽  
pp. 3083
Author(s):  
Youheng Tan ◽  
Xiaojun Jing

Spectrum sensing (SS) has attracted much attention due to its important role in improving spectrum efficiency. However, the limited sensing time leads to an insufficient number of sampling points because of the tradeoff between sensing time and communication time. Although the sensing performance of cooperative spectrum sensing (CSS) is greatly improved by mutual cooperation between cognitive nodes, this comes at the expense of computational complexity. In this paper, efficient approximations of the N-out-of-K rule-based CSS scheme for heterogeneous cognitive radio networks are provided to obtain a closed-form expression for the sensing threshold at the fusion center (FC), where the false alarm probability and its corresponding detection probability are approximated by the Poisson distribution. The computational complexity required to obtain the optimal sensing threshold at the FC is greatly decreased, and theoretical derivations show that the approximation error is negligible. Simulations validate the effectiveness of the proposed scheme.
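To illustrate the approximation, note that with heterogeneous nodes the count of local false alarms follows a Poisson-binomial distribution; approximating it by a Poisson law with mean equal to the sum of the local probabilities yields a closed-form global false alarm for an n-out-of-K fusion rule. The sketch below compares that approximation against a Monte Carlo reference; the local probabilities are random placeholders, and this is a simplified stand-in for the paper's derivation.

```python
# Hedged sketch: Poisson approximation of the n-out-of-K global false alarm.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(6)
K, n = 12, 4
p_local = rng.uniform(0.02, 0.10, size=K)    # heterogeneous local Pfa (placeholder)

# Poisson approximation: P(at least n local false alarms) with lambda = sum p_i.
lam = p_local.sum()
q_f_poisson = 1.0 - poisson.cdf(n - 1, lam)

# Monte Carlo reference for the exact Poisson-binomial tail probability.
trials = rng.random((200_000, K)) < p_local
q_f_mc = (trials.sum(axis=1) >= n).mean()

print(f"Poisson approx: {q_f_poisson:.5f}  Monte Carlo: {q_f_mc:.5f}")
```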

