A Convex Optimization Approach for the Design of Supergain Electrically Small Antenna and Rectenna Arrays Comprising Parasitic Reactively Loaded Elements

Author(s):  
Apostolos Georgiadis ◽  
Nuno Borges Carvalho

A convex optimization formulation is provided for antenna arrays comprising reactively loaded parasitic elements. The objective function maximizes the array gain, while constraints on the admittance are imposed to properly account for the reactive loads. Two- and three-element electrically small dipole arrays, comprising one fed element and one or two parasitic elements respectively, are considered, and the conditions for obtaining supergain are investigated. The admittance constraints are formulated as linear constraints in specific cases and, more generally, as quadratic constraints, which lead to the solution of an equivalent convex relaxation. A design example of an electrically small superdirective rectenna is provided, in which an upper bound on the rectifier efficiency is simulated.

2021 ◽  
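As a rough illustration of the unconstrained core of such a gain-maximization problem (the function name and numpy-only setup are my own; the paper's actual admittance and reactive-load constraints are not modeled here), the classical supergain solution maximizes the Rayleigh quotient G(i) = |a^H i|^2 / (i^H R i), which has the closed form i* = R^{-1} a:

```python
import numpy as np

def max_array_gain(R, a):
    """Unconstrained gain maximization for an N-element array.
    R : Hermitian positive-definite matrix (real part of the mutual
        impedance/admittance matrix); a : steering vector.
    Maximizing the Rayleigh quotient |a^H i|^2 / (i^H R i) gives the
    closed-form optimum i* = R^{-1} a; the reactive-load constraints
    discussed in the paper would enter as additional linear or
    quadratic constraints on the element currents i."""
    i_opt = np.linalg.solve(R, a)              # i* = R^{-1} a
    gain = np.real(np.conj(a) @ i_opt)         # optimal value a^H R^{-1} a
    return i_opt, gain

# Toy 2-element example: uncoupled elements (R = I), broadside steering.
R = np.eye(2)
a = np.ones(2)
i_opt, gain = max_array_gain(R, a)
```

Adding the admittance constraints turns this closed-form problem into the constrained (and, for quadratic constraints, relaxed) convex program described in the abstract.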


2018 ◽  
Vol 12 (13) ◽  
pp. 2001-2006 ◽  
Author(s):  
Mohammad Ranjbar Nikkhah ◽  
Mohammad Ali Panahi ◽  
Hung Luyen ◽  
Hamid Bahrami ◽  
Nader Behdad

Symmetry ◽  
2021 ◽  
Vol 13 (10) ◽  
pp. 1824
Author(s):  
Claudiu Popescu ◽  
Lacrimioara Grama ◽  
Corneliu Rusu

The paper describes a convex optimization formulation of the extractive text summarization problem and a simple, scalable algorithm to solve it. The optimization program is constructed as a convex relaxation of an intuitive but computationally hard integer programming problem. The objective function is highly symmetric, being invariant under unitary transformations of the text representations. Another key idea is to replace the constraint on the number of sentences in the summary with a convex surrogate. To solve the program, we designed a specific projected gradient descent algorithm and analyzed its performance in terms of execution time and quality of the approximation. Using the DUC 2005 and Cornell Newsroom Summarization datasets, we showed empirically that the algorithm provides competitive results for single-document summarization and multi-document query-based summarization. On the Cornell Newsroom Summarization Dataset it ranked second among the unsupervised methods tested, and on the more challenging multi-document query-based task (DUC 2005) it surpassed the other reported methods on the ROUGE-SU4 metric while remaining within 0.01 of the top-performing algorithms on the ROUGE-1 and ROUGE-2 metrics.
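A minimal sketch of this style of approach (not the paper's exact objective or surrogate; the relevance/redundancy model, step size, and function names below are my own assumptions): relax the binary sentence-selection variables to the box [0, 1]^n, replace the cardinality constraint with a sum-to-k budget, and run projected gradient descent, projecting onto the capped simplex by bisection on the dual variable:

```python
import numpy as np

def project_capped_simplex(y, k, tol=1e-9):
    """Euclidean projection of y onto {x : 0 <= x_i <= 1, sum(x) = k},
    via bisection on the Lagrange multiplier of the sum constraint."""
    lam_lo, lam_hi = y.min() - 1.0, y.max()
    for _ in range(100):
        lam = 0.5 * (lam_lo + lam_hi)
        x = np.clip(y - lam, 0.0, 1.0)
        s = x.sum()
        if abs(s - k) < tol:
            break
        if s > k:
            lam_lo = lam      # sum too large -> increase shift
        else:
            lam_hi = lam
    return np.clip(y - lam, 0.0, 1.0)

def summarize_pgd(S, relevance, k, lam_red=0.5, step=0.1, iters=200):
    """Projected gradient ascent on a relaxed extractive-summarization
    objective: relevance^T x - (lam_red/2) x^T S x (coverage minus
    pairwise redundancy, concave when S is PSD), with x in [0,1]^n and
    sum(x) = k as the convex budget surrogate. Rounds by taking the
    k largest relaxed scores."""
    n = len(relevance)
    x = np.full(n, k / n)
    for _ in range(iters):
        grad = relevance - lam_red * (S @ x)
        x = project_capped_simplex(x + step * grad, k)
    return np.argsort(-x)[:k]
```

A tiny usage example: with four sentences, an identity similarity matrix, and relevance scores [0.9, 0.1, 0.8, 0.2], `summarize_pgd(S, relevance, k=2)` concentrates the relaxed mass on the two most relevant sentences.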


2015 ◽  
Vol 57 (10) ◽  
pp. 2269-2274 ◽  
Author(s):  
Abdullah Haskou ◽  
Sylvain Collardey ◽  
Ala Sharaiha
