Weighted Inequality

2014 ◽  
Vol 10 (1) ◽  
pp. 121-124
Author(s):  
Santosh Ghimire

In this paper, we define Ap weights, briefly discuss the theory of weighted inequalities, and describe its applications and importance in various fields. We then prove that for an Ap weight function w and for some constant k > 0, the function min(w, k) is an Ap weight function. Finally, we establish the weighted inequality for min(w, k). DOI: http://dx.doi.org/10.3126/jie.v10i1.10887 Journal of the Institute of Engineering, Vol. 10, No. 1, 2014, pp. 121–124
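
For reference, the standard Muckenhoupt Ap condition that this abstract relies on, together with the paper's main claim, can be displayed as follows; the abstract itself does not spell out the condition, so this is the usual textbook formulation, stated here for 1 < p < ∞:

\[
[w]_{A_p} \;=\; \sup_{Q} \left( \frac{1}{|Q|} \int_{Q} w(x)\,dx \right)
\left( \frac{1}{|Q|} \int_{Q} w(x)^{-1/(p-1)}\,dx \right)^{p-1} \;<\; \infty,
\]
\[
\text{and the paper's result: for any constant } k > 0, \qquad
w \in A_p \;\Longrightarrow\; \min(w, k) \in A_p .
\]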

Author(s):  
Vinod Parajuli ◽  
Santosh Ghimire

In this paper, we begin with a brief discussion of the theory of weights and Ap weight functions. We then state and prove some of the properties of Ap weight functions using elementary analysis tools. Journal of Advanced College of Engineering and Management, Vol. 3, 2017, Page: 111-114


2011 ◽  
Vol 141 (5) ◽  
pp. 1071-1081 ◽  
Author(s):  
Dah-Chin Luor

A characterization is obtained on the weight function u so that $T\colon L_{p}^{+}\to L_{q,u}^{+}$ is bounded for 1 < p < ∞ and 0 < q < ∞, where T is an integral operator or a related maximal operator, and for 0 < p, q < ∞, where T is a geometric mean operator or a related geometric maximal operator. The equivalence of such weighted inequalities for these operators is established.
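
For reference, the boundedness of T from $L_{p}^{+}$ to $L_{q,u}^{+}$ asserted above is exactly a weighted norm inequality; written out (with the underlying measure space and the class of kernels as in the paper, which the abstract does not specify), it reads:

\[
\left( \int (Tf)^{q}\, u \, dx \right)^{1/q} \;\le\; C \left( \int f^{p} \, dx \right)^{1/p}
\qquad \text{for all } f \ge 0 .
\]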


2016 ◽  
Vol 11 (1) ◽  
pp. 116-119
Author(s):  
Vinod Parajuli ◽  
Santosh Ghimire

In this paper, we first define Ap weight functions and then show that a finite product of Ap weight functions, each raised to some power, with the powers summing to one, is also an Ap weight function. Journal of the Institute of Engineering, 2015, 11(1): 116-119
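
In symbols, and assuming the usual hypothesis that the exponents are nonnegative (the abstract does not list the exact hypotheses), the stated result reads:

\[
w_1, \dots, w_n \in A_p, \qquad \theta_1, \dots, \theta_n \ge 0, \qquad \sum_{i=1}^{n} \theta_i = 1
\;\Longrightarrow\;
\prod_{i=1}^{n} w_i^{\theta_i} \in A_p .
\]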


2017 ◽  
Vol 12 (1) ◽  
pp. 210-213
Author(s):  
Santosh Ghimire

In this paper, we briefly discuss the theory of weights and then define A1 and Ap weight functions. Finally, we prove some of the properties of Ap weight functions. Journal of the Institute of Engineering, 2016, 12(1): 210-213
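
For reference, the A1 condition mentioned in this abstract is the standard one (the Ap condition for 1 < p < ∞ is displayed after the first abstract above): w belongs to A1 when its averages are controlled by its essential infimum on every cube, which also gives the inclusion A1 ⊂ Ap for every p > 1.

\[
w \in A_1 \iff \sup_{Q} \left( \frac{1}{|Q|} \int_{Q} w(x)\,dx \right)
\operatorname*{ess\,sup}_{x \in Q} \frac{1}{w(x)} \;<\; \infty,
\qquad \text{equivalently} \quad
\frac{1}{|Q|} \int_{Q} w \;\le\; C \operatorname*{ess\,inf}_{Q} w .
\]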


2016 ◽  
Vol 34 (1-2) ◽  
pp. 19-23
Author(s):  
Durga Jang K.C. ◽  
Santosh Ghimire

In this paper, we relate Bounded Mean Oscillation (BMO) functions and A2 weight functions. We show that the logarithm of any A2 weight function is a BMO function and that every BMO function is equal to a constant multiple of the logarithm of an A2 weight function. Moreover, we show that the logarithm of any Ap weight function for 1 < p < ∞ is a BMO function.
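
For context, the standard definitions behind this correspondence (not displayed in the abstract) are the BMO seminorm and the A2 condition; the two directions of the result then read as follows, where the constant multiple arises because e^{δf} is an A2 weight only for sufficiently small δ > 0:

\[
\|f\|_{BMO} = \sup_{Q} \frac{1}{|Q|} \int_{Q} |f(x) - f_Q| \, dx, \qquad
f_Q = \frac{1}{|Q|} \int_{Q} f,
\]
\[
w \in A_2 \;\Longrightarrow\; \log w \in BMO, \qquad
f \in BMO \;\Longrightarrow\; f = \tfrac{1}{\delta}\log\bigl(e^{\delta f}\bigr)
\ \text{with}\ e^{\delta f} \in A_2 \ \text{for small } \delta > 0 .
\]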


2002 ◽  
Author(s):  
Shyhnan Liou ◽  
Chung-Ping Cheng

2020 ◽  
Author(s):  
Anusha Ampavathi ◽  
Vijaya Saradhi T

Big data and its approaches are broadly useful in the healthcare and biomedical sectors for predicting disease. For patients with mild symptoms, it is difficult to consult a doctor in the hospital at any time, so big data can provide essential information about diseases on the basis of a patient's symptoms. For many medical organizations, disease prediction is important for making the best feasible health-care decisions. However, the conventional medical-care model takes structured input and requires more accurate and consistent prediction. This paper develops multi-disease prediction using an improvised deep learning concept. Different datasets pertaining to "Diabetes, Hepatitis, lung cancer, liver tumor, heart disease, Parkinson's disease, and Alzheimer's disease" are gathered from the benchmark UCI repository for conducting the experiment. The proposed model involves three phases: (a) data normalization, (b) weighted normalized feature extraction, and (c) prediction. Initially, the dataset is normalized so that the attributes lie in a common range. Then weighted feature extraction is performed, in which a weight function is multiplied with each attribute value to enlarge the deviation between samples. The weight function is optimized using a combination of two meta-heuristic algorithms, termed the Jaya Algorithm-based Multi-Verse Optimization algorithm (JA-MVO). The optimally extracted features are fed to hybrid deep learning algorithms, namely a "Deep Belief Network (DBN) and Recurrent Neural Network (RNN)". As a modification to the hybrid deep learning architecture, the weights of both the DBN and the RNN are optimized using the same hybrid optimization algorithm. Finally, a comparative evaluation of the proposed prediction scheme against existing models certifies its effectiveness through various performance measures.
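
A minimal sketch of the three-phase pipeline described in this abstract, written in Python with scikit-learn-style components; it is not the authors' code. The function jamvo_optimize stands in for the JA-MVO meta-heuristic (here just random search), and a plain logistic-regression classifier stands in for the DBN+RNN hybrid, so all names, the placeholder data, and the toy fitness function are hypothetical.

import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

def weighted_features(X, weights):
    # Phase (b): multiply each normalized attribute by its learned weight.
    return X * weights

def jamvo_optimize(fitness, dim, iters=50, seed=0):
    # Hypothetical stand-in for the paper's JA-MVO meta-heuristic:
    # plain random search over candidate weight vectors.
    rng = np.random.default_rng(seed)
    best_w, best_f = None, np.inf
    for _ in range(iters):
        w = rng.uniform(0.0, 2.0, size=dim)
        f = fitness(w)
        if f < best_f:
            best_w, best_f = w, f
    return best_w

# Placeholder data; in the paper this would be one of the named UCI datasets.
rng = np.random.default_rng(0)
X = rng.random((200, 10))
y = rng.integers(0, 2, 200)

# Phase (a): normalize attributes to a common range.
X_norm = MinMaxScaler().fit_transform(X)

# Phase (b): optimize per-attribute weights (toy fitness: maximize feature variance).
w = jamvo_optimize(lambda w: -np.var(weighted_features(X_norm, w)), X_norm.shape[1])
X_w = weighted_features(X_norm, w)

# Phase (c): prediction; a simple classifier stands in for the DBN+RNN hybrid.
X_tr, X_te, y_tr, y_te = train_test_split(X_w, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))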

