normalization function
Recently Published Documents

Total documents: 15 (last five years: 3)
H-index: 5 (last five years: 0)

Entropy, 2021, Vol. 23 (9), pp. 1205
Author(s): Amnon Moalem, Alexander Gersten

Quantum equations for massless particles of any spin are considered in stationary uncharged axially symmetric spacetimes. It is demonstrated that, up to a normalization function, the angular wave function does not depend on the metric and is practically the same as in the Minkowskian case. The radial wave functions satisfy second-order nonhomogeneous differential equations with three nonhomogeneous terms, which depend in a unique way on the time and space curvatures. In agreement with the principle of equivalence, these terms vanish locally, and the radial equations reduce to the same homogeneous equations as in Minkowski spacetime.


2020, Vol. 37 (5), pp. 733-743
Author(s): Mohammad Abid Al-Hashim, Zohair Al-Ameen

These days, digital images are one of the most common means of representing information. Still, many images are captured with a low-light effect due to numerous unavoidable reasons. It can be difficult for humans and computer-based applications to perceive and extract valuable information from such images properly. Hence, the observed quality of low-light images should be improved for better analysis, understanding, and interpretation. Enhancing low-light images remains a challenging task, since various factors, including brightness, contrast, and colors, must be handled effectively to produce results of adequate quality. Therefore, a retinex-based multiphase algorithm is developed in this study: it computes the illumination image in a manner similar to the single-scale retinex algorithm, takes the logarithms of both the original and the illumination images, subtracts them using a modified approach, processes the result with a gamma-corrected sigmoid function, and finally applies a normalization function to produce the final result. The proposed algorithm is tested on natural low-light images, evaluated using specialized metrics, and compared with eight sophisticated methods. The experimental outcomes reveal that the proposed algorithm delivers the best performance in terms of processing speed, perceived quality, and evaluation metrics.
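The described pipeline can be illustrated with a short NumPy/OpenCV sketch. This is a minimal illustration of the listed steps, not the authors' implementation; the Gaussian-blur scale, gamma value, and sigmoid steepness are assumed values.

```python
import cv2          # assumption: OpenCV used only for the Gaussian blur
import numpy as np

def enhance_low_light(image, sigma=80.0, gamma=0.6, steepness=8.0):
    """Illustrative retinex-style multiphase enhancement, following the steps
    in the abstract: estimate the illumination image, take logs of the original
    and the illumination, subtract them, apply a gamma-corrected sigmoid, and
    finish with a normalization function to the 8-bit range."""
    img = image.astype(np.float64) + 1.0                   # avoid log(0)
    illumination = cv2.GaussianBlur(img, (0, 0), sigma)    # smooth illumination estimate

    # Log-domain subtraction (reflectance estimate), as in single-scale retinex.
    reflectance = np.log(img) - np.log(illumination)

    # Rescale to [0, 1] before the nonlinear mapping.
    r = (reflectance - reflectance.min()) / (np.ptp(reflectance) + 1e-12)

    # Gamma-corrected sigmoid: lift dark regions, then compress the extremes.
    r = r ** gamma
    r = 1.0 / (1.0 + np.exp(-steepness * (r - 0.5)))

    # Final normalization function: stretch back to the displayable 8-bit range.
    out = (r - r.min()) / (np.ptp(r) + 1e-12)
    return (out * 255.0).astype(np.uint8)
```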


Information, 2020, Vol. 11 (10), pp. 479
Author(s): Yuzhen Gao, Youdong Ding, Fei Wang, Huan Liang

We propose a novel end-to-end image colorization framework that integrates an attention mechanism and a learnable adaptive normalization function. In contrast to previous colorization methods that directly generate the whole image, we believe that the color of the significant area determines the quality of the colorized image. The attention mechanism uses the attention map obtained from an auxiliary classifier to guide our framework to produce more subtle content and visually pleasing colors in salient visual regions. Furthermore, we apply an Adaptive Group Instance Normalization (AGIN) function to help our framework generate vivid colorized images flexibly, treating colorization as a particular style-transfer task. Experiments show that our model is superior to previous state-of-the-art models in coloring foreground objects.
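As a rough sketch of the idea, an AGIN-style layer can be written as a learnable per-channel mixture of group-normalized and instance-normalized activations. The gating parameter and the fixed affine parameters below are illustrative assumptions, not the paper's exact layer (an adaptive variant might, for instance, predict its parameters from a style code).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveGroupInstanceNorm(nn.Module):
    """Hedged sketch of an AGIN-style layer: a learnable gate rho mixes
    group-normalized and instance-normalized activations per channel."""

    def __init__(self, num_channels, num_groups=8, eps=1e-5):
        super().__init__()
        self.num_groups = num_groups
        self.eps = eps
        # rho in [0, 1] balances group vs. instance statistics per channel.
        self.rho = nn.Parameter(torch.full((1, num_channels, 1, 1), 0.5))
        # Affine parameters applied after mixing (assumed fixed here).
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x):
        gn = F.group_norm(x, self.num_groups, eps=self.eps)
        inn = F.instance_norm(x, eps=self.eps)
        rho = self.rho.clamp(0.0, 1.0)
        out = rho * gn + (1.0 - rho) * inn
        return out * self.gamma + self.beta

# Usage example on a dummy feature map.
layer = AdaptiveGroupInstanceNorm(num_channels=64)
y = layer(torch.randn(2, 64, 32, 32))
```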


Author(s): Armand Armand, André Totohasina, Daniel Rajaonasy Feno

Given the more than sixty interestingness measures proposed in the association rule mining literature since 1993, and given their importance, research on normalizing the probabilistic quality measures of association rules has already produced many tangible results toward consolidating the various measures found in the literature. This article recommends a simple way to perform this normalization. In the interest of a unified presentation, the article also offers a new concept of normalization function as an effective tool for resolving the problem of normalizing measures that already have their own normalization functions.
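To illustrate what such a normalization function can look like, the following Python sketch maps a measure onto [-1, 1] using its reference values at incompatibility, independence, and logical implication. The piecewise-affine scheme and the helper names are assumptions for illustration, not the authors' construction.

```python
def normalize_measure(value, at_incompatibility, at_independence, at_implication):
    """Hypothetical normalization of an interestingness measure: map its value
    affinely (piecewise around the independence value) onto [-1, 1], so that
    incompatibility -> -1, independence -> 0 and logical implication -> +1."""
    if value >= at_independence:
        span = at_implication - at_independence
    else:
        span = at_independence - at_incompatibility
    return 0.0 if span == 0 else (value - at_independence) / span


# Example: normalizing confidence P(Y|X), whose reference values are
# 0 (incompatibility), P(Y) (independence) and 1 (logical implication).
p_y = 0.4
print(normalize_measure(0.7, 0.0, p_y, 1.0))   # > 0: X attracts Y
print(normalize_measure(0.2, 0.0, p_y, 1.0))   # < 0: X repels Y
```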


2016, Vol. 63 (1), pp. 7-18
Author(s): Marek Walesiak

In multidimensional scaling carried out on a metric data matrix (interval or ratio scale), one of the stages is the choice of the variable normalization method. The data.Normalization function in the R package clusterSim was developed for that purpose; it provides 18 data normalization methods. This paper presents a procedure that allows isolating groups of normalization methods which lead to similar multidimensional scaling results. The proposal can reduce the problem of choosing the normalization method in multidimensional scaling. The results are illustrated with an empirical example.
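For intuition, a few classical column-normalization formulas of this kind (z-score standardization, unitization, min-max rescaling) can be sketched in Python as follows. This mirrors the kind of choices such a function offers but is not the clusterSim API.

```python
import numpy as np

def normalize(x, method="standardization"):
    """Illustrative column-wise normalization formulas commonly used before
    multidimensional scaling or clustering (not the data.Normalization API)."""
    x = np.asarray(x, dtype=float)
    if method == "standardization":      # (x - mean) / standard deviation
        return (x - x.mean(axis=0)) / x.std(axis=0, ddof=1)
    if method == "unitization":          # (x - mean) / range
        return (x - x.mean(axis=0)) / (x.max(axis=0) - x.min(axis=0))
    if method == "minmax":               # (x - min) / range, into [0, 1]
        return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))
    raise ValueError(f"unknown method: {method}")
```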


2006, Vol. 6 (12), pp. 4763-4773
Author(s): C. Vogler, S. Brönnimann, G. Hansen

Abstract. The historical total ozone measurements taken with Dobson Spectrophotometer #8 at Longyearbyen (78.2° N, 15.6° E), Svalbard, Norway, in the period 1950–1962 have been re-analyzed and homogenized based on the original measurement logs, using present-day procedures. In the absence of sufficient calibration information, an empirical quality assessment was performed, based on a climatological comparison with ozone measurements in Tromsø, using TOMS data at both sites in the period 1979–2001 and ground-based Dobson data in the period 1950–1962. The assessment revealed that the C wavelength pair direct-sun (DS) measurements are the most trustworthy (and most frequent), while the WMO standard reference mode, AD direct-sun, has a systematic bias. Zenith-blue (ZB) measurements at solar zenith angles (SZA) <78° were adjusted to DS data using different empirical functions before and after 1957 (the start of the International Geophysical Year). ZB measurements at larger SZAs were homogenized by means of a normalization function derived from days with measurements over a wide range of SZAs. Zenith-cloudy measurements, which are particularly frequent during the summer months, were homogenized by applying correction factors depending on the cloud type (high thin clouds, and medium to low thick clouds). The combination of all measurements yields a total of 4685 single values covering 1637 days from September 1950 to September 1962; moon measurements during the polar night add another 137 daily means. The re-evaluated data show convincing consistency with measurements since 1979 (TOMS, SAOZ, Dobson) as well as with the 1957–1962 data stored at the World Ozone and UV Data Centre (WOUDC).
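As a purely illustrative sketch of how such an empirical normalization function could be derived, one might fit a low-order polynomial in SZA to the ratio of coincident DS and ZB observations and apply it to ZB-only days. The polynomial form, degree, and function names below are assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_zb_normalization(sza, zb_ozone, ds_ozone, degree=2):
    """Hypothetical sketch: derive a normalization function from days with
    coincident direct-sun (DS) and zenith-blue (ZB) observations by fitting
    a low-order polynomial in solar zenith angle to the DS/ZB ratio."""
    ratio = np.asarray(ds_ozone, dtype=float) / np.asarray(zb_ozone, dtype=float)
    coeffs = np.polyfit(np.asarray(sza, dtype=float), ratio, degree)
    return np.poly1d(coeffs)

def homogenize_zb(sza, zb_ozone, norm_fn):
    """Apply the fitted normalization function to ZB-only observations."""
    return np.asarray(zb_ozone, dtype=float) * norm_fn(np.asarray(sza, dtype=float))
```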


2006, Vol. 10 (6), pp. 495-500
Author(s): Joseph Majdalani, Sean R. Fischbach, Gary A. Flandro

2006, Vol. 6 (3), pp. 3913-3943
Author(s): C. Vogler, S. Brönnimann, G. Hansen

Abstract. The historical total ozone measurements taken with Dobson Spectrophotometer #8 at Longyearbyen, Svalbard, Norway, in the period 1950–1962 have been re-analyzed and homogenized based on the original measurement logs, using updated relevant parameters. In the absence of sufficient calibration information, an empirical quality assessment was performed, based on a climatological comparison with ozone measurements in Tromsø, using TOMS data at both sites in the period 1979–2001 and Dobson data in the period 1950–1962. The assessment revealed that, as in the case of the Tromsø measurements, the C wavelength pair direct-sun measurements are the most trustworthy (and most frequent), while the WMO standard reference mode, AD direct-sun, has a systematic bias relative to this data set. Zenith-blue (ZB) measurements at solar zenith angles (SZA) <80° were homogenized using two different polynomials before 1957 and from 1957 onward; ZB measurements at larger SZAs were homogenized by means of a normalization function derived from days with measurements over a wide range of SZAs. CC' zenith-cloudy measurements, which are particularly frequent during the summer months, were homogenized by applying correction factors for only two different cloud types (high thin clouds, and medium to low thick clouds); a further diversification of corrections reflecting cloud conditions did not prove significant. The combination of all measurements yields a total of 4837 single values covering 1676 days from September 1950 to September 1962; moon measurements during the polar night add another 137 daily means. The re-evaluated data show convincing agreement with measurements since 1979 (TOMS, SAOZ, Dobson) as well as with the 1957–1962 data stored at the World Ozone and UV Data Centre (WOUDC).


2005, Vol. 12 (4)
Author(s): Andrzej Filinski, Henning Korsholm Rohde

We show that the standard normalization-by-evaluation construction for the simply typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a "recursively defined" invariant relation, in the style of Pitts. In fact, the construction can be seen as generalizing a computational-adequacy argument for an untyped, call-by-name language to normalization instead of evaluation.

In the untyped setting, not all terms have normal forms, so the normalization function is necessarily partial. We establish its correctness in the senses of soundness (the output term, if any, is in normal form and β-equivalent to the input term); identification (β-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like call-by-value language.

Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness, as well as the proofs, extend naturally to this generalization.
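A compact illustration of the idea for the untyped case can be given in Python: closures serve as semantic values, neutral terms represent stuck applications, and a read-back (reify) phase rebuilds syntax. The data types and helper names are illustrative, not the paper's formulation, and, as the abstract notes, the resulting normalization function is partial: it simply diverges on terms with no normal form.

```python
from dataclasses import dataclass
from typing import Union

# Syntax of the untyped lambda-calculus.
@dataclass
class Var: name: str
@dataclass
class Lam: param: str; body: "Term"
@dataclass
class App: fn: "Term"; arg: "Term"

Term = Union[Var, Lam, App]

# Semantic values: closures for abstractions, neutral (stuck) terms otherwise.
@dataclass
class Closure: param: str; body: Term; env: dict
@dataclass
class Neutral: term: Term

_counter = [0]
def _fresh(name):
    _counter[0] += 1
    return f"{name}{_counter[0]}"

def evaluate(term, env):
    """Evaluate a term to a semantic value. May diverge on terms without a
    normal form, which is why the induced normalization function is partial."""
    if isinstance(term, Var):
        return env.get(term.name, Neutral(Var(term.name)))
    if isinstance(term, Lam):
        return Closure(term.param, term.body, env)
    fn, arg = evaluate(term.fn, env), evaluate(term.arg, env)
    if isinstance(fn, Closure):                      # beta step
        return evaluate(fn.body, {**fn.env, fn.param: arg})
    return Neutral(App(fn.term, reify(arg)))         # stuck application

def reify(value):
    """Read a semantic value back into a beta-normal term."""
    if isinstance(value, Neutral):
        return value.term
    x = _fresh(value.param)                          # go under the binder
    body = evaluate(value.body, {**value.env, value.param: Neutral(Var(x))})
    return Lam(x, reify(body))

def normalize(term):
    """Partial normalization function: evaluate, then reify."""
    return reify(evaluate(term, {}))

# (\x. x) y  normalizes to  y
print(normalize(App(Lam("x", Var("x")), Var("y"))))
```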

