natural information
Recently Published Documents

TOTAL DOCUMENTS: 63 (FIVE YEARS: 15)
H-INDEX: 8 (FIVE YEARS: 1)

Author(s):  
Rachana B ◽  
Kavya Hegde ◽  
Navya Bhat

The objective is to automatically recognize which bird species are present in an audio data set using supervised learning. Devising effective algorithms for bird species classification is an essential step toward extracting useful natural information from recordings collected in the field. Here a Naive Bayes algorithm is used to classify bird calls into species based on 265 features extracted from the birds' chirping sounds. The challenges in this task included memory management, the number of bird species the machine must recognize, and the mismatch in signal-to-noise ratio between the training and the testing sets. Addressing these challenges with the Naive Bayes algorithm gave good results: the classifier reached 91.58% accuracy.
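
A minimal sketch of the kind of pipeline the abstract describes, assuming the 265 per-recording features have already been extracted; the synthetic data and the use of scikit-learn's GaussianNB are illustrative assumptions, not details from the paper:

```python
# Illustrative sketch: Gaussian Naive Bayes over precomputed audio features.
# X stands in for an (n_recordings, 265) feature matrix and y for species
# labels; both are synthetic placeholders, not the paper's data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_recordings, n_features, n_species = 1000, 265, 10
X = rng.normal(size=(n_recordings, n_features))    # placeholder features
y = rng.integers(0, n_species, size=n_recordings)  # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

clf = GaussianNB().fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, clf.predict(X_test)):.4f}")
```

Gaussian Naive Bayes stores only a per-class mean and variance for each feature, which keeps memory use modest as the number of species grows, one plausible fit for the memory constraints the abstract mentions.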


2020 ◽  
Vol 50 (4) ◽  
pp. 04-05
Author(s):  
Procopio Cocci

The objective of environmental engineering education should not only be to teach students environmental knowledge; the more important thing is to train students' environmental ethics and to form behavior that is good for the environment, and these can only be shaped by practice in real life. In the traditional teaching model, one teacher can guide only one practice activity at a time. With the development of network technology, a teacher can supervise different practice activities taking place in different locations or at different times over the network. Based on the integration of the practical needs and the interactive character of environmental education, the author puts forward an online environmental education mode named "practice-interactive participation". The core of this mode is to train students' environmental ethics through practice and to deliver teachers' guidance through the network.


2020 ◽  
Author(s):  
Vasil Penchev

A historical review and philosophical look at the introduction of "negative probability" and "complex probability" is offered. The generalization of "probability" is forced by mathematical models in physical or technical disciplines. Initially, these quantities are involved only as an auxiliary tool, complementing mathematical models to completeness with respect to the corresponding operations. Afterwards, they acquire ontological status, especially in quantum mechanics and its formulation as a natural information theory, as "quantum information", after the experimental confirmation of the phenomena of "entanglement". Philosophical interpretations appear. A generalization of them is suggested: ontologically, they correspond to a relevant generalization of the relation of a part and its whole, in which the whole is a subset of the part rather than vice versa. The structure of "vector space" is necessarily involved in order to distinguish the part "by itself" from the part in relation to the whole, as a projection within it. That difference is reflected in a new dimension of the vector space, both mathematically and conceptually. Then "negative or complex probability" is interpreted as a quantity corresponding to the generalized case in which the part can be "bigger" than the whole and is represented only partly within the whole.
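
For concreteness, a standard example from quantum mechanics of a "probability" that turns negative while every observable marginal stays classical is the Wigner quasi-probability distribution; this illustration is added here for the reader and is not an equation from the paper:

```latex
% The Wigner quasi-probability distribution of a pure state \psi:
\[
  W(x,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}
      \psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,\mathrm{d}y .
\]
% Both marginals are genuine probability densities:
\[
  \int W(x,p)\,\mathrm{d}p = |\psi(x)|^{2},
  \qquad
  \int W(x,p)\,\mathrm{d}x = |\tilde{\psi}(p)|^{2},
\]
% yet W(x,p) takes negative values for every non-Gaussian pure state
% (Hudson's theorem): the "joint distribution" cannot be embedded as an
% ordinary subset of a classical whole, in the part-whole spirit above.
```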




Author(s):  
Rohit Rastogi ◽  
Devendra K. Chaturvedi ◽  
Parul Singhal ◽  
Mayank Gupta

Digital technology is modernizing healthcare. Digitising health information produces large volumes of data, referred to as big data, that can quickly be processed by machines. Digital healthcare analytics offers the ability to diagnose and to suggest ways to reduce costs; to provide quality patient care and outcomes, available 24/7; to reach patients located in vast and distant geographical areas; and to avert preventable diseases. Artificial intelligence (AI) is an autonomous, real-time machine analysis system, in contrast to the natural information analysis performed by humans. Diabetes is a serious, under-reported, life-threatening disease affecting millions of people of all ages, and researchers have identified it as a major public health problem that is approaching epidemic proportions globally. The purpose of this study is to investigate the analysis of diabetes, CAD, and other diseases using the latest digital technologies to analyze information extracted from IoT devices and big data, together with the correlation of stress and tension-type headache (TTH) with human health.
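
A minimal sketch of the kind of analysis the abstract gestures at: correlating IoT-style health readings with a diabetes label and fitting a baseline classifier. The feature names and synthetic data are hypothetical stand-ins, not the study's data or method:

```python
# Illustrative sketch: correlate IoT-style health features with a diabetes
# label, then fit a baseline classifier. All data here is synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "glucose":      rng.normal(120, 30, n),
    "bmi":          rng.normal(28, 5, n),
    "stress_score": rng.normal(50, 15, n),  # hypothetical TTH/stress proxy
})
# Synthetic label loosely driven by glucose so the example is non-trivial.
df["diabetic"] = (df["glucose"] + rng.normal(0, 20, n) > 140).astype(int)

print(df.corr()["diabetic"])  # feature-outcome correlations

X, y = df[["glucose", "bmi", "stress_score"]], df["diabetic"]
model = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
```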


Author(s):  
Ryotaro Kamimura

The present paper proposes a new type of information-theoretic method to interpret the inference mechanism of neural networks. We interpret the internal inference mechanism by itself, without any external methods such as symbolic or fuzzy rules. In addition, we make the interpretation process as stable as possible: we interpret the inference mechanism considering all internal representations created under different conditions and patterns. To make this internal interpretation possible, we compress multi-layered neural networks into the simplest ones, without hidden layers. The natural information loss in the process of compression is then compensated by introducing a mutual-information augmentation component. The method was applied to two data sets, the glass data set and the pregnancy data set. In both data sets, information augmentation and compression improved generalization performance. In addition, the compressed or collective weights from the multi-layered networks tended, somewhat ironically, to be similar to the linear correlation coefficients between inputs and targets, while conventional methods such as logistic regression analysis failed to produce such weights.
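
A minimal sketch of the compression idea as described: multiplying the layer weight matrices of a trained multi-layer network yields "collective" input-to-output weights, which can then be compared with the input-target correlation coefficients. The toy data, the tiny network, the omission of the mutual-information augmentation step, and the linear treatment of hidden activations are all assumptions of this sketch, not the paper's exact procedure:

```python
# Illustrative sketch: collapse a trained multi-layer network into one
# weight matrix by multiplying its layer weights, then compare these
# "collective" weights with input-target correlation coefficients.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n, d = 400, 5
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -1.0, 0.5, 0.0, 0.0])
y = (X @ w_true + rng.normal(0, 0.5, n) > 0).astype(int)

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)

# Collective weights: product of layer weight matrices, (d x 8) @ (8 x 1),
# ignoring the hidden nonlinearity (an assumption of this sketch).
collective = net.coefs_[0] @ net.coefs_[1]

# Linear correlation between each input and the target.
corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(d)])

print("collective weights:", collective.ravel().round(2))
print("input-target corr :", corr.round(2))
```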


Many end-users would agree that, had it not been for signed algorithms, the deployment of e-commerce might never have occurred. Given the current status of replicated archetypes, hackers worldwide predictably desire the visualization of simulated annealing that contains natural information of networking. This paper disproves not only that vacuum tubes and symmetric encryption [14] can collaborate to address this challenge, but that it is true for linked lists.


Author(s):  
Oleksandr Drozd ◽  
Anatoliy Sachenko ◽  
Svetlana Antoshchuk ◽  
Julia Drozd ◽  
Mykola Kuznietsov
