Advances in Computational Intelligence and Robotics - Handbook of Research on Advanced Hybrid Intelligent Techniques and Applications
Total Documents: 18 (Five Years: 0)
H-Index: 2 (Five Years: 0)
Published by: IGI Global
ISBN: 9781466694743, 9781466694750

Author(s):  
Ka-Chun Wong

Inspired by nature, evolutionary algorithms have proven effective and unique in a variety of real-world applications. Compared to traditional algorithms, their parallel search capability and stochastic nature enable them to excel in search performance. In this chapter, evolutionary algorithms are reviewed and discussed from concepts and designs to applications in bioinformatics. The history of evolutionary algorithms is discussed first, followed by an overview of state-of-the-art evolutionary algorithm concepts. The related design and implementation details are then discussed from different aspects: representation, parent selection, reproductive operators, survival selection, and the fitness function. The chapter closes with a review and discussion of real-world evolutionary algorithm applications in bioinformatics.
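
Below is a minimal sketch in Python of the design aspects listed above, using a placeholder fitness function (the OneMax problem); it illustrates the general GA template only, not any specific bioinformatics application from the chapter.

```python
import random

# A minimal generational GA sketch illustrating the design aspects above:
# representation (bit strings), parent selection (tournament), reproduction
# (one-point crossover + bit-flip mutation), survival selection (elitism),
# and a placeholder fitness function (count of ones, i.e. OneMax).

def fitness(individual):
    return sum(individual)                      # placeholder objective

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(ind, rate=0.01):
    return [1 - g if random.random() < rate else g for g in ind]

def evolve(pop_size=50, length=32, generations=100):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = max(pop, key=fitness)           # survival selection: keep the best
        offspring = [mutate(crossover(tournament(pop), tournament(pop)))
                     for _ in range(pop_size - 1)]
        pop = [elite] + offspring
    return max(pop, key=fitness)

print(fitness(evolve()))
```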


Author(s):  
Mridusmita Sharma, Kandarpa Kumar Sarma

Speech is humans' natural means of communication, yet it is not the typical input modality afforded by computers. Interaction between humans and machines would become easier if speech were an effective alternative to the keyboard and mouse. With advances in signal processing and model building, and with the growing power of computing devices, significant progress has been made in speech recognition research, and various speech-based applications have been developed. With the rapid advancement of speech recognition technology, telephone speech is becoming involved in many new applications of spoken language processing. The literature indicates that spectro-temporal features give a significant performance improvement for telephone speech recognition systems in comparison with the robust feature techniques conventionally used for recognition. In this chapter, the authors report on the various spectral and temporal features and the soft computing techniques that have been used for telephonic speech recognition.
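
As an illustration of spectral and temporal features, the following sketch extracts MFCCs and their deltas for a telephone-bandwidth signal; it assumes the librosa library and a hypothetical input file, and the chapter's own spectro-temporal feature set may differ.

```python
import librosa
import numpy as np

# Minimal sketch of extracting spectral (MFCC) and temporal (delta) features
# for a telephone-bandwidth utterance. "speech.wav" is a hypothetical input
# file; the chapter's actual feature set and classifiers may differ.

y, sr = librosa.load("speech.wav", sr=8000)          # 8 kHz ~ telephone bandwidth
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # spectral envelope features
delta = librosa.feature.delta(mfcc)                  # temporal dynamics (1st order)
delta2 = librosa.feature.delta(mfcc, order=2)        # 2nd-order dynamics

features = np.vstack([mfcc, delta, delta2])          # (39, frames) feature matrix
print(features.shape)
```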


Author(s):  
J. Jagan, Prabhakar Gundlapalli, Pijush Samui

The determination of the liquefaction susceptibility of soil is a paramount task in geotechnical earthquake engineering. This chapter adopts the Support Vector Machine (SVM), Relevance Vector Machine (RVM), and Least Squares Support Vector Machine (LSSVM) for determining liquefaction susceptibility based on Cone Penetration Test (CPT) data from the Chi-Chi earthquake. The input variables of the SVM, RVM, and LSSVM are cone resistance (qc) and peak ground acceleration (amax/g). SVM, RVM, and LSSVM are used as classification tools, and the developed models yield equations for determining the liquefaction susceptibility of soil. A comparison between the developed models has been carried out; the results show that SVM, RVM, and LSSVM are robust models for determining the liquefaction susceptibility of soil.
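
A minimal sketch of the SVM classification step, assuming scikit-learn and purely illustrative placeholder values for qc and amax/g (not the Chi-Chi CPT data); RVM and LSSVM would be trained on the same inputs with their own libraries.

```python
import numpy as np
from sklearn.svm import SVC

# Minimal sketch of binary liquefaction classification from the two input
# variables qc (cone resistance) and amax/g (peak ground acceleration).
# The values below are illustrative placeholders, not the CPT dataset.

X = np.array([[2.5, 0.18], [6.0, 0.15], [1.2, 0.38], [8.5, 0.10]])  # [qc, amax/g]
y = np.array([1, 0, 1, 0])   # 1 = liquefied, 0 = non-liquefied (placeholder labels)

model = SVC(kernel="rbf").fit(X, y)
print(model.predict([[3.0, 0.25]]))   # susceptibility for a new CPT record
```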


Author(s):  
Sandip Dey, Siddhartha Bhattacharyya, Ujjwal Maulik

In this article, a genetic algorithm inspired by quantum computing is presented. The novel algorithm, referred to as the quantum inspired genetic algorithm (QIGA), is applied to determine the optimal thresholds of two gray-level images. Different random chaotic map models are used to realize the inherent interference operation in collaboration with qubits and the superposition of states. The random interference is followed by three quantum operators, viz. quantum crossover, quantum mutation, and quantum shifting, which produce population diversity. Finally, the intermediate states pass through quantum measurement for optimization of the image thresholds. In the proposed algorithm, three evaluation metrics, namely Brink's, Kapur's, and Pun's algorithms, have been applied to two gray-level images, viz. Lena and Barbara. The same metrics have also been applied with a conventional GA and Han et al.'s QEA. A comparative study between the proposed QIGA, Han et al.'s algorithm, and the conventional GA indicates encouraging avenues for the proposed QIGA.
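
As a worked illustration of one of the evaluation metrics, the sketch below implements Kapur's entropy as a bilevel-thresholding objective; the quantum-inspired operators themselves are not reproduced, and the histogram is a random placeholder rather than the Lena or Barbara image.

```python
import numpy as np

# Minimal sketch of Kapur's entropy, one of the evaluation metrics usable as a
# fitness function for bilevel thresholding. Any optimizer (QIGA, GA, QEA, or
# plain exhaustive search as below) could maximize this objective over t.

def kapur_entropy(hist, t):
    p = hist / hist.sum()
    w0, w1 = p[:t + 1].sum(), p[t + 1:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    p0, p1 = p[:t + 1] / w0, p[t + 1:] / w1
    h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
    h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
    return h0 + h1

# Example: exhaustive search over thresholds for a hypothetical 256-bin histogram.
hist = np.random.randint(0, 500, size=256).astype(float)
best_t = max(range(255), key=lambda t: kapur_entropy(hist, t))
print(best_t)
```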


Author(s):  
Sourav De, Siddhartha Bhattacharyya, Susanta Chakraborty

A self-supervised image segmentation method based on a non-dominated sorting genetic algorithm-II (NSGA-II) optimized MUSIG (OptiMUSIG) activation function with a multilayer self-organizing neural network (MLSONN) architecture is proposed to segment multilevel gray-scale images. In the same way, another NSGA-II based parallel version of the OptiMUSIG (ParaOptiMUSIG) activation function with a parallel self-organizing neural network (PSONN) architecture is proposed in this article to segment color images. These methods are intended to overcome the drawbacks of their single-objective counterparts. Three standard objective functions are employed as the multiple objective criteria of the NSGA-II algorithm to measure the quality of the segmented images.
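
The sketch below illustrates the Pareto-dominance test and first-front extraction at the heart of NSGA-II, with hypothetical objective vectors standing in for the three segmentation-quality criteria; the OptiMUSIG/ParaOptiMUSIG encodings themselves are not shown.

```python
# Minimal sketch of the Pareto-dominance test at the core of NSGA-II, which
# ranks candidate solutions by multiple segmentation quality objectives
# (all assumed here to be maximized).

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximization)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def first_front(population):
    """Return the non-dominated (rank-1) solutions of a population."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Hypothetical objective vectors for three candidate segmentations.
scores = [(0.91, 0.62, 0.75), (0.88, 0.70, 0.74), (0.90, 0.61, 0.70)]
print(first_front(scores))
```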


Author(s):  
K R Singh, M M Raghuwanshi, M A Zaveri, James F. Peters

Computer vision is the process of electronically perceiving and understanding an image, much as the human vision system (HVS) does. Face recognition techniques (FRT) determine the identity of an individual by matching facial images with those stored in a facial database. The performance of FRT is greatly affected by variations in the face due to different factors, and it is interesting to study how well these issues are handled by rough set theory (RST) and near set theory to improve performance. Variations in illumination and plastic surgery change the appearance of the face and introduce imprecision and vagueness. One part of the chapter introduces an adaptive illumination normalization technique using RST that classifies the image illumination into three classes, based on which illumination normalization is performed with an appropriate filter. The later part of the chapter introduces the use of near set theory for FRT on facial images that have previously undergone feature modifications through plastic surgery.
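
A minimal sketch of the adaptive illumination-normalization idea follows, using a simple mean-intensity rule and gamma correction as placeholders for the chapter's RST-based classification and its actual filters.

```python
import numpy as np

# Minimal sketch: classify an image into one of three illumination classes and
# apply a correspondingly chosen filter. The mean-intensity rule and the gamma
# values below are illustrative placeholders, not the chapter's RST classifier.

def illumination_class(img):
    m = img.mean()                      # img: grayscale array in [0, 255]
    if m < 85:
        return "dark"
    if m < 170:
        return "normal"
    return "bright"

def normalize(img):
    gamma = {"dark": 0.6, "normal": 1.0, "bright": 1.6}[illumination_class(img)]
    return (255.0 * (img / 255.0) ** gamma).astype(np.uint8)

face = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # placeholder image
print(illumination_class(face), normalize(face).mean())
```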


Author(s):  
Bassem Mahmoud Mokhtar, Mohamed Eltoweissy

The ever-growing and ever-evolving Internet aims to support billions of networked entities and to provide a wide variety of services and resources. Such complexity results in network data from different sources with special characteristics, such as widely diverse users, multiple media, high dimensionality, and various dynamic concerns. With huge amounts of network data of this kind, there are significant challenges in (a) recognizing emergent and anomalous behavior in network traffic and (b) making intelligent decisions for efficient network operations. Endowing the semantically oblivious Internet with intelligence would advance its capability to learn traffic behavior and to predict future events. In this chapter, the authors discuss and evaluate the hybridization of monolithic intelligence techniques in order to achieve smarter and enhanced networking operations. Additionally, the authors provide a systematic, application-agnostic semantics management methodology with efficient processes for extracting and classifying high-level features and reasoning about rich semantics.
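
As a small illustration of recognizing anomalous traffic behavior from extracted features, the sketch below applies a standard anomaly detector (scikit-learn's IsolationForest) to placeholder flow statistics; it stands in for, but is not, the chapter's hybrid semantics-management system.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Minimal sketch of flagging anomalous behavior from extracted traffic features
# (e.g. packet count, mean packet size, flow duration). The feature values are
# illustrative placeholders only.

flows = np.array([
    [120, 540, 2.1], [110, 510, 1.9], [130, 560, 2.3],   # typical flows
    [9000, 60, 0.2],                                      # suspicious burst
])

detector = IsolationForest(contamination=0.25, random_state=0).fit(flows)
print(detector.predict(flows))   # -1 marks flows flagged as anomalous
```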


Author(s):  
Petre Anghelescu

In this chapter, bio-inspired techniques based on cellular automata (CAs) and programmable cellular automata (PCAs) theory are used to develop information security systems. The proposed cryptosystem is composed of a CA used as a pseudorandom number generator (PRNG) and a PCA that constructs the ciphering functions of the enciphering scheme. The chapter shows how simple elements named "cells" interact with each other using certain rules and topologies to form a larger system that can be used to encrypt and decrypt data sent over network communication systems. The proposed security system was implemented in hardware on FPGA devices of type Spartan 3E (XC3S500E) and was analyzed and verified, including with the NIST statistical tests, to ensure that the system offers good security and high speed. The experimental results prove that cryptographic techniques based on bio-inspired algorithms provide an alternative to conventional (computational) techniques.
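
A minimal sketch of a one-dimensional CA used as a pseudorandom bit generator follows, in the spirit of the CA-based PRNG described above; rule 30 and the cell count are illustrative choices, and the PCA-based ciphering functions of the FPGA design are not shown.

```python
# Minimal sketch of an elementary cellular automaton used as a pseudorandom bit
# generator. Each cell's next state is looked up in the rule number using the
# (left, centre, right) neighborhood; the keystream taps the centre cell.

def ca_step(cells, rule=30):
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) |
                      (cells[i] << 1) |
                      cells[(i + 1) % n])) & 1 for i in range(n)]

def prng_bits(seed_cells, count):
    cells, bits = list(seed_cells), []
    for _ in range(count):
        cells = ca_step(cells)
        bits.append(cells[len(cells) // 2])   # tap the centre cell each step
    return bits

seed = [0] * 31 + [1] + [0] * 31              # single seeded cell
keystream = prng_bits(seed, 16)
print(keystream)
```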


Author(s):  
Shikha Mehta, Monika Bajaj, Hema Banati

Formal learning has shifted from the confines of institutional walls to our home computers and even to our mobiles. It is often felt that the concept of e-learning can be successfully applied to theoretical subjects, but that it is inadequate for teaching science subjects like chemistry, where hands-on practical training is a must. This chapter presents a hybrid approach (an amalgamation of a machine learning technique with the soft computing paradigm) to develop an intelligent virtual chemistry laboratory (IVCL) tool for simulating chemical experiments online. The tool presents an easy-to-use web-based interface that takes the reactants as input and presents results in the form of the type of reaction that occurred and the list of possible products. Technically, the IVCL tool utilizes the naïve Bayes algorithm to classify the type of reaction, then applies a genetic algorithm inspired approach to generate the products, and subsequently employs a system-of-equations method to balance the reactions. Experimental evaluations reveal that the proposed IVCL tool runs with 95% accuracy.
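
The balancing step can be illustrated with a small system of element-balance equations; the example below hard-codes H2 + O2 -> H2O and uses NumPy, whereas the IVCL tool builds the system automatically from the predicted products.

```python
import numpy as np

# Minimal sketch of the "system of equations" balancing step: one linear
# equation per chemical element, solved for the stoichiometric coefficients.
# Columns: [H2, O2, H2O]; rows: H and O balance, with product coefficients
# negated so that A @ coeffs = 0.

A = np.array([
    [2, 0, -2],   # hydrogen atoms
    [0, 2, -1],   # oxygen atoms
], dtype=float)

# Fix the first coefficient to 1 and solve the remaining system.
rhs = -A[:, 0]
rest, *_ = np.linalg.lstsq(A[:, 1:], rhs, rcond=None)
coeffs = np.concatenate([[1.0], rest])
coeffs /= coeffs[coeffs > 0].min()        # scale so all coefficients are >= 1
print(coeffs)                             # -> [2. 1. 2.] for 2 H2 + O2 -> 2 H2O
```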


Author(s):  
João Sousa Andrade, Artur M. Arsénio

Infectious diseases, such as the recent Ebola outbreak, can be especially dangerous for large communities in today's highly connected world. Countermeasures can be put in place if one is able to determine which people are more vulnerable to infection or have been in contact with the disease, and where. Contact location, time, and the relationship with the subject are relevant metrics that affect the probability of disease propagation. Sensors on personal devices that gather information from people, together with social network analysis, allow the integration of community data, while data analysis and modelling may indicate community-level susceptibility to an epidemic. Indeed, there has been interest in using social networks for epidemic prediction, but the integration between large-scale sensor networks and these initiatives, required to achieve such prediction, has yet to be realized. In this context, an opportunistic system is proposed and evaluated for predicting an epidemic outbreak in a community while guaranteeing user privacy.
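
A minimal sketch of propagating infection risk over a contact graph built from sensed encounters is shown below, assuming the networkx library; the nodes, edge weights, and transmission rule are illustrative placeholders rather than the proposed system's model, and no real user data is involved.

```python
import random
import networkx as nx

# Minimal sketch: spread infection over a contact graph where edge weights
# stand in for contact duration/closeness, so the transmission probability
# grows with contact weight. All values are illustrative placeholders.

G = nx.Graph()
G.add_weighted_edges_from([("a", "b", 0.8), ("b", "c", 0.3),
                           ("c", "d", 0.6), ("a", "d", 0.1)])

def spread(graph, seeds, steps=3, beta=0.5):
    infected = set(seeds)
    for _ in range(steps):
        for u in list(infected):
            for v in graph.neighbors(u):
                if v not in infected and random.random() < beta * graph[u][v]["weight"]:
                    infected.add(v)
    return infected

print(spread(G, seeds=["a"]))
```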

