Recent Patents on Computer Science
Latest Publications

Total documents: 497 (five years: 157)
H-index: 10 (five years: 5)
Published by: Bentham Science
ISSN: 2213-2759

Author(s):  
M. Rajeev Kumar ◽  
K. Arthi

Iris image segmentation is the most critical step in a robust iris recognition system, because images captured in non-cooperative environments suffer from occlusions, blur, specular reflections, and off-axis gaze. Although several techniques have been developed to overcome these drawbacks, localizing the iris texture region remains a challenging task. In this research, an effective two-stage iris segmentation technique is proposed for non-cooperative environments. First, a modified Geodesic Active Contour-based level set segmentation with Particle Swarm Optimization (PSO) is employed for iris segmentation; here, the PSO algorithm is used to minimize the energy of the gradient descent equation in a region-based level set segmentation algorithm. Then, global threshold-based segmentation is employed to segment the pupil region. The experiments considered two well-known databases, UBIRIS.V1 and UBIRIS.V2. The simulation outcomes demonstrate that the proposed approach attains more accurate and robust iris segmentation under non-cooperative conditions, and the modified Geodesic Active Contour-based level set segmentation with the PSO algorithm outperforms conventional segmentation techniques.
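The abstract's level set energy functional is specific to the paper; as a minimal sketch of the PSO step it describes — a swarm iteratively minimizing an energy function — here is a basic PSO with a toy quadratic energy standing in for the level set energy (all parameter values and names are ours, not the authors'):

```python
import numpy as np

def pso_minimize(energy, dim, n_particles=30, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Minimize `energy` over `dim` dimensions with a basic PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                                   # per-particle best position
    pbest_val = np.array([energy(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()                 # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Inertia + cognitive pull toward pbest + social pull toward global best
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([energy(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy stand-in for the level set energy: a shifted sphere function with minimum at (1, 1).
best, best_val = pso_minimize(lambda x: np.sum((x - 1.0) ** 2), dim=2)
```

In the paper's setting, `energy` would be the region-based level set energy evaluated for a candidate contour parameterization rather than this toy function.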


Author(s):  
Md Shah Fahad ◽  
Shruti Gupta ◽  
Abhinav ◽  
Shreya Singh ◽  
Akshay Deepak

Background: Emotional speech synthesis is the process of synthesizing emotions into neutral speech (potentially generated by a text-to-speech system) to make artificial human-machine interaction more human-like. It typically involves analysis and modification of speech parameters. Existing work on emotional speech synthesis modifies prosody parameters at the sentence, word, and syllable levels; finer-grained modification at the vowel level has not been explored yet, which motivates our work. Objective: To explore prosody parameters at the vowel level for emotion synthesis. Method: Our work modifies prosody features (duration, pitch, and intensity) for emotion synthesis. Specifically, it modifies the duration parameter of vowel-like and pause regions and the pitch and intensity parameters of only vowel-like regions. The modification is gender-specific, uses emotional speech templates stored in a database, and is performed with the pitch synchronous overlap and add (PSOLA) method. Result: Comparison was done with existing work on prosody modification at the sentence, word, and syllable levels on the IITKGP-SEHSC database. Improvements of 8.14%, 13.56%, and 2.80% in relative mean opinion score were obtained for the emotions angry, happy, and fear, respectively. This was due to (1) prosody modification at the vowel level being more fine-grained than at the sentence, word, or syllable level and (2) prosody patterns not being generated for consonant regions, because the vocal cords do not vibrate during consonant production. Conclusion: Our proposed work shows that emotional speech generated using prosody modification at the vowel level is more convincing than prosody modification at the sentence, word, or syllable level.
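PSOLA itself places analysis frames at detected pitch marks, which is beyond a short sketch; as a simplified illustration of the duration-modification idea for a vowel-like region, here is a naive fixed-hop overlap-add time stretch (the frame sizes, hop, and synthetic signal are our assumptions, not the paper's):

```python
import numpy as np

def ola_stretch(signal, factor, frame=256, hop=64):
    """Naive overlap-add time stretch: analysis frames taken every `hop`
    samples are re-spaced by `hop * factor` on synthesis. A real PSOLA
    would place frames at pitch marks instead of a fixed hop."""
    win = np.hanning(frame)
    syn_hop = int(hop * factor)
    n_frames = (len(signal) - frame) // hop
    out = np.zeros(frame + n_frames * syn_hop)
    norm = np.zeros_like(out)
    for i in range(n_frames):
        a = i * hop        # analysis position
        s = i * syn_hop    # synthesis position
        out[s:s + frame] += signal[a:a + frame] * win
        norm[s:s + frame] += win
    norm[norm < 1e-8] = 1.0   # avoid division by zero at the edges
    return out / norm

# Stretch a synthetic 100 Hz "vowel" to 1.5x its duration.
sr = 8000
t = np.arange(sr) / sr
vowel = np.sin(2 * np.pi * 100 * t)
stretched = ola_stretch(vowel, factor=1.5)
```

The paper's method additionally modifies pitch and intensity in vowel-like regions and leaves consonant regions untouched; this sketch covers only the duration dimension.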


Author(s):  
Ahmed K. Jameil ◽  
Yasir Amer Abbas ◽  
Saad Al-Azawi

Background: Fabricated circuits are tested for fault detection to determine which devices are defective. Design verification is performed to ensure that the circuit performs the required functions after manufacturing, and it is regarded as a form of testing for both sequential and combinational circuits. Testing sequential circuits is more difficult than testing combinational circuits. However, algorithms can be used to test any type of sequential circuit regardless of its composition. An important class of sequential circuits is finite impulse response (FIR) filters, which are widely used in digital signal processing applications. Objective: This paper presents a new design under test (DUT) algorithm for 4- and 8-tap FIR filters. The FIR filters and the proposed DUT algorithm are also implemented on field programmable gate arrays (FPGAs). Method: The proposed test generation algorithm is implemented in VHDL using the Xilinx ISE V14.5 design suite and verified by simulation. The test generation algorithm uses the FIR filter's redundant faults to obtain a set of target faults for the DUT, and fault simulation is used in the DUT to assess the benefit of each test pattern in terms of fault coverage. Results: The proposed technique provides average reductions of 20% and 38.8% in time delay, 57.39% and 75% in power consumption, and 28.89% and 28.89% in slices for the 4- and 8-tap FIR filters, respectively, compared with similar techniques. Conclusions: The implementation results prove that a high-speed, low-power design can be achieved. Further, the proposed architecture is faster than existing techniques.
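For context on the circuit under test: an N-tap FIR filter computes a weighted sum of the current and previous N-1 input samples. A minimal reference model of the 4-tap case is sketched below (the coefficients are illustrative, not taken from the paper; the actual design is in VHDL on an FPGA):

```python
def fir_filter(x, taps):
    """Direct-form FIR: y[n] = sum_k taps[k] * x[n-k]."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:        # samples before the input starts are treated as zero
                acc += h * x[n - k]
        y.append(acc)
    return y

# 4-tap moving-average filter as an example coefficient set.
taps = [0.25, 0.25, 0.25, 0.25]
y = fir_filter([1.0, 1.0, 1.0, 1.0, 1.0], taps)
# Once the delay line fills, the moving average of a constant 1.0 input is 1.0.
```

A software golden model like this is commonly used alongside fault simulation to check that test patterns distinguish the fault-free DUT from faulty variants.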


Author(s):  
Alka Agrawal ◽  
Vishal Goyal ◽  
Puneet Mishra

Background: Robotic manipulator systems are useful in many areas, such as the chemical industry, automobiles, and medical fields. Therefore, it is essential to implement a controller that effectively controls the end position of a robotic arm. However, with the increasing non-linearity and complexity of robotic manipulator systems, a conventional Proportional-Integral-Derivative (PID) controller has become ineffective. Nowadays, intelligent techniques like fuzzy logic, neural networks, and optimization algorithms have emerged as efficient tools for controlling highly complex non-linear functions with uncertain dynamics. Objective: To implement an efficient and robust controller using fuzzy logic to effectively control the end position of a single-link robotic manipulator so that it follows the desired trajectory. Methods: In this paper, a fuzzy PID controller is implemented whose parameters are obtained with the Spider Monkey Optimization technique, taking the Integral of Absolute Error (IAE) as the objective function. Results: Simulated outputs of the plants controlled by the fuzzy PID controller are shown in this paper, and the superiority of the implemented controller is demonstrated by comparing it with the conventional PID controller and the Genetic Algorithm optimization technique. Conclusion: The results show that the fuzzy PID controller optimized with the Spider Monkey Optimization technique is more accurate, fast, and robust than the PID controller as well as the controllers optimized with Genetic Algorithm techniques. Also, by comparing the IAE values of all the controllers, it was found that the controller optimized with the Spider Monkey Optimization technique shows 99% better efficacy than the Genetic Algorithm technique.
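The IAE objective that the optimizer minimizes can be made concrete with a small simulation. The sketch below runs a plain PID loop on a first-order plant and returns the IAE; the plant model and gain values are our illustrative assumptions (the paper's plant is a single-link manipulator and its controller is fuzzy PID):

```python
def iae_for_pid(kp, ki, kd, setpoint=1.0, dt=0.01, t_end=5.0):
    """Simulate a first-order plant dy/dt = -y + u under PID control
    (Euler integration) and return the Integral of Absolute Error."""
    y, integ, prev_err, iae = 0.0, 0.0, setpoint, 0.0
    for _ in range(int(t_end / dt)):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        y += (-y + u) * dt                       # Euler step of the plant
        iae += abs(err) * dt                     # accumulate |error|
        prev_err = err
    return iae

# A better-tuned gain set should yield a lower IAE; an optimizer such as
# Spider Monkey Optimization would search (kp, ki, kd) to minimize it.
loose = iae_for_pid(kp=0.5, ki=0.1, kd=0.0)
tight = iae_for_pid(kp=5.0, ki=2.0, kd=0.1)
```

In the paper's setup, each candidate gain set proposed by the optimizer would be scored exactly this way, by simulating the closed loop and measuring IAE.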


Author(s):  
Manoj Kumar Srivastava ◽  
Rajesh Kumar ◽  
Ashish Khare

Background: Advances in mobile and Internet technology have produced several online applications like smart classes, virtual classes, and online classes. Online courseware fosters better subjective knowledge in learners. The effectiveness of teaching and learning processes must be evaluated for the benefit of learners, to select the best approach to learning; this motivated us to evaluate and compare the effectiveness of different online learning courses through statistical approaches. Objective: The main objective of this paper is to compare the learning effect of the National Program on Technology Enhanced Learning (NPTEL) with the traditional classroom learning approach. Method: Final-year Master of Science (Computer Science) students learned their subjects in two groups, one using NPTEL online learning and the other the traditional learning approach. After learning the subjects, a series of tests was conducted and the marks were recorded to compare the two learning modes. Two statistical measures, namely the F-test and the t-test, were used to compare the results of the two learning methodologies. The experimental results demonstrate that the t-test and F-test results for the NPTEL learning method are superior to those of the comparative learning method. Results: The tests show that the online learning approach provides better learning compared with traditional classroom learning. Conclusion: The obtained results also indicate a significant improvement in learners through NPTEL video lectures over traditional classroom-based learning.
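The two statistics named in the abstract are straightforward to compute from the two groups' marks. A minimal sketch, using made-up illustrative marks (the paper's actual data is not reproduced in the abstract):

```python
from statistics import mean, variance

def two_sample_tests(a, b):
    """Pooled two-sample t-statistic and F ratio of sample variances,
    the two measures used to compare the groups' marks."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)          # sample variances
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    f = va / vb
    return t, f

online = [78, 82, 85, 90, 74, 88, 81]       # hypothetical NPTEL group marks
classroom = [70, 75, 68, 80, 72, 77, 69]    # hypothetical traditional group marks
t_stat, f_stat = two_sample_tests(online, classroom)
```

A positive t-statistic large enough relative to the t-distribution's critical value (at the chosen significance level and na + nb - 2 degrees of freedom) indicates the online group's mean is significantly higher; the F ratio checks the equal-variance assumption behind pooling.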


Author(s):  
Gaurav Kumar Nigam ◽  
Chetna Dabas

Background & Objective: Wireless sensor networks are made up of a huge number of low-powered small sensor nodes that can monitor the surroundings, collect meaningful data, and send it to a base station. Various energy management schemes that seek to lengthen the lifetime of the overall network have been proposed over the years, but energy conservation remains the major challenge, as the sensor nodes have finite batteries and low computational capabilities. Cluster-based routing is the most suitable approach for load balancing, fault tolerance, and reliable communication, and thereby for improving the performance parameters of a wireless sensor network. Low Energy Adaptive Clustering Hierarchy (LEACH) is an efficient clustering-based hierarchical protocol used to enhance the lifetime of sensor nodes in a wireless sensor network. It has some basic flaws that need to be overcome in order to reduce energy utilization and extend node lifetime. Methods: In this paper, an effective auxiliary cluster head selection is used to propose a new enhanced GC-LEACH algorithm in order to minimize energy utilization and prolong the lifespan of the wireless sensor network. Results & Conclusion: Simulation is performed in NS-2, and the outcomes show that GC-LEACH outperforms conventional LEACH and its existing variants in terms of frequent cluster head rotation across rounds and the number of data packets collected at the base station, while reducing energy consumption by 14%-19% and prolonging the system lifetime by 8%-15%.


Author(s):  
Utkarsh Saxena ◽  
J.S Sodhi ◽  
Yaduveer Singh

Since the end of 2000, many revolutions have occurred in the field of the Internet of Things (IoT) that have tremendously affected the world's Internet infrastructure. A smart home is a dwelling whose key electrical appliances are connected to each other in a network so that they can easily be accessed through a remote device. The complexity of a smart home lies in the fact that it comprises many heterogeneous networks working simultaneously to achieve a common task. Since every network has some vulnerability associated with it, the same holds for a smart home network: each layer of a smart home architecture is associated with some vulnerability. These vulnerabilities can be dangerous and can be exploited if not properly handled. This paper discusses a secure framework based on a token-sharing mechanism using Squid authentication for access control in smart home networks.
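The abstract does not detail the token format, so the sketch below shows one common way a shared-token access-control scheme can work: a time-limited token whose integrity is protected by an HMAC over a pre-shared key. The key, token layout, and device names are our assumptions, not the paper's Squid-based protocol:

```python
import hashlib
import hmac
import time

SECRET = b"shared-home-secret"   # hypothetical pre-shared key for the home network

def issue_token(device_id, ttl=300, now=None):
    """Issue a time-limited access token: HMAC-SHA256 over the device id
    and an expiry timestamp."""
    expiry = int(now if now is not None else time.time()) + ttl
    msg = f"{device_id}|{expiry}".encode()
    tag = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{device_id}|{expiry}|{tag}"

def verify_token(token, now=None):
    """Accept the token only if it is unexpired and its tag matches."""
    device_id, expiry, tag = token.rsplit("|", 2)
    msg = f"{device_id}|{expiry}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    fresh = int(expiry) > (now if now is not None else time.time())
    return fresh and hmac.compare_digest(tag, expected)

tok = issue_token("thermostat-01", ttl=300, now=1000)
```

Tampering with the device id or timestamp invalidates the tag, and `compare_digest` avoids timing side channels during verification.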


Author(s):  
Mohammad Abu Kausar ◽  
Mohammad Nasar

Background: Nowadays, the digital world is growing rapidly, and data is becoming very complex in its quantity, diversity, and speed. Recently, there have been two major changes in data management: NoSQL databases and Big Data analytics. Although they evolved for different reasons, their independent growths complement each other, and their convergence would greatly benefit organizations in making timely decisions over large amounts of multifaceted data sets that may be structured, semi-structured, or unstructured. Several software solutions have emerged to support Big Data analytics on the one hand, while on the other hand several NoSQL database packages are available on the market. Methods: The main goal of this article is to provide an understanding of their perspectives and a complete study comparing the future of several important emerging NoSQL data models. Results: Evaluating NoSQL databases for Big Data analytics against traditional SQL performance shows that NoSQL databases are a superior alternative for industry scenarios that need high-performance analytics, adaptability, simplicity, and distributed scalability over large data. Conclusion: The paper concludes with industry's current adoption status of NoSQL databases.


Author(s):  
Gurpreet Singh ◽  
Manish Mahajan ◽  
Rajni Mohana

BACKGROUND: Cloud computing is considered an on-demand resource service, with applications hosted in data centers on a pay-per-use basis. To allocate resources appropriately and satisfy user needs, an effective and reliable resource allocation method is required. Because of increased user demand, resource allocation is now considered a complex and challenging task: when a physical machine is overloaded, Virtual Machines share its load by utilizing the physical machine's resources. Previous studies fall short on energy consumption and time management when Virtual Machines on different servers are kept in a turned-on state. AIM AND OBJECTIVE: The main aim of this research work is to propose an effective resource allocation scheme for allocating Virtual Machines from an ad hoc sub-server with Virtual Machines. EXECUTION MODEL: The execution of the research is carried out in two sections: initially, the placement of Virtual Machines and Physical Machines with the server takes place, and subsequently, cross-validation of the allocation is addressed. The Modified Best Fit Decreasing algorithm is used to sort the Virtual Machines, and Multi-Machine Job Scheduling is used during the placement of jobs on an appropriate host. An Artificial Neural Network, as a classifier, allocates jobs to the hosts. Measures such as Service Level Agreement violation and energy consumption are considered, and fruitful results have been obtained, with a 37.7% reduction in energy consumption and a 15% improvement in Service Level Agreement violation.
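The Best Fit Decreasing strategy named in the abstract is a classic bin-packing heuristic: sort VMs by load (largest first) and place each on the host with the least remaining capacity that still fits. A plain BFD sketch follows (the paper's Modified BFD adds criteria the abstract does not specify, so they are not reproduced; the loads and capacity below are illustrative):

```python
def best_fit_decreasing(vm_loads, host_capacity):
    """Place each VM (sorted by descending load) on the tightest-fitting
    host, opening a new host when none fits. Returns a vm -> host map
    and the remaining free capacity per host."""
    hosts = []        # remaining capacity per open host
    placement = {}    # vm index -> host index
    for vm, load in sorted(enumerate(vm_loads), key=lambda kv: -kv[1]):
        best = None
        for h, free in enumerate(hosts):
            # "Best fit": the host that would be left with the least slack
            if free >= load and (best is None or free < hosts[best]):
                best = h
        if best is None:
            hosts.append(host_capacity - load)   # open a new host
            best = len(hosts) - 1
        else:
            hosts[best] -= load
        placement[vm] = best
    return placement, hosts

# Six VM loads packed onto hosts of capacity 10.
placement, free = best_fit_decreasing([5, 7, 3, 2, 6, 4], host_capacity=10)
```

Packing tightly this way keeps the number of powered-on hosts low, which is what drives the energy savings the abstract reports.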


Author(s):  
Arpita Sarkar ◽  
Binod K. Singh

Biometrics is the widely used automatic identification of persons based on their behavioural and biological characteristics, and biometric authentication plays a crucial role in identity management systems. The security of biometric authentication systems needs to be addressed, as there are still concerns about the integrity and public acceptance of biometric systems. During enrolment, the feature extraction module of a biometric authentication system scans biometric information to extract a set of salient features. This set of unique features is known as the biometric template, and it helps distinguish between different users. These templates are generally stored at enrolment time in a database indexed by the user's identity information. Biometric template protection is an important issue because a compromised template cannot be cancelled and reissued like a password or token. Template protection is a challenging task because of the intra-user variability present in the acquired biometric traits. This paper surveys different existing approaches in the literature for designing biometric template protection schemes, along with their strengths and limitations. Some prospective directions for designing template protection schemes are also addressed.

