International Journal of Scientific Research in Computer Science Engineering and Information Technology
Latest Publications





Published By Technoscience Academy


Kiran Khandarkar ◽  
Dr. Sharvari Tamne

The research presents a method for improving change detection in SAR images using image fusion and a supervised classification system. The DnCNN denoising approach is used to remove noise from the input images. The first image is then processed with the mean-ratio operator, and the second image with the log-ratio operator. The two resulting difference images are fused using SWT-based image fusion, and the fused output is passed to a supervised classifier for change detection.
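A minimal NumPy sketch of the two ratio operators named above, applied to a pair of toy intensity images (the simplified per-pixel forms shown here stand in for the windowed versions typically used on real SAR data):

```python
import numpy as np

def mean_ratio(img1, img2, eps=1e-6):
    # Mean-ratio operator: symmetric intensity comparison; changed
    # pixels score closer to 1, unchanged pixels closer to 0.
    m1 = img1.astype(float) + eps
    m2 = img2.astype(float) + eps
    return 1.0 - np.minimum(m1 / m2, m2 / m1)

def log_ratio(img1, img2, eps=1e-6):
    # Log-ratio operator: compresses the multiplicative speckle noise
    # of SAR imagery into an additive term.
    return np.abs(np.log((img2.astype(float) + eps) /
                         (img1.astype(float) + eps)))

# Two toy 2x2 "SAR" intensity images; pixel (0, 1) changes
before = np.array([[10.0, 10.0], [10.0, 10.0]])
after  = np.array([[10.0, 40.0], [10.0, 10.0]])

mr = mean_ratio(before, after)   # changed pixel stands out
lr = log_ratio(before, after)    # changed pixel stands out
```

In the paper's pipeline these two difference images would then be fused with the stationary wavelet transform before classification.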

Suhas S ◽  
Dr. C. R. Venugopal

An enhanced system for classifying MR images using an association of kernels with a support vector machine is developed and presented in this paper, along with the design and development of a content-based image retrieval (CBIR) system. Content-based image retrieval is the process of finding relevant images in a large image database using visual queries, and medical imaging has led to the growth of large image collections. An Oriented Rician Noise Reduction Anisotropic Diffusion filter is used for image denoising, and a modified hybrid Otsu algorithm is used for image segmentation. Texture features are extracted using the GLCM method, and a genetic algorithm with joint entropy is adopted for feature selection. Classification is performed by a support vector machine with various kernels, and the performance is validated. A classification accuracy of 98.83% is obtained using SVM with the Gaussian RBF (GRBF) kernel. The extracted features are used to classify MR images into five categories, and the performance of the MC-SVM classifier is compared across kernel functions. From performance measures such as classification accuracy, it is inferred that brain and spinal cord MRI classification is best performed using MC-SVM with the Gaussian RBF kernel rather than the linear or polynomial kernels. The proposed system provides the best classification performance, with high accuracy and a low error rate.
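The Gaussian RBF kernel behind the best-performing classifier can be written as K(x, y) = exp(-‖x − y‖² / 2σ²). A minimal NumPy sketch (the feature vectors are hypothetical stand-ins for the GLCM texture features the paper extracts):

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Gaussian RBF kernel: similarity decays exponentially with the
    # squared Euclidean distance between feature vectors.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

# Hypothetical GLCM-style feature vectors (contrast, energy, homogeneity)
a = np.array([0.8, 0.1, 0.6])
b = np.array([0.8, 0.1, 0.6])   # identical features -> similarity 1.0
c = np.array([0.1, 0.9, 0.2])   # dissimilar features -> lower similarity

print(rbf_kernel(a, b))  # 1.0
print(rbf_kernel(a, c))  # strictly smaller than rbf_kernel(a, b)
```

An SVM using this kernel can separate classes that are not linearly separable in the original feature space, which is consistent with its edge over the linear and polynomial kernels reported above.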

Waheed Muhammad SANYA ◽  
Gaurav BAJPAI ◽  
Haji Ali HAJI

Vision enables humans to perceive environmental changes over time; these changes are observed by capturing images, and the digital image plays a central role in everyday life. Image denoising, the process of recovering the details of an image while removing random noise, is a well-explored research topic in image processing. Progress in image denoising has historically come from improved modeling of digital images, so the major challenge for a denoising algorithm is to improve the visual appearance while preserving the other details of the real image. Significant research today focuses on wavelet-based denoising methods. This paper presents a new approach to the Sobel image-processing algorithm on the Linux platform and develops an effective algorithm using different optimization techniques on the SABRE i.MX_6. Our work concentrates on optimizing the image-processing algorithm. Using the OpenCV environment, the paper simulates salt-and-pepper noise and removes the noisy pixels with a median filter. The Sobel convolution method is used in the design of a Sobel filter, which processes the image after the median filter to achieve an effective edge-detection result. Finally, the paper optimizes the algorithm in the SABRE i.MX_6 Linux environment. Using algorithmic optimization (lower-complexity algorithms in the mathematical sense and appropriate data structures), optimization for RISC processors (loop unrolling), and optimization for efficient use of hardware resources (data access, cache management, and multi-threading), the paper analyzes the system's response parameters under varied inputs, different compiler options (O1, O2, or O3), and different degrees of loop unrolling.
The proposed denoising algorithm shows a meaningful improvement in the visual quality of the images, together with an assessment of the algorithmic optimization.
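The median-then-Sobel pipeline described above can be sketched in pure NumPy (the paper itself uses OpenCV on the i.MX_6; this is a simplified illustration of the two filter stages, not the optimized implementation):

```python
import numpy as np

def median3x3(img):
    # 3x3 median filter: replaces each interior pixel with the median
    # of its neighbourhood, suppressing salt-and-pepper impulses.
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.median(img[y - 1:y + 2, x - 1:x + 2])
    return out

def sobel_magnitude(img):
    # Sobel convolution: horizontal and vertical gradient kernels,
    # combined into a gradient magnitude for edge detection.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    mag = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = img[y - 1:y + 2, x - 1:x + 2]
            mag[y, x] = np.hypot(np.sum(win * kx), np.sum(win * ky))
    return mag

# Toy image: flat region corrupted by a single "salt" impulse
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0                 # salt-and-pepper noise pixel
clean = median3x3(img)            # impulse removed -> uniform image
edges = sobel_magnitude(clean)    # a flat image has no edges
```

Running the median filter before Sobel matters: applying Sobel directly to the noisy image would report a spurious edge around every impulse.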

Lokesh S ◽  
Jayasri B. S

A cross-layered framework is an important concept in today's world, given the abundant use of both single-path and multi-path wireless network architectures. One of the important design issues in developing such a robust framework is the design of an Optimization Agent (OA). In wireless and wired ad-hoc networks, cross-layer design was introduced some years ago to explore joint optimization across different layers, and the Open Systems Interconnection (OSI) model has been employed to describe cross-layered solutions. However, no common reference mechanism exists to aid optimization, which can hinder the adaptability and deployment of cross-layered solutions. In this study, we suggest some hypotheses on how to model and create cross-layer solutions using the OSI layered method. We use this method to analyse and simulate a particular type of cross-layered solution, namely energy-aware routing protocols, examining two proposals from the literature with a layered approach. The applied strategy leads to a novel energy-aware solution that outperforms prior versions and provides clear insights into the role each layer plays in the overall optimization process. Network throughput, utilization, and reliability have all increased rapidly in recent years. With the emergence of broadband wireless and wired cellular networks, mobile ad-hoc networks (MANETs), and improved computational capacity, a new generation of applications, especially real-time multimedia applications, has emerged. Delivering real-time multimedia traffic across a sophisticated network like the Internet can be a particularly difficult undertaking, as these applications have stringent bandwidth and other quality-of-service (QoS) requirements.
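A minimal sketch of the energy-aware routing idea examined above, with hypothetical residual-energy values (the paper's actual protocols are more elaborate): among candidate paths, prefer the one whose bottleneck node has the most residual energy, so no single node is drained first.

```python
def bottleneck_energy(path, residual):
    # A path is only as strong as its most depleted node.
    return min(residual[node] for node in path)

def pick_energy_aware_path(paths, residual):
    # Max-min residual-energy selection, a common energy-aware metric.
    return max(paths, key=lambda p: bottleneck_energy(p, residual))

# Hypothetical residual battery levels per node (arbitrary units)
residual = {"A": 90, "B": 15, "C": 70, "D": 60}
paths = [["A", "B", "D"],   # bottleneck: B with 15 units
         ["A", "C", "D"]]   # bottleneck: D with 60 units

best = pick_energy_aware_path(paths, residual)  # -> ["A", "C", "D"]
```

In a cross-layered design, the residual-energy values would be exposed by the physical/MAC layers to the routing layer, which is exactly the kind of inter-layer information flow the OA is meant to coordinate.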

Sagarmoy Ganguly ◽  
Asoke Nath

Quantum cryptography is a comparatively new and special type of cryptography that uses quantum mechanics to protect data and provide unconditionally secure communication. This is achieved with Quantum Key Distribution (QKD) protocols, which represent an essential practical application of quantum computation. In this paper the authors explore the concept of QKD by reviewing how QKD works, examining a few QKD protocols, and then presenting a practical example of quantum cryptography using QKD, along with certain limitations from the perspective of computer science in particular and quantum physics in general.
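A minimal BB84-style key-sifting sketch in pure Python (a toy: a real QKD link encodes bits in photon polarizations, and the random source here is seeded only for reproducibility):

```python
import random

def bb84_sift(n_bits, seed=42):
    # Toy BB84 sifting: Alice encodes random bits in random bases
    # ('+' rectilinear or 'x' diagonal); Bob measures in random bases.
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]
    # With no eavesdropper, matching bases reproduce Alice's bit;
    # mismatched bases give a random result and are discarded.
    return [bit for bit, ab, bb
            in zip(alice_bits, alice_bases, bob_bases) if ab == bb]

key = bb84_sift(64)   # on average about half the positions survive
```

The security argument rests on the part this toy omits: an eavesdropper measuring in the wrong basis disturbs the qubit, so comparing a sample of the sifted key reveals her presence.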

Shreyashi Chowdhury ◽  
Asoke Nath

Natural language processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence concerned with the interactions between computers and human language, in particular how to program computers to process and analyse large amounts of natural language data. The goal is a computer capable of "understanding" the contents of documents, including the contextual nuances of the language within them. NLP combines computational linguistics (rule-based modelling of human language) with statistical, machine learning, and deep learning models. Together, these technologies enable computers to process human language in the form of text or voice data and to "understand" its full meaning, complete with the speaker's or writer's intent and sentiment. Challenges in natural language processing frequently involve speech recognition, natural language understanding, and natural language generation. This paper discusses the scope and challenges, current trends, and future directions of natural language processing.

Ropa Roy ◽  
Asoke Nath

A quantum gate, or quantum logic gate, is an elementary quantum circuit operating on a small number of qubits. Quantum gates can exploit two primary features of quantum mechanics that are entirely out of reach for classical gates: superposition and entanglement. Unlike most classical gates, quantum gates are also reversible. In classical computing, sets of logic gates are connected to construct digital circuits; similarly, quantum logic gates operate on input states, generally in superposition, to compute the output. In this paper the authors discuss in detail single- and multiple-qubit gates, and the scope and challenges of quantum gates.
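A minimal NumPy sketch of the two-qubit behaviour described above: a Hadamard gate on the first qubit followed by a CNOT entangles |00⟩ into the Bell state (|00⟩ + |11⟩)/√2, and reversibility shows up as unitarity of the gate matrices.

```python
import numpy as np

# Single-qubit Hadamard gate: creates an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# Two-qubit CNOT gate: flips the target qubit when the control is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], float)

state = np.array([1.0, 0.0, 0.0, 0.0])   # start in |00>
state = np.kron(H, I) @ state            # superpose the first qubit
state = CNOT @ state                     # entangle -> Bell state

# Reversibility: a quantum gate's inverse is its conjugate transpose.
assert np.allclose(CNOT @ CNOT.T, np.eye(4))
```

The final state has amplitude 1/√2 on |00⟩ and |11⟩ only: measuring either qubit immediately fixes the other, which is the entanglement a classical gate cannot produce.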

Rajarshi SinhaRoy

In this digital era, natural language processing is not just a computational process; rather, it is a way to communicate with machines in a humanlike manner. It has been used in many fields, from smart artificial assistants to health and emotion analyzers, and a digital era without natural language processing is hard to imagine. In natural language processing, the machine first reads the given information and then begins making sense of it; once the data has been properly processed, the machine acts, returning responses or completing the work. In this paper, I review the journey of natural language processing from the late 1940s to the present. The paper also covers several of the most salient and important works in this timeline, which lead to where we currently stand in the field. The review separates the history of natural language processing into four eras, each marked by a focus on machine translation, the impact of artificial intelligence, the adoption of a logico-grammatical style, and an attack on massive linguistic data. This paper helps readers understand the historical aspects of natural language processing and aims to inspire others to work and research in this domain.

Pinjari Vali Basha

<p>With the rapid transformation of technology, a huge amount of data (structured and unstructured) is generated every day. With the aid of 5G technology and IoT, the data generated and processed every day is very large: approximately 2.5 quintillion bytes.<br> This data (Big Data) is stored and processed with the help of the Hadoop framework, which has two core components for storing and retrieving data in the network.</p> <ul> <li>Hadoop Distributed File System (HDFS)</li> <li>MapReduce algorithm</li> </ul> <p>In the native Hadoop framework, the MapReduce algorithm has some limitations: if the same job is submitted again, all the steps of the native Hadoop pipeline must be carried out again before the results arrive, which wastes time and resources. Improving the capabilities of the NameNode, i.e., maintaining a Common Job Block Table (CJBT) at the NameNode, will improve performance, at the cost of maintaining the table.<br> The Common Job Block Table contains the metadata of files involved in repeated jobs. This avoids recomputation, reduces the number of computations, saves resources, and speeds up processing. Since the size of the Common Job Block Table keeps increasing, its size should be bounded by an algorithm that keeps track of the jobs; the optimal Common Job Block Table is derived by employing an optimal algorithm at the NameNode.</p>
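A minimal Python sketch of the Common Job Block Table idea (the class name, signature format, and LRU bound are hypothetical illustrations; the paper's CJBT lives inside the NameNode): cache the metadata of completed jobs keyed by a job signature, serve repeated jobs from the table, and bound the table's size.

```python
from collections import OrderedDict

class CommonJobBlockTable:
    # Hypothetical CJBT: maps a job signature (e.g. job name + input
    # path) to the metadata of a previously completed MapReduce job.
    # An LRU bound keeps the table from growing without limit.
    def __init__(self, max_entries=3):
        self.max_entries = max_entries
        self._table = OrderedDict()

    def lookup(self, job_signature):
        # Hit: repeated job, recomputation is skipped entirely.
        if job_signature in self._table:
            self._table.move_to_end(job_signature)   # refresh LRU order
            return self._table[job_signature]
        return None                                  # miss: run the job

    def record(self, job_signature, result_meta):
        self._table[job_signature] = result_meta
        self._table.move_to_end(job_signature)
        if len(self._table) > self.max_entries:
            self._table.popitem(last=False)          # evict LRU entry

cjbt = CommonJobBlockTable(max_entries=2)
cjbt.record(("wordcount", "/logs/day1"), {"blocks": [1, 2, 3]})
hit = cjbt.lookup(("wordcount", "/logs/day1"))   # served from the table
miss = cjbt.lookup(("wordcount", "/logs/day2"))  # new job, must execute
```

The LRU eviction is one simple realization of the size-bounding algorithm the abstract calls for; the paper leaves the choice of optimal policy open.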
