Lahore Garrison University Research Journal of Computer Science and Information Technology
Latest Publications


TOTAL DOCUMENTS: 17 (five years: 17)

H-INDEX: 0 (five years: 0)

Published by Lahore Garrison University

ISSN: 2521-0122, 2519-7991

Author(s):  
Salman Qadri

The purpose of this study is to highlight the significance of machine vision for the classification of kidney stones. A novel optimized fused-texture-feature framework was designed to identify stones in the kidney. A fused set of 234 texture features (GLCM, RLM, and histogram features) was extracted from each region of interest (ROI). Eight ROIs of sizes 16x16, 20x20, and 22x22 were taken from each image. The resulting feature space of 280,800 values (1200x234) was difficult to handle, so the POE+ACC feature-optimization technique was applied to select the 30 most discriminative features for each ROI. The optimized fused feature set of 36,000 values (1200x30) was fed to four machine vision classifiers: Random Forest, MLP, J48, and Naïve Bayes. Random Forest achieved the best result, 90% accuracy on the 22x22 ROI, among the deployed classifiers.
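The paper itself contains no code, but the pipeline is easy to illustrate. Below is a minimal sketch, assuming labelled grayscale ROIs, of extracting GLCM texture descriptors with scikit-image and classifying them with a Random Forest; the RLM and histogram features and the POE+ACC optimization step are omitted, and the data is randomly generated for illustration.

```python
# Sketch: GLCM texture features from an ROI, classified with Random Forest.
# RLM/histogram features and POE+ACC optimization are omitted for brevity;
# the ROIs and labels below are random stand-ins, not the paper's data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def glcm_features(roi):
    """Extract a few GLCM texture descriptors from one 2-D grayscale ROI."""
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(0)
rois = rng.integers(0, 256, size=(200, 22, 22), dtype=np.uint8)  # 22x22 ROIs
labels = rng.integers(0, 2, size=200)                            # stone / no stone

X = np.array([glcm_features(r) for r in rois])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```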


Author(s):  
Muhammad Waseem Iqbal

This article analyzes several User Interface (UI) designs and puts forward more general design principles for interfaces aimed at low-literate users. The results highlight the importance of text-free interfaces over text-based ones for the illiterate and low-literate population. The study developed a Short Message Service (SMS) interface combining several design elements, including graphical icons, voice, and text reduction. Participants were more satisfied with the designed SMS interface than with the traditional text-based SMS interface. We believe that if the user interface is appropriately designed, users will need neither formal literacy, computer skills, nor any external help to operate the application. An interface with minimal or no text but with graphics, audio, and digital components is shown to be helpful for users with low literacy.


Author(s):  
Sohaib Ahmad

The need to process and manage vast amounts of data grows with developing technology. Cloud computing is one of the most promising technologies for meeting the growing requirements for data-processing capability and storage capacity. Not every organization has the financial resources, infrastructure, and human capital to build its own, but cloud computing offers an affordable infrastructure based on availability, scalability, and cost-efficiency. The cloud can provide services to clients on demand, making it the most widely adopted system for virtual storage, but some of its issues remain inadequately addressed. One of them is load balancing, a primary challenge: traffic must be distributed adequately across every peer rather than overloading an individual node. This paper provides an intelligent workload-management algorithm that systematically balances traffic, allocates the load homogeneously across nodes to prevent overloading, and reduces response time for maximum performance.
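The abstract does not describe the algorithm's internals; the sketch below illustrates only the core idea of such balancing, dispatching each request to the currently least-loaded node via a min-heap. Node names and request costs are invented for illustration.

```python
# Sketch: least-loaded dispatch, the basic idea behind homogeneous
# workload allocation. Node names and request costs are illustrative.
import heapq

class LoadBalancer:
    def __init__(self, node_ids):
        # Min-heap of (current_load, node_id): the least-loaded node
        # is always at the front.
        self.heap = [(0, n) for n in node_ids]
        heapq.heapify(self.heap)

    def dispatch(self, cost):
        """Assign a request of the given cost to the least-loaded node."""
        load, node = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + cost, node))
        return node

lb = LoadBalancer(["node-1", "node-2", "node-3"])
for cost in [5, 3, 8, 2, 7, 4]:
    print(f"request(cost={cost}) -> {lb.dispatch(cost)}")
```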


Author(s):  
Rana Aamir Raza

In the area of fuzzy rough set theory (FRST), researchers have shown much interest in handling high-dimensional data. Rough set theory (RST) is an important tool for pre-processing data and helps obtain a better predictive model, but its discretization step may lose useful information; fuzzy rough set theory therefore works well with real-valued data. In this paper, an efficient FRST-based technique is presented to pre-process large-scale data sets and increase the efficacy of the predictive model. A fuzzy rough set-based feature selection (FRSFS) technique is paired with a random weight neural network (RWNN) classifier to obtain better generalization ability. Results on different datasets show that the proposed technique performs well and provides better speed and accuracy than FRSFS combined with other machine learning classifiers (i.e., KNN, Naïve Bayes, SVM, decision tree, and backpropagation neural network).
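An RWNN draws its hidden-layer weights at random and fits only the output weights in closed form, which is what makes it fast. A minimal NumPy sketch of that idea follows, on synthetic data standing in for an FRSFS-reduced feature set; the paper's actual FRSFS step is not reproduced here.

```python
# Sketch: an ELM-style random weight neural network. Hidden weights are
# random and frozen; only the output weights are fit, via least squares.
# The data is synthetic, standing in for an FRSFS-reduced feature set.
import numpy as np

rng = np.random.default_rng(0)

def train_rwnn(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden activations
    beta = np.linalg.pinv(H) @ y                  # closed-form output weights
    return W, b, beta

def predict_rwnn(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = rng.normal(size=(300, 10))                    # 10 "selected" features
y = (X[:, 0] + X[:, 1] > 0).astype(float)         # two noisy classes
W, b, beta = train_rwnn(X, y)
acc = ((predict_rwnn(X, W, b, beta) > 0.5) == y).mean()
print("training accuracy:", acc)
```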


Author(s):  
Muhammad Usman Ashraf

Cloud computing is one of the dominant storage solutions. However, its centralized storage method is not robust: cloud environments are vulnerable to several attacks that compromise the confidentiality, integrity, availability, and security of the network. Blockchain, on the other hand, enables a decentralized cloud storage system that ensures data security. This research focuses on decentralized, safe data storage, high data availability, and effective use of storage resources. To assess the state of the blockchain approach, we conducted a comprehensive survey of the most recent and promising state-of-the-art methods: the P2P network for data dissemination, hash functions for data authentication, and the IPFS (InterPlanetary File System) protocol for data integrity. Furthermore, we present a detailed comparison of blockchain consensus algorithms with respect to security, and we discuss the future of blockchain and cloud computing. The major focus of this study is securing data in cloud computing using blockchain and easing further research work for researchers.
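To make the IPFS integrity idea concrete: content addressing names data by the hash of its bytes, so a client can detect tampering by recomputing the hash on retrieval. The sketch below uses a plain SHA-256 digest and an in-memory dictionary as a stand-in for the P2P store; real IPFS CIDs use multihash/CID encoding, which is omitted here.

```python
# Sketch: content addressing for data integrity, as IPFS does in spirit.
# Real IPFS CIDs use multihash/CIDv1 encoding; plain SHA-256 is used here,
# and a dict stands in for the distributed P2P store.
import hashlib

def content_address(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

store = {}  # hypothetical stand-in for the P2P storage network

def put(data: bytes) -> str:
    cid = content_address(data)
    store[cid] = data
    return cid

def get(cid: str) -> bytes:
    data = store[cid]
    if content_address(data) != cid:   # integrity check on retrieval
        raise ValueError("data does not match its content address")
    return data

cid = put(b"example record")
print(cid, get(cid))
```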


Author(s):  
Muhammad Ejaz Sandhu

To test the behavior of Linux kernel modules, device drivers, and file systems under faulty conditions, researchers have tried injecting faults in artificial environments. Because such events are rare and unpredictable, localizing and detecting errors in the Linux kernel, device drivers, and file system modules is extremely difficult. Artificially introducing random faults during normal tests is the only known approach to such problems: a standard method is to generate synthetic faults and study their effects. Various fault-injection frameworks have been analyzed over the Linux kernel to simulate such detection. This paper compares the different approaches and techniques used for fault injection to test Linux kernel modules, including simulating low-resource conditions and detecting memory leaks. The frameworks used in these experiments are: Linux Test Project (LTP), KEDR, Linux Fault-Injection (LFI), and SCSI.
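None of these kernel-level frameworks is driven from Python, but the underlying idea, forcing a resource allocation to fail at random during an otherwise normal test and checking that the code under test degrades gracefully, can be sketched in a few lines. Everything below (the allocator, failure rate, and test body) is illustrative, not part of any of the named frameworks.

```python
# Sketch: randomly failing a resource allocation during normal tests,
# the basic idea behind the kernel fault-injection frameworks surveyed.
# The allocator, failure rate, and code under test are all illustrative.
import random

FAULT_RATE = 0.3  # probability that one allocation is forced to fail

def faulty_alloc(size):
    """Stand-in allocator that fails at random, like low-memory injection."""
    if random.random() < FAULT_RATE:
        raise MemoryError(f"injected fault: alloc({size})")
    return bytearray(size)

def run_test():
    """The code under test must survive injected failures gracefully."""
    try:
        header = faulty_alloc(1024)
        payload = faulty_alloc(4096)
        return "ok"
    except MemoryError as e:
        return f"handled: {e}"

random.seed(1)
for i in range(5):
    print(f"run {i}: {run_test()}")
```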


Author(s):  
Kausar Parveen

Nowadays, all organizations are moving towards digitalization. The adoption of digital technologies has made organizations seek the best and fastest digital solutions, and software companies are trying to draw consumers' attention by offering prompt services. In this regard, a critical issue in information technology and other areas of computation is how software can be created easily and rapidly for complex businesses. The main aim of this research is to show the role of agile methodology in rapid digital transformation. In this paper, we survey different agile methodologies and tools for rapid software development and introduce an agile management tool with a backlog. We identify the key practices of agile methods, and the survey suggests that the agile approach can help balance the applications developers generate against customer demand. This paper illuminates and translates agile methodologies into agile project-management tools for simple and rapid application development. Empirical research based on a case study is provided to better show the importance of agility in software development.


Author(s):  
Bilal Ahmad

The objective of this paper is to use deep learning to develop an intelligent digital twin for the operational support of a human-robot assembly station. The digital twin, as a virtual portrayal, is used to design, simulate, and optimize the complexity of the assembly system. For testing, a convolutional neural network (CNN) is integrated with the digital twin and applied to a collaborative robot performing an assembly task. Collaborative robots are a new form of industrial robot that can work safely alongside humans; they have attracted ample attention in recent years for automating simple to complex tasks.
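The paper does not give the network architecture; the following PyTorch sketch shows the kind of small CNN that might be paired with a digital twin to classify assembly-station camera frames. The two-block architecture, 64x64 RGB input, and four assembly-state classes are assumptions for illustration, not the paper's settings.

```python
# Sketch: a small CNN such as might be integrated with the digital twin
# to classify assembly-station camera frames. Architecture, 64x64 input,
# and 4 assembly-state classes are assumptions, not the paper's.
import torch
import torch.nn as nn

class AssemblyStateCNN(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = AssemblyStateCNN()
frames = torch.randn(8, 3, 64, 64)   # illustrative batch of camera frames
print(model(frames).shape)           # torch.Size([8, 4])
```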


Author(s):  
Qaiser Abbas

Information retrieval (IR) is the process of acquiring particular information from large resources and presenting it according to the user's need. The incredible increase in information resources on the Internet makes information retrieval a monotonous and complicated task for users. Given such information overload, better methodology is required to retrieve the most appropriate information from different sources. The most important information retrieval methods include the probabilistic, fuzzy set, vector space, and Boolean models. Each of these models is used to evaluate the connection between the query and the retrievable documents, and each is keyword-based, using lists of keywords to evaluate the information material. In this paper, we present a survey of these models, discussing their working methodology and limitations. This understanding is important because it makes it possible to select an information retrieval technique based on the basic requirements. The survey results show that existing models for knowledge retrieval fall somewhat short of what was planned. We also discuss different areas of IR application where these models could be used.
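As a concrete example of one of the surveyed models, the sketch below implements the vector space model with TF-IDF weighting and cosine similarity using scikit-learn; the three-document corpus and the query are invented for illustration.

```python
# Sketch: the vector space model with TF-IDF weighting and cosine
# similarity, one of the four IR models surveyed. Corpus is illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "cloud computing offers scalable storage",
    "machine learning improves information retrieval",
    "retrieval models rank documents by relevance",
]
query = "information retrieval models"

vec = TfidfVectorizer()
doc_vecs = vec.fit_transform(docs)      # document-term TF-IDF matrix
q_vec = vec.transform([query])          # query in the same vector space

scores = cosine_similarity(q_vec, doc_vecs).ravel()
for score, doc in sorted(zip(scores, docs), reverse=True):
    print(f"{score:.3f}  {doc}")        # documents ranked by similarity
```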


Author(s):  
Asif Yaseen

With the swift increase of mobile devices such as personal digital assistants, smartphones, and tablets, mobile commerce is broadly considered a driving force for the next wave of e-commerce. The power of mobile commerce lies primarily in anytime-anywhere connectivity and the use of mobile technology, which creates enormous opportunities to attract and engage customers. Many believe that in the era of m-commerce, especially in the telecommunication business, retaining customers is a big challenge. In an extremely competitive telecommunication industry, acquiring new customers is far more expensive than retaining existing ones, so it has become imperative to pay close attention to retaining existing customers in order to remain stable in a market of vibrant service providers. In the current market, many prevailing statistical techniques for customer churn management are being replaced by machine learning and predictive-analysis techniques. In this study, we employ feature selection to identify the most influential factors in customer churn prediction. We adopt a wrapper-based feature selection approach in which Particle Swarm Optimization (PSO) is used for the search and different classifiers, Decision Tree (DT), Naïve Bayes, k-NN, and Logistic Regression, are used for evaluation to assess performance on an optimally sampled and reduced dataset. Lastly, simulations show that the suggested method performs fairly well at forecasting churners and hence could be advantageous given the exponentially increasing competition in the telecommunication sector.
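A compact sketch of the wrapper approach follows: each binary-PSO particle encodes a feature mask that is scored by a decision tree's cross-validated accuracy. The synthetic data, swarm size, iteration count, and PSO constants are illustrative choices, not the paper's settings.

```python
# Sketch: wrapper feature selection with a simplified binary PSO.
# Each particle is a feature mask scored by a decision tree's CV accuracy.
# Data, swarm size, iterations, and PSO constants are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)

def fitness(mask):
    mask = mask.astype(bool)
    if not mask.any():                       # empty mask: useless particle
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

n_particles, n_iter, dim = 10, 15, X.shape[1]
vel = rng.normal(scale=0.1, size=(n_particles, dim))
pos = (rng.random((n_particles, dim)) > 0.5).astype(float)
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    # Sigmoid of velocity gives the probability each feature is selected.
    pos = (rng.random((n_particles, dim)) < 1 / (1 + np.exp(-vel))).astype(float)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved] = pos[improved]
    pbest_fit[improved] = fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest))
print("CV accuracy:", round(fitness(gbest), 3))
```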

