Automatic Detection System of Web-Based Malware for Management-Type SaaS

2010 ◽  
Vol 129-131 ◽  
pp. 670-674
Author(s):  
Xu Jing ◽  
Dong Jian He ◽  
Lin Sen Zan ◽  
Jian Liang Li ◽  
Wang Yao

In management-type SaaS, users must be permitted to submit tenant business data to the service provider's server, and that data may carry embedded web-based malware. In this paper, we propose an automatic detection method for web-based malware based on behavior analysis, which helps meet the SLA by detecting web-based malware proactively. First, the tenant's update is downloaded to a bastion host by a web crawler. Second, the update is opened in Internet Explorer and the resulting behavior is monitored; to interrupt malicious behavior during detection, a monitoring DLL is injected into the IE process. Finally, if sensitive operations occur, the URL is appended to a malicious-address database and, at the same time, the system administrator is notified by SMS. Test results show that our method detects web-based malware accurately, which helps improve the service level of management-type SaaS.
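Below is a minimal Python sketch of the three-step pipeline described in this abstract, under one simplifying assumption: the real system observes the page through a DLL injected into Internet Explorer, while here that monitor is reduced to a stub. The function names (crawl_update, monitor_in_sandbox, check_url) and the example sensitive operations are illustrative, not taken from the paper.

```python
# Conceptual sketch of the detection pipeline; all names are illustrative.
import urllib.request

SENSITIVE_OPS = {"CreateFile", "WriteRegistry", "CreateProcess"}  # assumed examples


def crawl_update(url: str, bastion_dir: str) -> str:
    """Step 1: download the tenant's update to the bastion host."""
    path = f"{bastion_dir}/update.html"
    urllib.request.urlretrieve(url, path)
    return path


def monitor_in_sandbox(path: str) -> set[str]:
    """Step 2 (stub): open the page and record the operations it triggers.
    In the paper this is Internet Explorer observed through an injected DLL."""
    return set()  # placeholder for the behavior monitor


def check_url(url: str, bastion_dir: str, malicious_db: list[str]) -> bool:
    """Step 3: record the URL and alert the administrator on sensitive operations."""
    observed = monitor_in_sandbox(crawl_update(url, bastion_dir))
    hits = observed & SENSITIVE_OPS
    if hits:
        malicious_db.append(url)                                   # malicious-address database
        print(f"SMS to admin: {url} triggered {sorted(hits)}")     # SMS notification stand-in
        return True
    return False
```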

CCIT Journal ◽  
2019 ◽  
Vol 12 (1) ◽  
pp. 85-96
Author(s):  
Asmah Akhriana ◽  
Andi Irmayana

As information technology continues to evolve, securing information has become increasingly important, especially on networks connected to the internet. Unfortunately, technological development is often not matched by corresponding advances in security, so many systems remain weak and need stronger defenses. This study aims to design a web-based application interface that helps users and administrators secure networked computers against various types of attacks. An intrusion detection system (IDS) approach is used to detect suspicious activity in a system or network using Snort and a honeypot. The honeypot is deployed on a computer together with Apache, MySQL, and Snort; it acts as a decoy that attracts attackers and logs their activity, while Snort applies detection rules configured from the web interface. The functional system is then tested using the black-box testing method. The results of this study show that the web application interface can help users and administrators protect data and information on server computers against various types of attacks on computer networks.
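As a rough illustration of how such a web interface could surface Snort activity, the following Python sketch parses alerts written in Snort's "fast" alert format; the log path, the regular expression, and the field names are assumptions about a typical installation, not details from the study.

```python
# A minimal sketch, assuming Snort writes alerts in its "fast" format to
# /var/log/snort/alert; the path and exact format may differ per installation.
import re

ALERT_RE = re.compile(
    r"(?P<time>\S+)\s+\[\*\*\]\s+\[(?P<sid>[\d:]+)\]\s+(?P<msg>.+?)\s+\[\*\*\].*?"
    r"\{(?P<proto>\w+)\}\s+(?P<src>\S+)\s+->\s+(?P<dst>\S+)"
)


def parse_alerts(path: str = "/var/log/snort/alert") -> list[dict]:
    """Parse Snort fast-format alert lines into dictionaries for display."""
    alerts = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = ALERT_RE.search(line)
            if m:
                alerts.append(m.groupdict())
    return alerts


if __name__ == "__main__":
    for a in parse_alerts():
        print(a["time"], a["msg"], a["src"], "->", a["dst"])
```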


We present in this paper an integrated framework for the collection and analysis of facet-based text data. The framework consists of four components: (1) a user interface, (2) a web crawler, (3) a data analyzer, and (4) a database (DB). The user interface is used to set input facets and option values for web crawling and text data analysis through a graphical user interface (GUI), and it also presents the research outcomes via data visualization. The web crawler collects text data from articles posted on the web based on the input facets. The data analyzer classifies articles as "relevant" (i.e., containing the word sets expected in such posts) or "non-relevant" using predefined information; it then analyzes the text data of the relevant articles and visualizes the results of the analysis. Finally, the DB stores the collected text data, the predefined user-defined knowledge, and the outcomes of the data analysis and visualization. We verify the feasibility of the integrated framework by means of proof-of-concept (PoC) prototyping. The experimental results show that the implemented prototype reliably collects and analyzes the text data of the articles.
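The relevance step could look roughly like the Python sketch below, assuming the "predefined information" is a set of facet keywords that must appear in an article's text; the function name, the hit threshold, and the sample articles are illustrative only.

```python
# Hypothetical sketch of the relevant / non-relevant classification step.
def is_relevant(text: str, facet_keywords: set[str], min_hits: int = 1) -> bool:
    """An article is 'relevant' if it contains at least min_hits facet keywords."""
    words = set(text.lower().split())
    return len(words & {k.lower() for k in facet_keywords}) >= min_hits


articles = {
    "doc1": "Deep learning improves crop yield prediction in arid regions",
    "doc2": "Local football results from last weekend",
}
facet = {"crop", "yield", "prediction"}

relevant = {doc: txt for doc, txt in articles.items() if is_relevant(txt, facet, 2)}
print(sorted(relevant))  # -> ['doc1']
```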


2021 ◽  
Vol 8 ◽  
Author(s):  
Ali Al-Hemoud ◽  
Manar AlSaraf ◽  
Mariam Malak ◽  
Musab Al-Shatti ◽  
Meshael Al-Jarba ◽  
...  

This study aimed at the development of an analytic web-based system for the assessment of animal health in Kuwait. The data sources were based on the World Organization for Animal Health (OIE) and the World Animal Health Information System (WAHIS) repository, with data gathered for the period 2005–2020. An online web-based system using TABLEAU Creator was developed for the monitoring and surveillance of animal disease outbreaks. Five animal diseases were identified in Kuwait, namely HPAI, FMD, glanders, LSD, and MERS-CoV. The highest numbers of outbreaks were recorded for HPAI, followed by FMD. Examples of spatio-temporal visualizations of the web-based mappings are presented, including disease cases, numbers of outbreaks, and farm locations, among other features. The web-based system can serve as a monitoring tool to easily display the status of animal health in Kuwait. It can also serve to quickly identify and track disease outbreaks and to monitor the spread patterns of new or emerging animal diseases between neighboring countries.
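The kind of aggregation behind such a dashboard can be sketched as follows; this is a hypothetical Python example with placeholder records, not actual WAHIS figures, since the study builds its views in TABLEAU Creator rather than in code.

```python
# Placeholder records in an assumed (disease, year, outbreaks) layout.
from collections import defaultdict

records = [
    ("HPAI", 2007, 3), ("HPAI", 2020, 5),
    ("FMD", 2012, 2), ("Glanders", 2010, 1),
]

outbreaks_per_disease: dict[str, int] = defaultdict(int)
for disease, year, count in records:
    outbreaks_per_disease[disease] += count

# Rank diseases by total recorded outbreaks, as a dashboard summary would.
for disease, total in sorted(outbreaks_per_disease.items(), key=lambda kv: -kv[1]):
    print(disease, total)
```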


2020 ◽  
Vol 10 (11) ◽  
pp. 3837
Author(s):  
Julio Hernandez ◽  
Heidy M. Marin-Castro ◽  
Miguel Morales-Sandoval

The Web has become the main source of information in the digital world, expanding to heterogeneous domains and continuously growing. By means of a search engine, a domain-unaware web search tool that maintains real-time information, users can systematically search the web for particular information based on a text query. One type of web search tool is the semantic focused web crawler (SFWC); it exploits the semantics of the Web, relying on ontology heuristics to determine which web pages belong to the domain defined by the query. An SFWC is highly dependent on its ontological resource, which must be created by human domain experts. This work presents a novel SFWC based on a generic knowledge representation schema to model the crawler’s domain, thus reducing the complexity and cost of constructing a more formal representation, as is the case when using ontologies. Furthermore, a similarity measure combining the inverse document frequency (IDF) metric, the standard deviation, and the arithmetic mean is proposed for the SFWC. This measure filters web page contents in accordance with the domain of interest during the crawling task. A set of experiments was run over the domains of computer science, politics, and diabetes to validate and evaluate the proposed crawler. The quantitative (harvest ratio) and qualitative (Fleiss’ kappa) evaluations demonstrate the suitability of the proposed SFWC for crawling the Web using a knowledge representation schema instead of a domain ontology.
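One plausible reading of the proposed measure is sketched below in Python: IDF values are computed from a small set of seed documents representing the crawler's domain, and a crawled page is kept when the arithmetic mean of the IDF of its known terms lies within one standard deviation of the seed-corpus mean. The exact combination used in the paper may differ; all names here are illustrative.

```python
# A sketch, not the paper's formula: filter pages using IDF, mean, and std.
import math
from statistics import mean, stdev


def idf_table(seed_docs: list[str]) -> dict[str, float]:
    """idf(t) = log(N / df(t)) computed over the seed documents."""
    n = len(seed_docs)
    df: dict[str, int] = {}
    for doc in seed_docs:
        for term in set(doc.lower().split()):
            df[term] = df.get(term, 0) + 1
    return {t: math.log(n / f) for t, f in df.items()}


def accept_page(page_text: str, idf: dict[str, float]) -> bool:
    """Keep the page if the mean IDF of its known terms is near the corpus mean."""
    corpus_mean, corpus_std = mean(idf.values()), stdev(idf.values())
    scores = [idf[t] for t in page_text.lower().split() if t in idf]
    if not scores:
        return False  # no domain vocabulary found on the page
    return abs(mean(scores) - corpus_mean) <= corpus_std
```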


2019 ◽  
Vol 11 (1) ◽  
pp. 8-19
Author(s):  
Crystal Jelita Lumban Tobing

KPPN Medan II is one of the government organizational units under the Ministry of Finance. Its leaders and employees frequently carry out official trips within and outside the city, so SPPD (official travel order) documents are produced often. SPPD documents at KPPN Medan II are still prepared manually by recording them in Microsoft Word, which is neither effective nor efficient. When assigning employees to official duties, officers enter employee data by hand, so records of who received a travel order are prone to being lost. A web-based SPPD application was therefore built using the prototyping method. It is expected to help the SPPD management officers at KPPN Medan II produce SPPD documents that are effective, efficient, accurate, and time-saving, and to prevent the loss of SPPD data for KPPN Medan II employees who have made official trips, because a dedicated database accommodates all SPPD records.
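A minimal sketch of such a dedicated SPPD store is shown below, using SQLite from Python purely for illustration; the table and column names are assumptions, not taken from the application.

```python
# Illustrative schema for storing SPPD (official travel order) records.
import sqlite3

conn = sqlite3.connect("sppd.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS sppd (
        id          INTEGER PRIMARY KEY AUTOINCREMENT,
        employee    TEXT NOT NULL,       -- employee assigned to the official trip
        destination TEXT NOT NULL,
        start_date  TEXT NOT NULL,       -- ISO date strings
        end_date    TEXT NOT NULL
    )
    """
)
conn.execute(
    "INSERT INTO sppd (employee, destination, start_date, end_date) VALUES (?, ?, ?, ?)",
    ("A. Example", "Jakarta", "2019-03-01", "2019-03-03"),
)
conn.commit()
print(conn.execute("SELECT employee, destination FROM sppd").fetchall())
```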


Sensi Journal ◽  
2020 ◽  
Vol 6 (2) ◽  
pp. 236-246
Author(s):  
Ilamsyah Ilamsyah ◽  
Yulianto Yulianto ◽  
Tri Vita Febriani

A correct and appropriate system for receiving and transferring goods is needed by the company. At PDAM Tirta Kerta Raharja, Tangerang Regency, the process of receiving and transferring goods from the central warehouse to the branch warehouse is currently done manually and remains ineffective and inaccurate, because the Head of Subdivision uses paper receipt documents (PPBP) and goods-mutation documents (MPPW) as the submission media. The Head of Subdivision enters receipt and mutation data manually, which takes a relatively long time because, when a transfer of goods is requested, the inventory in the central warehouse must first be checked. Therefore, a web-based, database-backed information system for the receipt and transfer of goods from the central warehouse to the branch warehouse is needed so that the process becomes more effective, efficient, and accurate. With such a system, the Head of Subdivision can more easily input receipt and transfer data and control stock inventory periodically. Data were collected through observation, interviews, and a literature study of previous research; the system analysis uses the Waterfall method to solve the problem, the design uses object-oriented visual modeling with UML, and the implementation uses PHP with MySQL as the database.
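The core stock-mutation step can be illustrated with the short Python sketch below; the study's system is implemented in PHP with MySQL, so the dictionaries and function here are only illustrative stand-ins for the warehouse tables.

```python
# Illustrative in-memory stand-ins for the central and branch warehouse stock tables.
central_stock = {"pipe": 120, "valve": 40}
branch_stock = {"pipe": 10, "valve": 0}


def transfer(item: str, qty: int) -> bool:
    """Move qty of item from the central warehouse to the branch warehouse,
    after checking central inventory (the check now done by hand)."""
    if central_stock.get(item, 0) < qty:
        return False  # not enough stock in the central warehouse
    central_stock[item] -= qty
    branch_stock[item] = branch_stock.get(item, 0) + qty
    return True


print(transfer("pipe", 20), central_stock["pipe"], branch_stock["pipe"])  # True 100 30
```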

