DATABASE – WEB INTERFACE VULNERABILITIES

2021 ◽  
Vol 17 (1) ◽  
pp. 279-287
Author(s):  
Dorin IORDACHE

The importance of information security in general, and of information managed at the level of a database in particular, has grown with the expansion of the Internet. It has also acquired new facets as users have gained access to an ever wider range of resources. The large volume of private data in use, and the need to limit unauthorized actions on information, have brought new aspects to the problem of ensuring its protection. The scope of this field is wide and allows work in several directions: identification, description, creation, implementation, and testing of mechanisms aimed at improving the environment in which database management systems operate. Given the importance of the information managed by a DBMS[1], it is necessary to define a framework that is both safe and easy to use. The database fulfills not only the role of storage but also that of data provider to users. Thus, the information must be protected throughout the interaction process: generation, storage, processing, modification, deletion, etc. Therefore, database security must not be reduced to the protection of certain data considered sensitive; it must also extend to the creation of a secure, authorized, and controlled global environment through which information becomes available to users.   [1] DBMS – DataBase Management System
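One concrete mechanism behind such a controlled environment, in the context of a web interface, is binding user input as query parameters instead of splicing it into SQL text. The schema and function names below are illustrative only, not taken from the paper; a minimal Python/sqlite3 sketch:

```python
import sqlite3

# In-memory database standing in for a DBMS behind a web interface.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # Vulnerable: user input is concatenated into the SQL text.
    return conn.execute(
        "SELECT secret FROM users WHERE name = '%s'" % name).fetchall()

def lookup_safe(name):
    # Parameterized: input is bound as data, never parsed as SQL.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(lookup_unsafe(payload))  # injection returns every row
print(lookup_safe(payload))    # returns []: no user has that literal name
```

The unsafe variant lets the classic `' OR '1'='1` payload rewrite the query; the parameterized variant treats the same input as an ordinary string value.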

Author(s):  
Alberto Mendoza ◽  
Aristóteles Uribe ◽  
Claudia Z. Gil ◽  
Emilio Mayoral

Two years ago, the Mexican Transportation Institute began to develop a computer-based system for managing the information collected by various organizations about accidents occurring on the Federal Road Network. This system combines the information gathered by these organizations so that the data can be completed and validated, tools can be developed for processing and analyzing the validated data, and the processed data and developed tools can be made available to users. It was decided to base the development of these efforts on computer databases already being generated, on database processing and management software, on geographic information systems, and on remote data-exchange systems (e.g., the Internet). The progress made so far in the development of the computer system is reviewed. The system has been named the “Relational Accident Database Management System for Mexican Federal Roads” (SAIACF, in Spanish). The information sources beneficial to this project are identified and analyzed. The ideal scheme conceived for integrating the various information sources is presented, and the SAIACF system is outlined. Some of the results obtained after its application to the 1997 data are shown. Finally, the element generated to make the information and the tools available to users is described, and conclusions are drawn.


Author(s):  
Ismail Omar Hababeh ◽  
Muthu Ramachandran

Database technology has been a significant field for developing real-life applications in network information systems. An enterprise’s reliance on its network and database applications in a Distributed Database Management System (DDBMS) environment is likely to continue growing exponentially. In such a system, estimating and predicting Quality of Service (QoS) improvements is crucial, since it deepens understanding of the issues that affect the behaviour of the distributed database network: database fragmentation, clustering of database network sites, and data allocation and replication, which reduce the amount of irrelevant data accessed and speed up transaction response times. This chapter introduces trends in database management systems (DBMS) and presents an integrated method for designing a Distributed Relational networking Database Management System (DRDBMS) that efficiently and effectively achieves the objectives of database fragmentation, clustering of database network sites, and fragment allocation and replication. It is based on high-speed partitioning, clustering, and data allocation techniques that minimize the data fragments accessed and the data transferred through the network sites, maximize overall system throughput by increasing the degree of concurrent processing of transactions on multiple fragments located at different sites, and result in better QoS design and decision support.
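The allocation objective described above can be illustrated with a toy greedy scheme: place each fragment at the site that accesses it most often, so that most transactions stay local. The frequencies and the greedy rule below are invented for illustration and are not the chapter’s method:

```python
# access[fragment][site] = how often each site accesses each fragment
access = {
    "F1": {"S1": 90, "S2": 10, "S3": 5},
    "F2": {"S1": 4,  "S2": 70, "S3": 6},
    "F3": {"S1": 20, "S2": 15, "S3": 60},
}

def allocate(access):
    # Greedy rule: each fragment goes to the site that uses it most.
    return {frag: max(freqs, key=freqs.get) for frag, freqs in access.items()}

def remote_cost(access, placement):
    # Total access frequency that must cross the network.
    return sum(f for frag, freqs in access.items()
               for site, f in freqs.items() if site != placement[frag])

placement = allocate(access)
print(placement)                       # {'F1': 'S1', 'F2': 'S2', 'F3': 'S3'}
print(remote_cost(access, placement))  # 60
```

A real DDBMS design would also weigh replication, update costs, and site capacity; the sketch only shows why allocation directly drives the volume of data moved between sites.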


2000 ◽  
Vol 09 (01n02) ◽  
pp. 147-169
Author(s):  
PATRICK MARTIN ◽  
WENDY POWLEY ◽  
ANDREW WESTON ◽  
PETER ZION

In the not too distant past, the amount of online data available to general users was relatively small. Most of it was maintained in organizations' database management systems and was accessible only through the interfaces provided by those systems. The popularity of the Internet, in particular, has meant that there is now an abundance of online data available to users in the form of Web pages and files. This data, however, is maintained in passive data sources, that is, sources that do not provide facilities to search or query their data; it must instead be queried and examined using applications such as browsers and search engines. In this paper, we explore an approach to querying passive data sources based on the extraction, and subsequent exploitation, of metadata from the data sources. We describe two situations in which this approach has been used, evaluate the approach, and draw some general conclusions.
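The extraction step can be sketched with Python's standard `html.parser`: pull the `<title>` and `<meta>` name/content pairs out of a static Web page so they can later be queried. The sample page is invented, and this is only a minimal stand-in for the paper's approach:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collects the <title> and <meta name=... content=...> pairs of a page,
    a minimal version of extracting metadata from a passive data source."""
    def __init__(self):
        super().__init__()
        self.metadata = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs and "content" in attrs:
            self.metadata[attrs["name"]] = attrs["content"]

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.metadata["title"] = data.strip()

page = """<html><head><title>Survey Data</title>
<meta name="keywords" content="metadata, querying"></head><body>...</body></html>"""

p = MetaExtractor()
p.feed(page)
print(p.metadata)  # {'title': 'Survey Data', 'keywords': 'metadata, querying'}
```

Once harvested this way, the metadata can be loaded into an ordinary DBMS and queried, which is the sense in which a passive source becomes queryable.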


Author(s):  
George Garman

This paper discusses the issues involved in developing two online database courses at The Metropolitan State College of Denver: CMS 3060 Database Management Systems and CMS 4060 Advanced Database Management Systems. In addition to the issues that arise in developing regular courses, technology-driven courses pose a special set of problems. The paper examines the integration of a remote relational database management system (Oracle) into an online course, and also discusses the use of the PC-based Oracle Developer 2000 product in an online course.


2020 ◽  
Vol 21 (1) ◽  
pp. 1-2
Author(s):  
Swaminathan JN ◽  
Gopi Ram ◽  
Sureka Lanka

The evolution of the Internet of Things has given rise to a Smart World in which devices, systems, and processes are better integrated with humans through all-pervasive connectivity. Anytime, anywhere connection and transaction is the motto of the Internet of Things, which brings comfort to users and sweeps the problem of physical boundaries out of the way. Once it came into the purview of developers, new areas were identified and new applications introduced. Small wearables that can track your health and big automated vehicles that navigate from one place to another without human intervention are the order of the day. The IoT has also brought the cloud into prominence, since it involves a large number of devices connected to the Internet, continuously pumping data into the cloud for storage and processing. Another area that has benefited from the evolution of the IoT is wireless and wired connectivity through a wide range of connectivity standards. As with any technology, it has also created many concerns regarding security, privacy, and ethics.   Data protection issues created by new technologies are a threat that developers, the public, and governing bodies recognized long ago. The complexity of the problem arises because the various sensors and technologies clearly reveal the patterns of activity of individuals as well as organizations, making us prone to threats. Moreover, the volume of data in the cloud makes it very difficult to recognize the privacy requirements of the data or to segregate open data from private data. Data analytics is another technology that supposedly increases business opportunities by studying this private data collected from the IoT and exploring ways to monetize it. It also helps individuals by recognizing their priorities and narrowing their searches.
But the data collected are real-world data, and their aggregation in the cloud is an open invitation for hackers to study the behaviour of individuals.   This special issue of Scalable Computing, devoted to the Role of Scalable Computing and Data Analytics in the Evolution of the Internet of Things, attracted 28 submissions, of which 12 were selected.


1995 ◽  
Vol 81 (2) ◽  
pp. 355-364
Author(s):  
Elisabeth Tenvergert ◽  
Johannes Kingma ◽  
Henk J. Klasen

The interchange of data between different database management systems and statistical packages may be hampered by differing data-structure formats. The programs FIXFREE and GENHDR convert database files between formats: ASCII fixed format to comma-delimited free ASCII format and vice versa. The programs may be used either as stand-alone programs or as procedures within a database management system. They also contain a procedure to import comma-delimited free ASCII files into database management systems, e.g., DBase III, DBase IV, or the different versions of FOXPRO. The programs are written in TURBO PASCAL (Version 6.0).
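Since the original programs are written in Turbo Pascal and not reproduced in the abstract, the transformation they perform can be sketched in Python; the column widths below are assumed for illustration, where a real run would take them from a format description:

```python
WIDTHS = [(0, 6), (6, 10), (10, 14)]  # (start, end) of each fixed-width field

def fixed_to_csv(line):
    """ASCII fixed format -> comma-delimited free format."""
    fields = [line[a:b].strip() for a, b in WIDTHS]
    return ",".join(fields)

def csv_to_fixed(line):
    """Comma-delimited free format -> ASCII fixed format,
    padding each field with spaces to its column width."""
    fields = line.split(",")
    return "".join(f.ljust(b - a) for f, (a, b) in zip(fields, WIDTHS))

record = "10423 M   35.5"
csv = fixed_to_csv(record)
print(csv)                # "10423,M,35.5"
print(csv_to_fixed(csv))  # round-trips back to the fixed-width record
```

The two directions are inverses as long as field values fit their columns, which is exactly the property a converter between a DBMS and a statistical package relies on.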


GEOgraphia ◽  
2009 ◽  
Vol 4 (7) ◽  
pp. 65
Author(s):  
Gilberto Pessanha Ribeiro

ABSTRACT A database management system can include a database extension dedicated to the storage and retrieval of metadata based on abstract objects. The aim of this work is to reach a better understanding of the basic requirements for a metadata manager to operate satisfactorily, and to test a storage model as thoroughly as possible, evaluating its performance against the various types of existing geographic data. The paper points to the need to know more about the performance of systems, already developed or under development, whose purpose is to manage digital geographic metadata. Keywords: Geographic Information Systems, Geoprocessing, Geography, Databases, Computation.


2021 ◽  
Vol 1 (3) ◽  
pp. 100-105
Author(s):  
Muhammad Fakhimuddin ◽  
Uswatun Khasanah ◽  
Rini Trimiyati

Data management is part of information resource management and covers all activities that ensure that a company's data resources are accurate, up to date, safe from tampering, and available to users and the company. This study aims to examine the role of the database management concept in managing large volumes of company data. The results showed that data management activities include data collection, integrity checking and testing, storage, maintenance, security, organization, and retrieval. The main database structures are the hierarchical, network, and relational structures.
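The structural distinction named at the end can be made concrete with a toy example: the same invented department/employee data held as a hierarchy (each record under a single parent) and as flat relational tables joined by keys:

```python
# Hierarchical structure: each record hangs under exactly one parent node.
hierarchical = {
    "Sales": {"employees": ["Ana", "Ben"]},
    "IT":    {"employees": ["Cara"]},
}

# Relational structure: flat tables linked by keys and queried by joining.
departments = [(1, "Sales"), (2, "IT")]       # (dept_id, name)
employees = [("Ana", 1), ("Ben", 1), ("Cara", 2)]  # (name, dept_id)

def employees_in(dept_name):
    # A hand-rolled join of the two tables on dept_id.
    dept_id = next(i for i, n in departments if n == dept_name)
    return [e for e, d in employees if d == dept_id]

print(employees_in("Sales"))  # ['Ana', 'Ben']
```

The hierarchy answers "who works in Sales" by navigation from the parent; the relational form answers it by a join, which is what lets it also answer questions the hierarchy was not designed around.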


Author(s):  
Dionysios Politis

In this chapter, data-mining techniques are presented that can be used to create data profiles of individuals from anonymous data found freely and abundantly in open environments such as the Internet. Although in most cases such information takes the form of an approximation rather than a factual, solid representation of concrete personal data, it nevertheless takes advantage of the vast increase in the amount of data recorded by database management systems and by numerous archiving applications and repositories of multimedia files.
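The general idea, approximating a profile rather than recovering exact personal data, can be sketched as a term-frequency aggregation over invented public posts attributed to one pseudonym; this is a toy illustration, not a technique from the chapter:

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "to", "and", "my", "i", "of", "in"}

# Invented posts standing in for freely available text by one pseudonym.
posts = [
    "Great ride in the mountains, my bike held up well",
    "New bike components arrived, the weekend ride is on",
    "Training for the mountains again",
]

def interest_profile(texts, top=3):
    # Count content words across all texts; the most frequent ones form
    # an approximate interest profile of the (anonymous) author.
    words = Counter()
    for t in texts:
        for w in re.findall(r"[a-z]+", t.lower()):
            if w not in STOPWORDS:
                words[w] += 1
    return [w for w, _ in words.most_common(top)]

print(interest_profile(posts))
```

Even this crude aggregation surfaces "ride", "bike", and "mountains" as recurring themes, which is the sense in which scattered anonymous data yields an approximate, not factual, profile.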

