ANALISIS KINERJA ISCSI TARGET PADA WIRELESS LAN MEMAKAI STANDAR LIO (Performance Analysis of an iSCSI Target on a Wireless LAN Using the LIO Standard)

SINERGI ◽  
2015 ◽  
Vol 19 (1) ◽  
pp. 25
Author(s):  
Rizal Bahaweres ◽  
Tjetjep Rony Budiman ◽  
Andi Adriansyah

The growing need for data centers and computer laboratories in Indonesia is driven by the increasing number of users who rely on computers for both business and education. One essential requirement of computer use is storage, ranging from USB flash disks, external hard disks, and internal hard disks up to large-scale storage for servers located in data centers, laboratories, or computer networks. Data storage has continued to evolve with the emergence of networked computing, which introduced storage alternatives such as DAS, NAS, FC, FCoE, and iSCSI. iSCSI uses the standard TCP/IP protocol over Ethernet to provide block-based storage. Today there are two main multiprotocol SCSI targets in the industry, LIO and COMSTAR, which replace earlier technologies such as IET, SCST, and STGT. LIO (linux-iscsi.org) is the standard open-source iSCSI target for sharing storage on Linux. LIO supports multiple storage fabrics, including Fibre Channel (QLogic), FCoE, IEEE 1394, iSCSI, iSER (Mellanox InfiniBand), SRP (Mellanox InfiniBand), USB, vHost, and others.
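As a rough illustration of the kind of measurement such a performance analysis involves, the sketch below times sequential reads from a block device attached over iSCSI; the device path, block size, and read volume are placeholder assumptions, not values from the paper.

```python
import time

# Hypothetical path of a block device exposed by the LIO iSCSI target and
# attached on the initiator side (e.g. via open-iscsi); adjust for your setup.
DEVICE = "/dev/sdb"
BLOCK_SIZE = 1024 * 1024          # read in 1 MiB chunks
TOTAL_BYTES = 256 * 1024 * 1024   # read 256 MiB in total

def sequential_read_throughput(path, block_size, total_bytes):
    """Return sequential read throughput in MB/s over the given span."""
    read = 0
    start = time.monotonic()
    # O_DIRECT would bypass the page cache for a fairer measurement,
    # but plain unbuffered reads keep the sketch portable.
    with open(path, "rb", buffering=0) as dev:
        while read < total_bytes:
            chunk = dev.read(block_size)
            if not chunk:
                break
            read += len(chunk)
    elapsed = time.monotonic() - start
    return (read / (1024 * 1024)) / elapsed

if __name__ == "__main__":
    mbps = sequential_read_throughput(DEVICE, BLOCK_SIZE, TOTAL_BYTES)
    print(f"Sequential read over the wireless LAN: {mbps:.1f} MB/s")
```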

Author(s):  
Tom Clark

Data storage is playing an increasingly visible role in securing application data in the data center. Today virtually all large enterprises and institutions worldwide have implemented networked storage infrastructures to provide high performance input/output (I/O) operations, high availability access, consolidation of storage assets, and data protection and archiving. Storage area networks (SANs) are typically based on Fibre Channel technology and are normally contained within the physical confines of the data center. The security of this physical isolation, however, has proven inadequate to safeguard data from inadvertent or malicious disruption. Both established and emerging Fibre Channel and IP standards are required to secure the storage infrastructure and protect data assets from corruption or misappropriation. This paper provides an overview of storage networking technology and the security mechanisms that have been developed to provide data integrity for data center storage infrastructures.


2021 ◽  
Vol 230 ◽  
pp. 110599
Author(s):  
Xu Han ◽  
Wei Tian ◽  
Jim VanGilder ◽  
Wangda Zuo ◽  
Cary Faulkner

Compiler ◽  
2015 ◽  
Vol 4 (2) ◽  
Author(s):  
Hero Wintolo ◽  
Lalu Septian Dwi Paradita

Cloud computing is a form of information technology widely used in computer networks and on the Internet. It comprises computer hardware, networking devices, and software, and offers three service models: Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). The cloud computing application built in this work is an Infrastructure-as-a-Service data storage service that uses an Android smartphone as the storage medium, relying on the FTP server already available on the smartphone. This makes it easy to store data on the various types of internal and external storage of a smartphone acting as a storage server. In addition to storage, the service supports streaming of .mp3 files. The system can be deployed on a local network over a wireless LAN, and user testing with a Likert-scale questionnaire shows that the application runs and functions properly.
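As a hedged sketch of how a client might talk to the smartphone's FTP server using Python's standard ftplib, the example below uploads and retrieves a file; the host, port, and credentials are placeholder assumptions, not details from the paper.

```python
from ftplib import FTP

# Assumed address and credentials of the FTP server app running on the
# Android smartphone; these values are placeholders, not from the paper.
HOST, PORT = "192.168.1.50", 2221
USER, PASSWORD = "android", "secret"

def upload(local_path, remote_name):
    """Store a local file on the smartphone over FTP."""
    with FTP() as ftp:
        ftp.connect(HOST, PORT)
        ftp.login(USER, PASSWORD)
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)

def download(remote_name, local_path):
    """Retrieve a file from the smartphone over FTP."""
    with FTP() as ftp:
        ftp.connect(HOST, PORT)
        ftp.login(USER, PASSWORD)
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {remote_name}", f.write)

if __name__ == "__main__":
    upload("song.mp3", "song.mp3")
    download("song.mp3", "song_copy.mp3")
```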


Author(s):  
Ganesh Chandra Deka

NoSQL databases are designed to meet the huge data storage requirements of cloud computing and big data processing. They offer many advanced features in addition to conventional RDBMS features; hence, they are popularly known as "Not only SQL" databases. A variety of NoSQL databases, with differing features for dealing with exponentially growing data-intensive applications, are available as both open-source and proprietary options. This chapter discusses some of the popular NoSQL databases and their features in light of the CAP theorem.
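As a toy illustration (not taken from the chapter) of the CAP trade-off the abstract refers to, the sketch below shows a two-replica key-value store that either refuses writes during a partition (favoring consistency) or accepts them and risks divergence (favoring availability).

```python
# Toy illustration of the CAP trade-off: a key-value store with two replicas
# that behaves as CP (refuse writes during a partition) or AP (accept them
# and tolerate temporary divergence). Not taken from any real NoSQL database.

class Replica:
    def __init__(self):
        self.data = {}
        self.reachable = True  # False simulates a network partition

    def write(self, key, value):
        if not self.reachable:
            raise ConnectionError("replica unreachable")
        self.data[key] = value

    def read(self, key):
        if not self.reachable:
            raise ConnectionError("replica unreachable")
        return self.data.get(key)

class KVStore:
    def __init__(self, consistent=True):
        self.replicas = [Replica(), Replica()]
        self.consistent = consistent  # CP behaviour if True, AP if False

    def put(self, key, value):
        for replica in self.replicas:
            try:
                replica.write(key, value)
            except ConnectionError:
                if self.consistent:
                    raise  # CP: refuse the write rather than let replicas diverge
                # AP: accept the write on the reachable replicas only

    def get(self, key):
        for replica in self.replicas:
            try:
                return replica.read(key)  # AP: first reachable answer wins
            except ConnectionError:
                continue
        raise ConnectionError("no replica reachable")

store = KVStore(consistent=False)        # choose availability (AP)
store.put("user:1", "alice")
store.replicas[1].reachable = False      # simulate a partition
store.put("user:1", "bob")               # accepted; replicas now diverge
print(store.get("user:1"))               # reads may be stale until replicas resync
```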


Author(s):  
Sachin Arun Thanekar ◽  
K. Subrahmanyam ◽  
A.B. Bagwan

Nowadays we are all surrounded by big data. The term 'Big Data' itself indicates huge volume, high velocity, variety, and veracity (i.e., uncertainty of data), which give rise to new difficulties and challenges. Hadoop is a framework that can be used for large-scale data storage and faster processing; it is freely available and easy to use and implement. Big data forensics is one of these challenges, and it requires detailed knowledge of Hadoop internals. Hadoop generates various files during processing, and these files can be used for forensics. This paper focuses on digital forensics and the files generated during Hadoop's different processes, giving a short description of each. With the help of the open-source tool 'Autopsy', we demonstrate how digital forensics can be performed using an automated tool, so that big data forensics can be done efficiently.
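As a minimal stand-in for the inventory step of such a forensic examination, the sketch below walks a Hadoop directory and records the size and SHA-256 hash of every file; the directory path is an assumed placeholder, and the paper itself uses the Autopsy tool rather than this script.

```python
import csv
import hashlib
import os

# Hypothetical Hadoop log directory; the paper inspects files generated by
# Hadoop during its processes, and here we simply inventory and hash them.
HADOOP_DIR = "/var/log/hadoop"
REPORT = "hadoop_file_inventory.csv"

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large Hadoop files do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

with open(REPORT, "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["path", "size_bytes", "sha256"])
    for root, _dirs, files in os.walk(HADOOP_DIR):
        for name in files:
            path = os.path.join(root, name)
            try:
                writer.writerow([path, os.path.getsize(path), sha256_of(path)])
            except OSError:
                # Skip files that disappear or cannot be read during the walk.
                continue
```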


2014 ◽  
Vol 53 (03) ◽  
pp. 202-207 ◽  
Author(s):  
M. Haag ◽  
L. R. Pilz ◽  
D. Schrimpf

Summary
Background: Clinical trials (CT) are, in a broad sense, experiments to prove and establish the clinical benefit of treatments. Electronic data capture systems (EDCS) are now used more often, bringing better data management and higher data quality into clinical practice. Electronic systems are also used for randomization, i.e., to assign patients to treatments.
Objectives: If the mentioned randomization system (RS) and EDCS are both used, possibly identical data are collected in both, especially with stratified randomization. This separate data storage may lead to data inconsistency, and in general the data samples have to be aligned. The article discusses solutions for combining RS and EDCS; one approach is realized and introduced in detail.
Methods: Different possible combinations of EDCS and RS are determined, and the pros and cons of each solution are worked out. For the combination of two independent applications, the necessary communication interfaces are defined, taking existing standards into account. An example realization is implemented with the help of open-source applications and state-of-the-art software development procedures.
Results: Three possibilities for separate use or combination of EDCS and RS are presented and assessed: i) completely independent use of both systems; ii) realization of one system with both functions; and iii) two separate systems that communicate via defined interfaces. In addition, a realization of our preferred approach, the combination of both systems, is introduced using the open-source tools RANDI2 and OpenClinica.
Conclusion: The advantage of flexible, independent development of EDCS and RS is shown, based on the fact that these tools offer very different features. In our opinion, the combination of both systems via defined interfaces fulfills the requirements of randomization and electronic data capture and is feasible in practice. In addition, such a setting can reduce training costs and avoid error-prone duplicate data entry.
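As a sketch of approach iii), the example below shows how a thin client might request a randomization assignment from the RS and forward it to the EDCS over defined interfaces; the endpoints and payload fields are hypothetical and do not reflect the actual RANDI2 or OpenClinica APIs.

```python
import json
import urllib.request

# Hypothetical endpoints standing in for the defined interfaces between the
# randomization system (RS) and the EDC system (EDCS); they are NOT the real
# RANDI2 or OpenClinica APIs, just an illustration of approach iii).
RS_URL = "https://rs.example.org/api/randomize"
EDC_URL = "https://edc.example.org/api/subjects"

def post_json(url, payload):
    """Send a JSON payload and return the decoded JSON response."""
    data = json.dumps(payload).encode("utf-8")
    request = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

def enroll_subject(subject_id, stratum):
    # 1) Ask the RS for a treatment arm, passing the stratification factors
    #    so they are entered only once.
    assignment = post_json(RS_URL, {"subject": subject_id, "stratum": stratum})
    # 2) Forward the assignment to the EDCS so both systems stay consistent.
    post_json(EDC_URL, {"subject": subject_id, "arm": assignment["arm"],
                        "stratum": stratum})
    return assignment["arm"]

if __name__ == "__main__":
    print(enroll_subject("S-0001", {"site": "A", "sex": "f"}))
```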


2016 ◽  
Vol 12 (06) ◽  
pp. 4 ◽  
Author(s):  
Irfan Syamsuddin

The paper reports on the applicability of the open-source simulation software GreenCloud to support a novel problem-based learning approach in a laboratory environment. The Indonesian government's plan to deploy a cloud-based data center infrastructure was chosen as the actual case; in such a case, a cloud economics analysis is required alongside the technical one. The open-source software GreenCloud is suitable for simulating and analyzing cloud computing from an economics perspective. It was applied to three cloud architecture models, namely Two-Tier, Three-Tier, and Three-Tier High Speed, which were then analyzed in terms of their energy consumption under three cloud economics schemes: i) no energy saving, ii) Dynamic Voltage and Frequency Scaling (DVFS), and iii) Dynamic Shutdown (DNS).
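As a toy illustration of how the three schemes differ, the sketch below applies a simplified power model (fixed idle draw plus a load-dependent term, cubic in frequency for DVFS) to one server over a day; the model and numbers are illustrative assumptions, not GreenCloud's implementation or the paper's results.

```python
# Toy comparison of the three energy schemes on one server over 24 hours.
# The power model (fixed idle power plus a term growing with the cube of the
# operating frequency under DVFS) is a common simplification; the numbers are
# arbitrary and do not come from GreenCloud or the paper.

P_IDLE = 100.0     # watts drawn even when idle
P_DYNAMIC = 150.0  # additional watts at full frequency/load

def power(load, scheme):
    """Instantaneous power draw (W) for a fractional load in [0, 1]."""
    if scheme == "shutdown" and load == 0.0:
        return 0.0                              # DNS: idle servers are powered off
    if scheme == "dvfs":
        return P_IDLE + P_DYNAMIC * load ** 3   # frequency scaled with load
    return P_IDLE + P_DYNAMIC * load            # no energy saving

# Hourly load profile for one server (fraction of capacity) over a day.
hourly_load = [0.0] * 8 + [0.6] * 10 + [0.2] * 6

for scheme in ("none", "dvfs", "shutdown"):
    kwh = sum(power(load, scheme) for load in hourly_load) / 1000.0
    print(f"{scheme:>8}: {kwh:.2f} kWh/day")
```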


2018 ◽  
Vol 164 ◽  
pp. 01019
Author(s):  
Jason Reynaldo ◽  
David Boy Tonara

Data mining is an important research domain currently focused on knowledge discovery in databases, where data are mined so that information can be generated and used effectively and efficiently by humans. Mining can be applied to market analysis, and Association Rule Mining (ARM) has become a core data mining task. The search space is exponential in the number of database attributes, and with millions of database objects the problem of I/O minimization becomes paramount. To gather the necessary information and data, observation of the master data storage systems and interviews were carried out. The ECLAT algorithm was then applied using the open-source library SPMF. The resulting application performs data mining with the help of SPMF, using a defined file format for the transaction data, and displayed the data with a 100% success rate. The application generates knowledge that is easier to use for marketing the product.
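As a minimal sketch of ECLAT's core idea (vertical tid-lists grown by intersection), independent of SPMF's own implementation, the example below mines frequent itemsets from a toy transaction set; the data and minimum support are made up for illustration.

```python
# Minimal sketch of ECLAT's core idea (not SPMF's implementation): represent
# each item by the set of transaction IDs (tid-list) containing it, and grow
# itemsets depth-first by intersecting tid-lists.

from collections import defaultdict

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
]
MIN_SUPPORT = 3  # minimum number of transactions an itemset must appear in

# Build the vertical database: item -> set of transaction ids.
tidlists = defaultdict(set)
for tid, items in enumerate(transactions):
    for item in items:
        tidlists[item].add(tid)

def eclat(prefix, candidates, results):
    """Depth-first search over itemsets, intersecting tid-lists as we extend."""
    for i, (item, tids) in enumerate(candidates):
        if len(tids) < MIN_SUPPORT:
            continue
        itemset = prefix + (item,)
        results[itemset] = len(tids)
        # Extend only with items that come later to avoid duplicate itemsets.
        extensions = [(other, tids & other_tids)
                      for other, other_tids in candidates[i + 1:]]
        eclat(itemset, extensions, results)

frequent = {}
eclat((), sorted(tidlists.items()), frequent)
for itemset, support in sorted(frequent.items()):
    print(set(itemset), support)
```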


2008 ◽  
Vol 9 (1) ◽  
Author(s):  
Romesh Stanislaus ◽  
John M Arthur ◽  
Balaji Rajagopalan ◽  
Rick Moerschell ◽  
Brian McGlothlen ◽  
...  

2017 ◽  
Vol 27 (4) ◽  
Author(s):  
Hassan Hadi Saleh

The security of data storage in the cloud is a big challenge because data are kept on resources that may be accessed by particular machines, and the management of these data and services may not be highly reliable; data security is therefore highly challenging. To increase the security of data in cloud data centers, we introduce a method of ensuring data security in cloud computing by hiding data inside color images, a technique known as steganography. The fundamental objective of this paper is to prevent data access by unauthorized or hostile users. The scheme stores data at data centers within the edges of color images and retrieves it when it is needed.
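As a simplified sketch of image-based data hiding, the example below embeds a message into the least significant bits of raw RGB pixel bytes; unlike the paper's scheme it ignores edge regions, and the cover data is a made-up byte array rather than a real image.

```python
# Minimal least-significant-bit (LSB) embedding sketch over raw RGB bytes.
# The paper hides data specifically within edge regions of color images; this
# stand-in skips edge detection and embeds into consecutive pixel bytes.

def embed(pixels: bytearray, message: bytes) -> bytearray:
    """Hide message bits in the least significant bit of each pixel byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for this message")
    stego = bytearray(pixels)
    for index, bit in enumerate(bits):
        stego[index] = (stego[index] & 0xFE) | bit
    return stego

def extract(pixels: bytearray, length: int) -> bytes:
    """Recover `length` bytes previously embedded with embed()."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit_index] & 1)
        out.append(byte)
    return bytes(out)

# Demo with a fake 4x4 RGB image (48 bytes of pixel data).
cover = bytearray(range(48))
secret = b"hi"
stego = embed(cover, secret)
assert extract(stego, len(secret)) == secret
print(extract(stego, len(secret)))
```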

