dirHub: a trackHub configurator with directory structure projection

2018 ◽  
Author(s):  
Hideya Kawaji

Abstract
Summary: Track Data Hub is a mechanism that enables us to visualize genomics data as tracks along genome coordinates and to share them over the Internet, relying on a web server that hosts the data files and on genome browsers that offer graphical representations. In addition to the data files, it requires an accessible configuration file specifying all graphical parameters and the track hierarchy. Here, dirHub is developed to assist generation of this configuration file by projecting a file directory structure, which makes it possible to set up trackHub visualization mostly through file operations.
Availability and implementation: dirHub is implemented in Ruby and the source code is available at https://github.com/hkawaji/dirHub/. It has been tested on the UCSC Genome Browser with the Hub Track Database Definition (v2).
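To make the projection idea concrete, the following is a minimal Python sketch of mapping a directory tree onto a trackDb-style configuration: subdirectories become container tracks and data files become child tracks. This is an illustration of the concept only, not dirHub's actual Ruby implementation; the file extension, labels, and settings are assumptions.

    # Sketch only: walk a directory tree and emit trackDb-like stanzas in which
    # each subdirectory becomes a container track and each .bw file a child
    # track. Labels and settings are illustrative, not dirHub's actual output.
    import os

    def project_directory(root):
        lines = []
        for dirpath, _, filenames in sorted(os.walk(root)):
            rel = os.path.relpath(dirpath, root)
            if rel == ".":
                continue
            group = rel.replace(os.sep, "_")
            lines += [f"track {group}", "superTrack on", f"shortLabel {group}", ""]
            for name in sorted(filenames):
                if not name.endswith(".bw"):
                    continue
                track = f"{group}_{name[:-3]}"
                lines += [
                    f"track {track}",
                    f"parent {group}",
                    "type bigWig",
                    f"bigDataUrl {rel}/{name}",
                    f"shortLabel {name[:-3]}",
                    "",
                ]
        return "\n".join(lines)

    if __name__ == "__main__":
        print(project_directory("hub_data"))  # hypothetical directory of .bw files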

2013 ◽  
Vol 1 (2) ◽  
pp. 28
Author(s):  
Dite Ardian ◽  
Adian Fatchur Rochim ◽  
Eko Didik Widianto

The development of Internet technology has led many organizations to expand their web services. Initially, a single web server accessible to everyone over the Internet was sufficient, but as the number of users accessing it grows, so does the traffic load on the server. The web server therefore needs to be optimized to cope with the overload it receives when traffic is high. The methodology of this final-project research comprises a literature study, system design, and system testing; the literature draws on related reference books as well as several Internet sources. The design uses HAProxy and Pound Links for web server load balancing. The final stage of the research is testing the network system, so as to create a web server system that is reliable and safe. The result is a web server system that can be accessed by many users simultaneously and rapidly, with HAProxy and Pound Links set up as a load-balancing front end, yielding a web server with high performance and high availability.
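The paper's own configuration is not reproduced here; as an illustration of the front-end role described above, a minimal HAProxy configuration that round-robins requests across two back-end web servers might look like the following (the server names and addresses are hypothetical):

    # Illustrative HAProxy front end, not the study's actual configuration:
    # accept HTTP on port 80 and distribute requests round-robin across two
    # hypothetical back-end web servers, with health checks enabled.
    frontend http_in
        bind *:80
        default_backend web_pool

    backend web_pool
        balance roundrobin
        server web1 192.168.1.11:80 check
        server web2 192.168.1.12:80 check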


Author(s):  
Wendy Smith

The Internet, particularly through the application of World Wide Web (WWW) technology, has proved to be a very attractive medium for publishing. However, the difficulties of finding appropriate information online and then of ensuring its long-term accessibility have created problems for libraries. Practices that work for books and other printed materials do not always translate directly to online materials. The National Library of Australia's PANDORA project has been set up to develop policies and procedures for ensuring long-term access to significant Australian publications which are accessible only in an online environment, and to establish and maintain a permanent archive of that material.


2018 ◽  
Vol 57 (1) ◽  
pp. 64-80
Author(s):  
Gert-Jan Meyntjens

This article investigates the case of François Bon's pseudo-translation of Malt Olbren's The Creative Writing No-Guide (2013). If Bon believes that making writing atelier practices public is crucial, then why does he share his know-how by means of a pseudo-translation? Moreover, why does he limit himself to a digital version? I will first argue that Bon's choice of the digital format not only fits within his general move towards the Internet, but also has to do with the audience he targets. Then, I will show how The Creative Writing No-Guide's set-up as a pseudo-translation permits Bon not only to criticize more conventional handbooks by means of parody, but also to transmit writing tools successfully by means of what sociologist Richard Sennett calls expressive instructions.


Agent technology has developed into a sturdy instrument for e-commerce in recent years. The use of agent technology in e-commerce systems may address traditional e-commerce weaknesses, respond to the intelligence and individual needs of users, and significantly improve the efficiency of online transactions. The system designed in this paper has some weaknesses: achieving complete decentralization makes it less efficient, because every decentralized node must redundantly preserve a huge volume of information, which not only takes up a lot of storage space but also makes cross-requesting and detail verification ineffective. This paper presents an evaluation of the integrity of e-commerce systems using blockchain and large-scale data analysis. The fast growth of the Internet, particularly in the well-developed field of e-commerce, has advanced digital marketing. To understand the common code generating a conventional file that identifies the associated event configuration, we analyze Improved Practical Byzantine fault tolerance (IPBF) algorithms. The simulation shows the efficiency of the model.
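The abstract does not detail the IPBF variant; for orientation, standard Practical Byzantine Fault Tolerance (PBFT) tolerates f Byzantine replicas out of n >= 3f + 1 and commits a request once matching messages arrive from a quorum of 2f + 1 replicas. A minimal Python sketch of that quorum rule (generic PBFT arithmetic, not the paper's IPBF):

    # Generic PBFT quorum arithmetic, sketched for orientation only; this is
    # not the paper's IPBF variant. n replicas tolerate f Byzantine faults
    # when n >= 3f + 1, and a request commits on 2f + 1 matching votes.

    def max_faults(n: int) -> int:
        """Largest number of Byzantine replicas an n-replica cluster tolerates."""
        return (n - 1) // 3

    def committed(n: int, matching_votes: int) -> bool:
        """True once a 2f + 1 quorum of matching prepare/commit messages exists."""
        f = max_faults(n)
        return matching_votes >= 2 * f + 1

    if __name__ == "__main__":
        n = 4                   # smallest cluster tolerating one faulty node
        print(max_faults(n))    # -> 1
        print(committed(n, 3))  # -> True: 2*1 + 1 = 3 votes suffice
        print(committed(n, 2))  # -> False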


10.28945/2459 ◽  
2002 ◽  
Author(s):  
Simona Cerrato

There is an increasing demand for what we can call pop-science, that is, pertinent scientific information dedicated to non-specialists. This demand comes both from professional categories and from the general public. Simultaneously, there is an increasing consciousness in the scientific community that diffusion of scientific information is an asset it cannot afford to overlook. The Internet is a perfect tool to meet this demand: it reaches a large and ever-increasing number of people and permits an interactive and detailed exchange of information. As an experiment in how to combine high-quality services with information technology, we have set up Ulisse - In the net of science (http://ulisse.sissa.it), an innovative Italian project for the popularisation of science via the Internet. Its main purpose is to establish a connection between scientists and the general public. Ulisse is based on three major characteristics: a) high technology to create an efficient and friendly system, b) customisation of the services, and c) a network of scientists, which guarantees the quality of the materials.


Author(s):  
Ibrahim Mahmood Ibrahim ◽  
Siddeeq Y. Ameen ◽  
Hajar Maseeh Yasin ◽  
Naaman Omar ◽  
Shakir Fattah Kak ◽  
...  

Today, web services are increasing rapidly and are accessed by many users, leading to massive traffic on the Internet. The web server suffers from this growth, and it becomes challenging to manage the total traffic as the number of users rises: an overloaded server exhibits long response times and bottlenecks, so this massive traffic must be shared among several servers. Load balancing technologies and server clusters are therefore potent methods for dealing with server bottlenecks. Load balancing techniques distribute the load among the servers in a cluster so that all web servers are balanced. The motivation of this paper is to give an overview of the several load balancing techniques used to enhance the efficiency of web servers in terms of response time, throughput, and resource utilization. Various algorithms have been addressed by researchers with good results; for example, the pending-job and IP-hash algorithms achieve better performance.
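As an illustration of one surveyed technique, IP-hash load balancing maps each client address deterministically onto a back-end server, so repeat requests from the same client reach the same server. A minimal Python sketch (the server list is hypothetical, and real balancers add health checks and failover):

    # Minimal IP-hash load balancing sketch: hash the client address onto a
    # fixed back-end list so a given client is consistently routed to the same
    # server. Server names are hypothetical.
    import hashlib

    SERVERS = ["web1:80", "web2:80", "web3:80"]

    def pick_server(client_ip: str) -> str:
        # Stable hash (unlike Python's salted hash()) so routing survives restarts.
        digest = hashlib.sha256(client_ip.encode()).digest()
        return SERVERS[int.from_bytes(digest[:4], "big") % len(SERVERS)]

    if __name__ == "__main__":
        for ip in ("203.0.113.7", "203.0.113.7", "198.51.100.2"):
            print(ip, "->", pick_server(ip))  # same IP maps to same server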


Author(s):  
Lilia Ervina Jeronimo Guterres ◽  
Ahmad Ashari

Technology is changing rapidly, and with the significant growth of Internet technology, cyber threats are making it challenging for IT professionals in companies and organisations to guard their systems, especially when hacking tools and instructions for stealing data and information are freely available on the Internet for beginners to learn from. Tic Timor IP is one of the organisations involved and engaged in data center operation, and it often receives attacks from outside networks. A network traffic monitoring system is fundamental for detecting any unknown activity happening within a network. Port scanning is one of the first methods commonly used to attack a network, carried out with several free applications such as Angry IP Scanner, Nmap, and Low Orbit Ion Cannon (LOIC). On the other hand, a Snort-based Intrusion Detection System (IDS) can be used to detect such attacks occurring within the network perimeter, including on the web server. Based on the research results, Snort has the ability to detect various types of attack, including port scanning attacks, and multiple Snort rules can be accurately set to protect the network from unknown threats.
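The article's actual rule set is not reproduced here; as an illustration, a Snort 2.x rule of the kind that flags a burst of TCP SYN probes from a single source might look like the following (the threshold values and SID are arbitrary assumptions):

    # Illustrative only - not the rules used in the study. Flags any source
    # that sends 20 or more TCP SYNs to the home network within 3 seconds.
    alert tcp any any -> $HOME_NET any (msg:"Possible TCP port scan"; \
        flags:S; detection_filter:track by_src, count 20, seconds 3; \
        sid:1000001; rev:1;)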


Author(s):  
Robert van Wessel ◽  
Henk J. de Vries

We all take the ubiquity of the Internet for granted: anyone, anywhere, anytime, any device, any connection, any app… but for how long? Is the future of the Internet really at stake? Discussions about control of the Internet, its architecture, and the applications running on it started more than a decade ago (Blumenthal & Clark, 2001). This topic is becoming more and more important for citizens, businesses, and governments across the world. In its original set-up, the architecture of the Internet did not favor one application over another and was based on the net neutrality principle (Wu, 2003). However, architectures should be understood as an "alternative way of influencing economic systems" (Van Schewick, 2010), but they should not be a substitute for politics (Agre, 2003). The architecture is laid down in standards, and therefore discussions about the future of the Internet should also address the role of standards. This is what this chapter aims to do.

