Hadoop distributed file system mechanism for processing large datasets across computer clusters using programming techniques

Author(s):  
Nicholas Jain Edwards ◽  
David Tonny Brain ◽  
Stephen Carinna Joly ◽  
Mariana Karry Masucato

In this paper, we show that HDFS I/O performance improves when set associativity is integrated into the cache design and the replication pipeline is restructured as a fully connected digraph network topology. For read operations, a set-associative cache offers many more candidate locations (words) for a block than direct mapping, so the miss ratio is much lower; this reduces the swapping of data between main memory and cache memory and improves memory I/O performance. For write operations, instead of using the sequential pipeline, we construct a fully connected graph over the data blocks listed in the NameNode metadata. In the sequential pipeline, the data is first copied to the source node, which copies it to the next data block in the pipeline; the same copy process continues until the last data block is reached, and the acknowledgment then travels the same path in reverse, from the last block back to the source. With a replication factor of n, the total time to transfer the data to all data blocks and complete the acknowledgment is therefore almost 2n times the time to copy the data from one data block to another.
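To make the topology argument concrete, here is a minimal Python sketch (ours, not the paper's code) comparing the two replication schemes; it assumes a uniform per-hop copy time `t_copy` and a symmetric acknowledgment cost, and the function names are illustrative:

```python
# Minimal sketch (not the paper's implementation): compare the end-to-end
# replication time of the HDFS-style sequential pipeline against a fully
# connected topology in which the source pushes to all replicas in parallel.
# Assumes a uniform per-hop copy time `t_copy` and a symmetric ack cost.

def sequential_pipeline_time(n: int, t_copy: float = 1.0) -> float:
    """Data hops through n replicas one by one and the acknowledgment
    returns along the same path, so total time is about 2 * n * t_copy."""
    return 2 * n * t_copy

def fully_connected_time(n: int, t_copy: float = 1.0) -> float:
    """The source sends to all n replicas concurrently and each ack
    returns directly, so total time is about one copy plus one ack."""
    return 2 * t_copy

for n in (3, 5, 10):  # typical HDFS replication factors and beyond
    print(f"replication factor {n}: "
          f"pipeline ~ {sequential_pipeline_time(n):.0f}, "
          f"fully connected ~ {fully_connected_time(n):.0f} copy-times")
```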

2013 ◽  
Vol 17 (3-4) ◽  
pp. 485-506 ◽  
Author(s):  
Penglin Dai ◽  
Qingfeng Zhuge ◽  
Xianzhang Chen ◽  
Weiwen Jiang ◽  
Edwin H.-M. Sha

2016 ◽  
Vol 16 (7&8) ◽  
pp. 541-587
Author(s):  
Nathan Wiebe ◽  
Ashish Kapoor ◽  
Krysta M. Svore

In recent years, deep learning has had a profound impact on machine learning and artificial intelligence. At the same time, algorithms for quantum computers have been shown to efficiently solve some problems that are intractable on conventional, classical computers. We show that quantum computing not only reduces the time required to train a deep restricted Boltzmann machine, but also provides a richer and more comprehensive framework for deep learning than classical computing and leads to significant improvements in the optimization of the underlying objective function. Our quantum methods also permit efficient training of multilayer and fully connected models.
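For readers who want a concrete baseline, the following is a minimal classical sketch, in Python with NumPy, of one contrastive-divergence (CD-1) update for a binary restricted Boltzmann machine, the model class whose training the paper accelerates. This is our illustration of the classical training step, not the paper's quantum algorithm; all names and hyperparameters are assumptions:

```python
import numpy as np

# Illustrative classical baseline (not the paper's quantum method): one
# CD-1 update for a binary restricted Boltzmann machine.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b, c, v0, lr=0.01):
    """One CD-1 step. W: (n_visible, n_hidden) weights;
    b, c: visible/hidden biases; v0: a batch of visible vectors."""
    # Positive phase: sample hidden units given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: reconstruct visibles, then re-infer hiddens.
    pv1 = sigmoid(h0 @ W.T + b)
    ph1 = sigmoid(pv1 @ W + c)
    # Gradient approximation: data statistics minus model statistics.
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

# Tiny hypothetical usage: random binary data, 100 update steps.
n_v, n_h = 6, 4
W = 0.01 * rng.standard_normal((n_v, n_h))
b, c = np.zeros(n_v), np.zeros(n_h)
data = (rng.random((16, n_v)) < 0.5).astype(float)
for _ in range(100):
    cd1_update(W, b, c, data)
```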


Cryptography ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 18
Author(s):  
Yutian Gui ◽  
Chaitanya Bhure ◽  
Marcus Hughes ◽  
Fareena Saqib

Direct Memory Access (DMA) is a state-of-the-art technique for optimizing the speed of memory access and for using processing power efficiently during data transfers between the main system and a peripheral device. However, this advanced feature also opens security vulnerabilities: an attacker can compromise access and manipulate the main memory of the victim host machine. This paper outlines a lightweight process that creates resilience against DMA attacks with minimal modification to the configuration of the DMA protocol. The proposed scheme performs device identification of the trusted PCIe devices that have DMA capabilities and constructs a database of profiling times to authenticate the trusted devices before they can access the system. The results show that the proposed scheme generates a unique identifier for trusted devices and authenticates them. Furthermore, a machine learning-based scheme is proposed that enables real-time authentication at runtime, and we report the time required for training and the corresponding accuracy.
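As a rough illustration of the profiling-time idea, here is a minimal Python sketch (our reconstruction of the general approach, not the paper's implementation); the device names, timing values, and the mean-plus-k-sigma acceptance rule are all assumptions:

```python
import statistics

# Illustrative sketch of timing-based device authentication. Assumes
# `profile_times` holds hypothetical enrollment measurements (e.g. DMA
# transaction latencies in microseconds) for each trusted PCIe device.
profile_times = {
    "nic0": [12.1, 12.3, 12.0, 12.2],
    "gpu0": [30.5, 30.9, 30.7, 30.6],
}

# Enrollment: summarize each trusted device by mean and spread.
db = {dev: (statistics.mean(t), statistics.stdev(t))
      for dev, t in profile_times.items()}

def authenticate(measured: float, k: float = 3.0) -> str | None:
    """Accept the device whose enrolled timing profile the new
    measurement falls within (mean +/- k standard deviations)."""
    for dev, (mu, sigma) in db.items():
        if abs(measured - mu) <= k * sigma:
            return dev
    return None  # unknown timing signature: deny DMA access

print(authenticate(12.15))  # -> 'nic0'
print(authenticate(45.0))   # -> None (untrusted device)
```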


Author(s):  
S. Mehdi Vahidipour ◽  
Mohammad Reza Meybodi ◽  
Mehdi Esnaashari

The shortest path problem in stochastic graphs has recently been studied in the literature, and a number of algorithms have been proposed to solve it using a variety of learning automata models. However, all of these algorithms suffer from two common drawbacks: low speed and the lack of a clear termination condition. In this paper, we propose a novel learning automata-based algorithm for this problem that speeds up the search for the shortest path through parallelism. To achieve this parallelism, several traversals are initiated in parallel from the source node toward the destination node in the graph. During each traversal, the time required to travel from the source node to each visited node is estimated. The time estimate at each visited node is then given to the learning automaton residing in that node. Using the different time estimates provided by different traversals, this learning automaton gradually learns which neighbor of the node lies on the shortest path. To set a termination condition for the proposed algorithm, we analyze it using a recently introduced model, the Adaptive Stochastic Petri Net (ASPN-LA). The results of this analysis enable us to establish a necessary condition for the termination of the algorithm. To evaluate the performance of the proposed algorithm against existing algorithms, we apply it to find the shortest path in six different stochastic graphs. The results of this evaluation indicate that the time required by the proposed algorithm to find the shortest path in all graphs is substantially shorter than that required by similar existing algorithms.
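A minimal sketch of the per-node learning component follows, assuming a linear reward-inaction (L_RI) update rule (the paper does not necessarily use this exact scheme); all names and the feedback rule are illustrative:

```python
import random

# Illustrative sketch (not the paper's exact algorithm): a linear
# reward-inaction (L_RI) learning automaton that each node could use to
# learn which neighbor lies on the shortest path. A traversal reports
# the time taken to reach this node; times that beat the running
# average reward the neighbor that was used.

class NodeAutomaton:
    def __init__(self, neighbors, a=0.1):
        self.neighbors = list(neighbors)
        self.p = {n: 1.0 / len(self.neighbors) for n in self.neighbors}
        self.a = a                # reward (learning) rate
        self.avg_time = None      # running average of reported times

    def choose(self):
        """Sample a neighbor according to the action probabilities."""
        r, acc = random.random(), 0.0
        for n, pn in self.p.items():
            acc += pn
            if r <= acc:
                return n
        return self.neighbors[-1]

    def feedback(self, neighbor, travel_time):
        """Reward `neighbor` if this traversal beat the running average;
        L_RI leaves the probabilities unchanged on a penalty."""
        if self.avg_time is None or travel_time < self.avg_time:
            for n in self.p:
                if n == neighbor:
                    self.p[n] += self.a * (1.0 - self.p[n])
                else:
                    self.p[n] *= 1.0 - self.a
        self.avg_time = (travel_time if self.avg_time is None
                         else 0.9 * self.avg_time + 0.1 * travel_time)

# Hypothetical environment: neighbor "b" is on the faster route.
la = NodeAutomaton(["b", "c", "d"])
for _ in range(500):
    n = la.choose()
    la.feedback(n, random.uniform(1, 2) if n == "b" else random.uniform(2, 3))
print(max(la.p, key=la.p.get))  # -> 'b' with high probability
```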


Author(s):  
Fusheng Xiong ◽  
Michael Kuby ◽  
Wayne D. Frasch

An asymmetric, fully-connected 8-city traveling salesman problem (TSP) was solved by DNA computing using the ordered node pair abundance (ONPA) approach with pair ligation probe quantitative real-time polymerase chain reaction (PLP-qPCR). The validity of using ONPA to derive the optimal answer was confirmed by in silico computing, using a reverse-engineering method to reconstruct the complete tours in the feasible answer set from the measured ONPA. The high specificity of the sequence-tagged hybridization and ligation that results from the use of PLPs significantly increased the accuracy of answer determination in DNA computing. When combined with the high-throughput efficiency of qPCR, the time required to identify the optimal answer to the TSP was reduced from days to 25 min.
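The in silico validation step can be pictured with a short sketch like the following (our reconstruction, not the published pipeline); the abundance values are hypothetical placeholders for PLP-qPCR measurements, and the closed-tour assumption is ours:

```python
import random
from itertools import permutations

# Illustrative in silico sketch: score every feasible tour of an 8-city
# asymmetric TSP by the measured abundance of its ordered node pairs
# (ONPA) and take the tour whose pairs are collectively most abundant.
# `onpa` holds hypothetical abundances; real values come from PLP-qPCR.
N = 8
random.seed(0)
onpa = {(i, j): random.random() for i in range(N) for j in range(N) if i != j}

def tour_score(tour):
    """Sum measured pair abundance over the tour's ordered edges,
    assuming a closed tour that returns to the start city."""
    edges = list(zip(tour, tour[1:])) + [(tour[-1], tour[0])]
    return sum(onpa[e] for e in edges)

# Fix city 0 as the start and enumerate the 7! remaining orderings.
best = max(((0, *rest) for rest in permutations(range(1, N))),
           key=tour_score)
print(best, tour_score(best))
```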


Author(s):  
Charles Turnbill ◽
Delbert E. Philpott

The advent of the scanning electron microscope (SEM) has renewed interest in preparing specimens by avoiding the forces of surface tension. The present method of freeze drying by Boyde and Barger (1969) and Small and Marszalek (1969) does prevent surface tension, but ice crystal formation and the time required to pump the specimen to dryness have discouraged us. We believe an attractive alternative to freeze drying is the critical point method originated by Anderson (1951) for electron microscopy. He avoided surface tension effects during drying by first exchanging the specimen water with alcohol, then amyl acetate, and finally carbon dioxide. He then selected a specific temperature (36.5°C) and pressure (72 atm) at which carbon dioxide would pass from the liquid to the gaseous phase without the effect of surface tension. This combination of temperature and pressure is known as the "critical point" of the liquid.


Author(s):  
O. E. Bradfute

Electron microscopy is frequently used in the preliminary diagnosis of plant virus diseases by surveying negatively stained preparations of crude extracts of leaf samples. A major limitation of this method is the time required to survey grids when the concentration of virus particles (VPs) is low. A rapid survey of grids for VPs is reported here; the method employs a low-magnification, out-of-focus Search Mode similar to that used for low-dose electron microscopy of radiation-sensitive specimens. A higher-magnification, in-focus Confirm Mode is used to photograph or confirm the detection of VPs. Setting up the Search Mode by obtaining an out-of-focus image of the specimen in diffraction (K. H. Downing and W. Chiu, private communications) and pre-aligning the image in Search Mode with the image in Confirm Mode facilitates rapid switching between Modes.


Author(s):  
Anthony S-Y Leong ◽  
David W Gove

Microwaves (MW) are electromagnetic waves, commonly generated at a frequency of 2.45 GHz. When dipolar molecules such as water, the polar side chains of proteins, and other molecules with an uneven distribution of electrical charge are exposed to such non-ionizing radiation, they oscillate through 180° at a rate of 2,450 million cycles/s. This rapid kinetic movement accelerates chemical reactions and produces instantaneous heat. MWs have recently been applied to a wide range of procedures for light microscopy. MWs generated by domestic ovens have been used as a primary method of tissue fixation and have been applied to the various stages of tissue processing as well as to a wide variety of staining procedures. This use of MWs has not only drastically reduced the time required for tissue fixation, processing, and staining, but has also produced better cytologic images in cryostat sections and, more importantly, better preservation of cellular antigens.


1999 ◽  
Vol 4 (5) ◽  
pp. 4-7 ◽  
Author(s):  
Laura Welch

Abstract Functional capacity evaluations (FCEs) have become an important component of disability evaluation during the past 10 years to assess an individual's ability to perform the essential or specific functions of a job, both preplacement and during rehabilitation. Evaluating both job performance and physical ability is a complex assessment, and some practitioners are not yet certain that an FCE can achieve these goals. An FCE is useful only if it predicts job performance, and factors that should be assessed include overall performance; consistency of performance across similar areas of the FCE; consistency between observed behaviors during the FCE and limitations or abilities reported by the worker; objective changes (eg, blood pressure and pulse) that are appropriate relative to performance; external factors (illness, lack of sleep, or medication); and a coefficient of variation that can be measured and assessed. FCEs can identify specific movement patterns or weaknesses; measure improvement during rehabilitation; identify a specific limitation that is amenable to accommodation; and identify a worker who appears to be providing a submaximal effort. FCEs are less reliable at predicting injury risk; they cannot tell us much about endurance over a time period longer than the time required for the FCE; and the FCE may measure simple muscular functions when the job requires more complex ones.

