output file: Recently Published Documents

Total documents: 45 (five years: 9)
H-index: 4 (five years: 1)

Drones, 2022, Vol. 6 (1), pp. 24
Author(s): Taleatha Pell, Joan Y. Q. Li, Karen E. Joyce

With the increased availability of low-cost, off-the-shelf drone platforms, drone data become easy to capture and are now a key component of environmental assessments and monitoring. Once the data are collected, there are many structure-from-motion (SfM) photogrammetry software options available to pre-process the data into digital elevation models (DEMs) and orthomosaics for further environmental analysis. However, not all software packages are created equal, nor are their outputs. Here, we evaluated the workflows and output products of four desktop SfM packages (AgiSoft Metashape, Correlator3D, Pix4Dmapper, WebODM), across five input datasets representing various ecosystems. We considered the processing times, output file characteristics, colour representation of orthomosaics, geographic shift, visual artefacts, and digital surface model (DSM) elevation values. No single software package was determined the “winner” across all metrics, but we hope our results help others demystify the differences between the options, allowing users to make an informed decision about which software and parameters to select for their specific application. Our comparisons highlight some of the challenges that may arise when comparing datasets that have been processed using different parameters and different software packages, thus demonstrating a need to provide metadata associated with processing workflows.
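To make the DSM elevation comparison concrete (a generic illustration, not the workflow used in the paper), two co-registered DSM rasters on the same grid can be differenced and summarised as in the sketch below; the file names are hypothetical and the rasterio dependency is an assumption.

```python
import numpy as np
import rasterio

# Hypothetical, co-registered DSMs exported from two different SfM packages.
# Assumes both rasters share the same grid, extent, and resolution.
with rasterio.open("dsm_package_a.tif") as a, rasterio.open("dsm_package_b.tif") as b:
    dsm_a = a.read(1).astype(float)
    dsm_b = b.read(1).astype(float)
    nodata = a.nodata

# Mask nodata cells before comparing elevation values.
valid = np.ones(dsm_a.shape, dtype=bool)
if nodata is not None:
    valid &= (dsm_a != nodata) & (dsm_b != nodata)

diff = dsm_a[valid] - dsm_b[valid]
print(f"mean offset: {diff.mean():.3f} m, RMS difference: {np.sqrt((diff**2).mean()):.3f} m")
```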


Author(s): Ahmad Mohamad Al-Smadi, Ahmad Al-Smadi, Roba Mahmoud Ali Aloglah, Nisrein Abu-darwish, Ahed Abugabah

The Vernam cipher, also known as the one-time pad, is an unbreakable algorithm because it uses a truly random key equal in length to the data to be encoded, and each element of the text is encrypted with a corresponding element of the key. In this paper, we propose a novel technique to overcome the obstacles that hinder the use of the Vernam algorithm. First, the Vernam and Advanced Encryption Standard (AES) algorithms are used to encrypt the data as well as to hide the encryption key; second, a password is placed on the file through the use of the AES algorithm, so the level of protection becomes very high. The Huffman algorithm is then used for data compression to reduce the size of the output file. A set of files is encrypted and decrypted using our methodology. The experiments demonstrate the flexibility of our method and show that it succeeds without losing any information.
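As an illustration of the core idea only (not the authors' implementation), the one-time-pad step amounts to XOR-ing each data byte with a key byte, with the key exactly as long as the data; the sketch below uses a hypothetical random key and omits the AES and Huffman stages described in the paper.

```python
import os

def vernam_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the corresponding key byte (one-time pad)."""
    if len(key) != len(data):
        raise ValueError("key must be exactly as long as the data")
    return bytes(d ^ k for d, k in zip(data, key))

# Decryption is the same XOR operation applied again with the same key.
plaintext = b"example payload"
key = os.urandom(len(plaintext))          # random key, same length as the data
ciphertext = vernam_encrypt(plaintext, key)
assert vernam_encrypt(ciphertext, key) == plaintext
```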


2021
Author(s): Kevin Riehl, Cristian Riccio, Eric Alexander Miska, Martin Hemberg

Motivation: Most genomes harbor a large number of transposons, which play an important role in evolution and gene regulation. They are also of interest to clinicians, as they are involved in several diseases, including cancer and neurodegeneration. Although several methods for transposon identification are available, they are often highly specialised towards specific tasks or classes of transposons, and they lack common standards such as a unified taxonomy scheme and output file format. Moreover, many methods are difficult to install, poorly documented, and hard to reproduce. Results: We present TransposonUltimate, a powerful bundle of three modules for transposon classification, annotation, and detection of transposition events. TransposonUltimate comes as a Conda package under the GPL-3.0 licence, is well documented, and is easy to install. We benchmark the classification module on the large TransposonDB, covering over 891,051 sequences, to demonstrate that it outperforms the best currently existing solutions. The annotation and detection modules combine sixteen existing software tools, and we illustrate their use by annotating the Caenorhabditis elegans, Rhizophagus irregularis and Oryza sativa subsp. japonica genomes. Finally, we use the detection module to discover 29,554 transposition events in the genomes of twenty wild-type strains of Caenorhabditis elegans. Availability: The software and source code are available at https://github.com/DerKevinRiehl/TransposonUltimate, and the findings can be downloaded from https://cellgeni.cog.sanger.ac.uk/browser.html?shared=transposonultimate/.


F1000Research, 2020, Vol. 9, pp. 1211
Author(s): Dustin B. Miller, Stephen R. Piccolo

A compound heterozygous (CH) variant occurs when a person inherits two alternate alleles, one from each parent, and these alleles occur at different positions within the same gene. Therefore, CH variant identification requires distinguishing maternally from paternally derived nucleotides, a process that requires numerous computational tools. Using such tools can be challenging, and they often introduce unforeseen obstacles such as operating-system-specific installation procedures, software dependencies, and input-file format requirements. To overcome these challenges, we developed the Compound Heterozygous Variant Identification Pipeline (CompoundHetVIP), which uses a single Docker image to encapsulate commonly used software tools for phasing, annotating, and analyzing CH, homozygous alternate, and de novo variants in a series of 13 steps. To begin using our tool, researchers need only install the Docker engine and download the CompoundHetVIP Docker image. The tools provided in CompoundHetVIP can be applied to Illumina whole-genome sequencing data of individual samples or trios (a child and both parents), using VCF or gVCF files as initial input. Each step of the pipeline produces an analysis-ready output file that can be further evaluated. To illustrate its use, we applied CompoundHetVIP to data from a publicly available Ashkenazim trio and, after filtering, identified two genes with candidate CH variants and one gene with a candidate homozygous alternate variant. While this example uses genomic data from a healthy child, we anticipate that most researchers will use CompoundHetVIP to uncover missing heritability in human diseases and other phenotypes. CompoundHetVIP is open-source software and can be found at https://github.com/dmiller903/CompoundHetVIP; this repository also provides detailed, step-by-step examples.
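For intuition only (this is not part of the CompoundHetVIP pipeline), the sketch below shows the basic compound-heterozygous test applied to already-phased genotypes: a gene is flagged when it carries at least one heterozygous variant on each parental haplotype. The variant records, gene assignments, and the haplotype field are hypothetical simplifications of what a phased, annotated VCF would provide.

```python
from collections import defaultdict

# Hypothetical, simplified records: (gene, position, haplotype)
# haplotype 0 = maternally derived allele, 1 = paternally derived allele.
phased_het_variants = [
    ("GENE_A", 101, 0),
    ("GENE_A", 250, 1),   # second het on the other haplotype -> CH candidate
    ("GENE_B", 77, 0),
    ("GENE_B", 90, 0),    # both on the same haplotype -> not compound het
]

def compound_het_candidates(variants):
    """Return genes with heterozygous variants on both parental haplotypes."""
    haplotypes_per_gene = defaultdict(set)
    for gene, _pos, haplotype in variants:
        haplotypes_per_gene[gene].add(haplotype)
    return [gene for gene, haps in haplotypes_per_gene.items() if haps == {0, 1}]

print(compound_het_candidates(phased_het_variants))  # ['GENE_A']
```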


2020, Vol. 4 (2), pp. 445
Author(s): Maria Rosario Borroek, Errissya Rasywir, Yovi Pratama

Software effort estimation predicts the amount of resources needed to develop a piece of software. Because software measurement feeds directly into effort estimation, it is important to examine how different measurement techniques affect the estimates, which in this work is done with machine learning techniques. On this basis, the researchers built a system capable of measuring software. In this study, experiments were carried out on three software measurement techniques: FPA, FPA with Sugeno fuzzy, and FPA with Mamdani fuzzy. The three techniques were compared across three project datasets, and the results were then used for software effort estimation. For evaluation, this study relied on the assessment of the developer acting as the project analyst. The results show that the LOC and effort values for a similar system can differ depending on whether FPA, FPA with Mamdani fuzzy, or FPA with Sugeno fuzzy is used. The highest LOC and effort values were produced by FPA with Mamdani fuzzy on the DUMAS POLDA SUMSEL project, while the lowest LOC and effort values were produced by FPA with Sugeno fuzzy. This can be traced to the calculation mechanism of FPA with Sugeno fuzzy, which does not count the input, output, file, query, and interface values at all; instead, it judges roughly from the difficulty of building the system. To justify a higher price for a project, the FPA with Mamdani fuzzy method is recommended.
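For context on the FPA counts the abstract refers to (inputs, outputs, files, queries, and interfaces), a minimal unadjusted function-point calculation looks like the sketch below; the complexity weights are the commonly cited textbook values for average complexity, not figures taken from this study, and the component counts are hypothetical.

```python
# Commonly cited average-complexity weights for the five FPA component types
# (textbook values, not taken from the study above).
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "internal_files": 10,
    "external_queries": 4,
    "external_interfaces": 7,
}

def unadjusted_function_points(counts: dict) -> int:
    """Sum each component count multiplied by its complexity weight."""
    return sum(counts[name] * weight for name, weight in AVERAGE_WEIGHTS.items())

# Hypothetical component counts for a small project.
counts = {
    "external_inputs": 12,
    "external_outputs": 8,
    "internal_files": 5,
    "external_queries": 6,
    "external_interfaces": 2,
}
print(unadjusted_function_points(counts))  # 12*4 + 8*5 + 5*10 + 6*4 + 2*7 = 176
```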


In this paper we present a software system for cryptographic information protection based on deterministic chaos. The system encrypts and decrypts text and graphic information using a random number generator whose role is played by the Lorenz attractor. The use of the attractor guarantees that the generated numbers appear random, constrained only by the initial parameters. The encryption parameters must also be transferred in a way that excludes any possibility of interception, because they serve as the decryption keys. After encrypting a text, the program performs a frequency analysis of the input and output files. In the input file, the frequencies of certain groups of letters, which make the text vulnerable to frequency analysis, are clearly pronounced; in the output file, the frequency distribution should be stable and uniform, thereby demonstrating the effectiveness of the encryption.
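As a rough illustration of the idea (not the authors' implementation), a chaotic keystream can be produced by integrating the Lorenz system with fixed-step Euler updates and quantizing one coordinate into bytes, which are then XOR-ed with the data; the step size, system parameters, and quantization below are illustrative choices, and the initial conditions act as the shared key.

```python
def lorenz_keystream(length, x=0.1, y=0.0, z=0.0,
                     sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.001):
    """Generate `length` pseudo-random bytes from a Lorenz-attractor trajectory."""
    stream = []
    for _ in range(length):
        # One forward-Euler step of the Lorenz equations.
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        # Quantize the x coordinate into one byte (illustrative choice).
        stream.append(int(abs(x) * 1e6) % 256)
    return bytes(stream)

def chaos_xor(data: bytes, key_params: dict) -> bytes:
    """Encrypt or decrypt by XOR-ing the data with the chaotic keystream."""
    stream = lorenz_keystream(len(data), **key_params)
    return bytes(d ^ s for d, s in zip(data, stream))

params = {"x": 0.1, "y": 0.0, "z": 0.0}   # initial conditions act as the secret key
ciphertext = chaos_xor(b"attack at dawn", params)
assert chaos_xor(ciphertext, params) == b"attack at dawn"
```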


2019, Vol. 20 (1)
Author(s): Kyle Ellrott, Alex Buchanan, Allison Creason, Michael Mason, Thomas Schaffter, ...

Abstract Challenges are achieving broad acceptance for addressing many biomedical questions and enabling tool assessment. But ensuring that the methods evaluated are reproducible and reusable is complicated by the diversity of software architectures, input and output file formats, and computing environments. To mitigate these problems, some challenges have leveraged new virtualization and compute methods, requiring participants to submit cloud-ready software packages. We review recent data challenges with innovative approaches to model reproducibility and data sharing, and outline key lessons for improving quantitative biomedical data analysis through crowd-sourced benchmarking challenges.


The main objective of this project is the "Behaviour of Self-Supporting Communication Tower Subjected to Wind Load". The thesis studies the behaviour of a self-supporting tower under static and dynamic loading cases, and extends the study to both face-wind and corner-wind cases, since wind force plays a major role in the loading of tall structures. For the detailed study, a case study is carried out on a 100 m high self-supporting tower. In addition to the wind loading, the loads of the communication antennas that will be mounted on the tower are also considered. The thesis describes the wind load calculations, the analysis procedure, and the design of the tower members. The tower is analysed for both corner-wind and face-wind directions. Analysis is performed using SAP 80/90. The basic activity starts with modelling of the tower. The tower model is an assembly of parts of different configurations stacked one above the other, each of which is called a bay. Typical bay types are formatted so they can be used as data for generating the tower configuration. These typical configurations help generate the model easily, with different options, to achieve the optimum design: minimum tonnage of the total structure with allowable deflections at maximum wind in both static and dynamic cases. Wind load and antenna load calculations are carried out using a 'C' program. The tower configuration, the wind loads on the tower, and the dimensions and properties of the tower (input to the 'C' program) are explained by taking typical Bay-1. For the analysis of the tower, the required formatted input for the Structural Analysis Program (SAP) is obtained from the 'C' program output. Member forces are obtained from the genol 1.f3f file (SAP output file). All tower members are designed as axially loaded compression members as per standard specifications. The design of members is carried out for the maximum member forces obtained from both corner-wind and face-wind analyses. If the capacity of a member is less than the member force, higher sections are selected and the analysis and design are repeated until the member capacity exceeds the member force. The output (analysis and design data for all members) is obtained using the member-forces file and the "Design" C program, and is shown in tables. Foundation forces and joint displacements are obtained from the SAP output file (genol.sol), and the maximum deflection is checked as per specifications. In tower structures, the leg members govern when the wind acts in the corner direction, while the horizontal and diagonal members govern when the wind acts in the face direction. The tabulated results show how the member forces change between the static and dynamic loading cases.
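The wind load calculation mentioned above is performed in the thesis by a dedicated 'C' program whose details are not given here; the sketch below shows only a generic drag-type panel force calculation (force = drag coefficient x exposed area x design wind pressure, with pressure proportional to the square of wind speed, as in IS 875-style codes). The coefficient, area, and wind speed are hypothetical and not taken from the thesis.

```python
def design_wind_pressure(wind_speed_ms: float) -> float:
    """Dynamic wind pressure in N/m^2 from wind speed in m/s (0.6*V^2, as in IS 875-style codes)."""
    return 0.6 * wind_speed_ms ** 2

def panel_wind_force(drag_coefficient: float, exposed_area_m2: float, wind_speed_ms: float) -> float:
    """Drag-type wind force on a tower panel: F = Cf * Ae * pz."""
    return drag_coefficient * exposed_area_m2 * design_wind_pressure(wind_speed_ms)

# Hypothetical lattice-tower panel: Cf = 2.8, exposed area 3.5 m^2, 44 m/s design wind speed.
force_newtons = panel_wind_force(2.8, 3.5, 44.0)
print(f"{force_newtons / 1000:.1f} kN")  # about 11.4 kN
```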


2019, Vol. 214, pp. 05037
Author(s): Guilherme Amadio, Philippe Canal, Enrico Guiraud, Danilo Piparo

Experiments at the Large Hadron Collider (LHC) produce tens of petabytes of new data in ROOT format per year that need to be processed and analysed. In the next decade, following the planned upgrades of the LHC and its detectors, this data production rate is expected to increase at least ten-fold. Therefore, optimizing the ROOT I/O subsystem is of critical importance to the success of the LHC physics programme. This contribution presents ROOT’s approach for writing data from multiple threads to a single output file in an efficient way. Technical aspects of the implementation—the TBufferMerger class—and programming model examples are described. Measurements of runtime performance and the overall improvement relative to the case of serial data writing are also discussed.


Electronics, 2018, Vol. 7 (11), pp. 270
Author(s): Ying Yuan, Jun-Ho Huh

Following the development of the Industrial Revolution 4.0, many new types of systems are being designed, introduced, or attempted, even in almost every traditional industry. The clothing industry is no exception. The use of continuously developing production equipment and Information and Communication Technology (ICT) has a single objective, providing a customized service to all customers. Thus, in this study, the primary research task was to identify ill-balanced aspects or disadvantages of the services previously analyzed to construct a more complete online customized service. This was accomplished by analyzing an automated Computer-Aided Design (CAD) output file containing customer requirements regarding individual clothing items. The secondary research task was to plan and design a clothing manufacturing process to which a one-person one-item mass production system has been applied to achieve a customized service. As a result, for the primary research task, the customers’ requirements for each dress were reflected in attributes, such as color, pattern, or size, and it was possible to obtain an automated CAD output file for each element. Such CAD output files can be used in the production process directly. To find the possibility of upgrading the existing dressmaking process and implement the one-person one-item system, the entire manufacturing process was simulated for the test.

