metadata file
Recently Published Documents

TOTAL DOCUMENTS: 25 (five years: 11)
H-INDEX: 6 (five years: 0)

2021 ◽  
Author(s):  
Francis J. J. Ambrosio

Uploading data to Terra.bio is an essential step in the protocol for analyzing locally stored genomic sequencing data. The Terra.bio uploads page lets users organize their data files with an associated metadata file via a browser-based graphical user interface. This protocol explains how to prepare the data files and the associated metadata file for upload, and provides a link to the Terra.bio uploads page.
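
As an illustration of the kind of metadata file such a protocol prepares, the sketch below writes a tab-separated load file in the entity:<type>_id style used by Terra data tables; the sample names, column names, and file paths are hypothetical placeholders, not part of the published protocol.

```python
# Minimal sketch: build a Terra-style TSV load file for a set of local FASTQ files.
# The "entity:sample_id" header follows Terra's data-table convention; the sample
# IDs, the "read1" column, and the paths are hypothetical.
import csv
from pathlib import Path

samples = {
    "sample_01": Path("reads/sample_01_R1.fastq.gz"),
    "sample_02": Path("reads/sample_02_R1.fastq.gz"),
}

with open("sample_metadata.tsv", "w", newline="") as fh:
    writer = csv.writer(fh, delimiter="\t")
    writer.writerow(["entity:sample_id", "read1"])  # first column names the entity type
    for sample_id, fastq in samples.items():
        # After upload, Terra expects workspace or gs:// paths; local paths shown here.
        writer.writerow([sample_id, str(fastq)])
```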


Author(s):  
Inga Brentel ◽  
Kristi Winters

Abstract This article details the novel structure developed to handle, harmonize, and document big data for reuse and long-term preservation. ‘The Longitudinal IntermediaPlus (2014–2016)’ big data dataset is uniquely rich: it covers an array of German online media, extendable to cross-media channels and user information. The metadata file for this dataset, together with its documentation, was recently deposited as its own MySQL database called charmstana_sample_14-16.sql (https://data.gesis.org/sharing/#!Detail/10.7802/2030) (cs16) and is suitable for generating descriptive statistics. Analogous to the ‘Data View’ in SPSS, the charmstana_analysis (ca) database contains the dataset’s numerical values. Both the cs16 and ca MySQL files are needed to conduct analysis on the full database. The research challenge was to process large-scale datasets into one longitudinal big data source suitable for academic research and conforming to FAIR principles. The authors review four methodological recommendations that can serve as a framework for solving big-data structuring challenges, using the harmonization software CharmStats.
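
The abstract does not publish the database schema, so the following is a purely hypothetical sketch of how the cs16 metadata database and the ca values database might be combined for a descriptive statistic; every table and column name is an assumption.

```python
# Hypothetical sketch only: the table and column names below are NOT the
# published schema; they illustrate how the cs16 metadata database and the
# charmstana_analysis (ca) values database combine for descriptive statistics.
# Requires mysql-connector-python.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost",
    user="researcher",
    password="secret",
    database="charmstana_sample_14_16",  # imported from charmstana_sample_14-16.sql
)
cur = conn.cursor()
cur.execute(
    """
    SELECT m.variable_label, COUNT(a.value) AS n_obs
    FROM variables AS m
    JOIN charmstana_analysis.observations AS a
      ON a.variable_id = m.variable_id
    GROUP BY m.variable_label
    """
)
for label, n_obs in cur.fetchall():
    print(label, n_obs)
conn.close()
```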


2021 ◽  
Vol 19 (1) ◽  
pp. 13
Author(s):  
Dimas Pamilih Epin Andrian ◽  
Dhomas Hatta Fudholi ◽  
Yudi Prayudi

Metadata is information embedded in a file that describes the file's contents; it is used for file management purposes. In various cases involving digital evidence, investigators can uncover a case through a file's metadata. Problems arise when metadata information is changed or deleted, for example when a file is shared via social media: in general, every file shared through social media undergoes changes in its metadata information. This study conducted a detailed analysis of changes in metadata information and hex dump values to determine the changing characteristics of metadata in files shared on social media. The method applied a comparison table to trace the details of changes in metadata values across all files and social media platforms used as research objects. The results of this study are expected to help forensic analysts identify the metadata characteristics of files shared on social media, so that the source of shared files can later be determined. Moreover, these findings are expected to let forensic analysts explore the social media used by cybercrime perpetrators.
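
A minimal sketch of the kind of before/after metadata comparison described, not the authors' tooling: it diffs the EXIF tags of an original image against the copy that came back from a social-media platform. The file names are placeholders.

```python
# Sketch: diff the EXIF metadata of an original image against the copy
# re-downloaded from a social-media platform. File names are placeholders.
# Requires Pillow.
from PIL import Image
from PIL.ExifTags import TAGS

def exif_dict(path):
    """Return EXIF tags as a {tag_name: value} dict (empty if metadata was stripped)."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

original = exif_dict("photo_original.jpg")
shared = exif_dict("photo_from_social_media.jpg")

# Report every tag that was added, removed, or rewritten by the platform.
for tag in sorted(set(original) | set(shared), key=str):
    if original.get(tag) != shared.get(tag):
        print(f"{tag}: {original.get(tag)!r} -> {shared.get(tag)!r}")
```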


2021 ◽  
Author(s):  
Janko Hriberšek

Transformation of BPMN business process models into smart contracts will be an important topic in the future of the Hyperledger Fabric blockchain environment. Automated transformation can increase the quality of the result and thus reduce errors. The research paper first describes BPMN and the Hyperledger Fabric environment, what smart contracts are, and why they are so important. In the second part, the raw transformation model is described, where the inputs to the transformation are the BPMN model and a metadata file, and the result is a smart contract written in Java that can be imported into the Hyperledger Fabric environment.
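
The paper's metadata schema is not given in the abstract, so the following is only a hypothetical illustration of what such a metadata file might carry: the Fabric-specific details (contract name, task-to-method mapping, authorized roles) that a plain BPMN diagram does not express, supplied as input to the BPMN-to-Java transformation.

```python
# Hypothetical illustration only: the paper does not publish its metadata schema.
# A metadata file along these lines could map BPMN task IDs onto chaincode
# methods and MSP roles for the generated Java smart contract.
import json

metadata = {
    "contractName": "OrderProcessContract",  # name of the generated Java contract
    "channel": "mychannel",                  # target Fabric channel
    "tasks": {
        "Task_ApproveOrder": {"method": "approveOrder", "roles": ["Org1MSP"]},
        "Task_ShipOrder": {"method": "shipOrder", "roles": ["Org2MSP"]},
    },
}

with open("transformation-metadata.json", "w") as fh:
    json.dump(metadata, fh, indent=2)
```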


2020 ◽  
Vol 5 (1) ◽  
pp. 119
Author(s):  
Erlin Erlin ◽  
Boby Hasbul Fikri ◽  
Susanti Susanti ◽  
Triyani Arita Fitri

Metadata files help users find relevant information, provide digital identification, and archive and preserve stored files so that they can easily be found and reused. The large number of data files on storage media often leaves users unaware of file duplication and redundancy, which wastes storage space and affects a computer's speed when indexing, finding, or backing up data. This study employs the Latent Semantic Analysis method to detect file duplication and analyze the metadata of various file types on storage media. The findings showed that the Latent Semantic Analysis method is able to detect duplicate file metadata across various types of storage media, thereby increasing the usability and access speed of the storage media.
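
A minimal sketch of LSA-style duplicate detection, not the authors' implementation: project TF-IDF vectors of file contents into a low-rank semantic space and flag pairs whose cosine similarity exceeds a threshold. The file names, toy contents, and threshold are assumptions.

```python
# Sketch of LSA-based duplicate detection (not the authors' implementation).
# Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = {
    "report_v1.txt": "quarterly sales report for region north with revenue tables",
    "report_copy.txt": "quarterly sales report for region north with revenue tables",
    "notes.txt": "meeting notes on storage cleanup and backup schedules",
}
names = list(docs)

tfidf = TfidfVectorizer().fit_transform(docs.values())       # term-document matrix
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
sim = cosine_similarity(lsa)                                 # pairwise similarity

for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if sim[i, j] > 0.95:  # threshold is a tunable assumption
            print(f"possible duplicates: {names[i]} <-> {names[j]}")
```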


2020 ◽  
Author(s):  
Adam Erickson ◽  
Benjamin Poulter ◽  
David Thompson ◽  
Gregory Okin ◽  
Shawn Serbin ◽  
...  

Quantifying the capacity, and uncertainty, of proposed spaceborne hyperspectral imagers to retrieve atmospheric and surface state information is necessary to optimize future satellite architectures for their science value. Given the vast potential joint trade-and-environment space, modeling key ‘globally representative’ points in this n-dimensional space is a practical solution for improving computational tractability. Given guidance from policy instruments such as the NASA Decadal Survey and the recommended Designated Target Observables, or DOs, the downselect process can be viewed as a constrained multi-objective optimization. The need to simulate imager architecture performance to achieve downselect goals has motivated the development of new mathematical models for estimating radiometric and retrieval uncertainties under conditions analogous to real-world environments. The goals can be met with recent advances that integrate mature atmospheric inversion approaches such as Optimal Estimation (OE), including joint atmospheric-surface state estimation (Thompson et al. 2018), and the EnMAP end-to-end simulation tool, EeteS (Segl et al. 2012), which utilizes OE for inversions. While surface-reflectance and retrieval simulation models are normally run in isolation on local computing environments, we extend these tools to enable uncertainty quantification in new representative environments, and thereby increase the robustness of the downselect process, by providing an advanced simulation model to the broader hyperspectral imaging community as software-as-a-service (SaaS). Here, we describe and demonstrate our instrument modeling web service and the corresponding hyperspectral traceability analysis (HyperTrace) library for Python. The modeling service and underlying HyperTrace OE library are deployed on the NASA DISCOVER high-performance computing (HPC) infrastructure. An intermediate HTTP server communicates between the FTP and HTTP servers, providing persistent archival of model inputs and outputs for subsequent meta-analyses. To facilitate community participation, users simply transfer a folder containing ENVI-format hyperspectral imagery and a corresponding JSON metadata file to the FTP server, from which it is pulled to a NASA DISCOVER server for processing; statistical, graphical, and ENVI-formatted results are subsequently returned to the FTP server, where they are available for users to download. This activity provides an expanded capability for estimating the science value of the various architectures under consideration for NASA's Surface Biology and Geology Designated Observable.
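
A sketch of the user-side upload step described above: pushing a folder of ENVI imagery plus its JSON metadata file to the service's FTP server. The host, credentials, folder name, and file layout are hypothetical; the real endpoint is whatever the service provides.

```python
# Sketch of the user-side upload: transfer a folder of ENVI imagery and its
# JSON metadata file to the service's FTP server. Host, credentials, and
# paths are placeholders. Uses only the standard library.
from ftplib import FTP
from pathlib import Path

folder = Path("my_scene")  # assumed to hold ENVI .img/.hdr files and metadata.json

with FTP("ftp.example.nasa.gov") as ftp:   # hypothetical host
    ftp.login(user="anonymous", passwd="user@example.com")
    ftp.mkd(folder.name)                   # one remote directory per scene
    ftp.cwd(folder.name)
    for path in sorted(folder.iterdir()):
        with open(path, "rb") as fh:
            ftp.storbinary(f"STOR {path.name}", fh)
```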


2020 ◽  
Author(s):  
Jeongho Lee ◽  
Yeji Kim ◽  
Jongmin Yeom ◽  
Seonyoung Park ◽  
Youkyung Han ◽  
...  

<div id="wrapper"><form id="tform" action="/card/cardsaveall.do" method="post" name="tform"> <div class="content-wrapper"> <div id="content" class="content"> <div id="wrapper"> <div class="content-wrapper"> <div id="content" class="content"> <div> <p>This study analyzed how the co-registration accuracy varies according to the angles of sensors when satellite images are acquired. We used two-step co-registration; coarse co-registration was conducted by using multi-spectral images with relatively lower resolution, and precise fine co-registration was conducted in the region of interest by using panchromatic images. In this study the mutual information method was used because the search area for image matching is restricted by the initial coordinate data in the metadata file, and the mutual information method shows high matching performance in the small search area. We tested the method using data set of 120 combinations with 16 KOMPSAT-3‧3A images acquired in Daejeon, Korea. And we analyzed the effects on the image co-registration accuracy by image acquisition angle factors such as azimuth, incidence, and convergence angles. Experimental results showed that the convergence angle mostly affects on the co-registration accuracy among the angle factors, which shows the overall correlation coefficient with the registration accuracy as 0.59.</p> </div> </div> </div> </div> </div> </div> </form></div>


2020 ◽  
Vol 11 (1) ◽  
pp. 77-95
Author(s):  
Vitthal Sadashiv Gutte ◽  
Sita Devulapalli

Correctness of data and efficient mechanisms for data security, while transferring files to and from the Cloud, are of paramount importance in today's cloud-centric processing. A mechanism for correctness and efficient transfer of data is proposed in this article. While processing users' data requests, a set of attributes is defined and checked. States with attributes at different levels are defined to prevent unauthorized access. Security is provided while storing the data using a chunk generation algorithm, with verification of the chunks by a lightweight Third-Party Auditor (TPA). The TPA verifies users' data using digital signatures generated by RSA with MD5 algorithms. The metadata file of the generated chunks is encrypted using a modified AES algorithm. The proposed method prevents unauthorized users from accessing data in the cloud environment, in addition to maintaining data integrity. Results of the proposed cloud security model implementation are discussed.
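
A sketch of the general scheme, under stated assumptions: fixed-size chunk generation with per-chunk MD5 digests that an auditor could re-check, and encryption of the resulting metadata file. The paper's modified AES and its RSA signing details are not public, so Fernet (an AES-128-based recipe from the `cryptography` package) stands in for the metadata encryption, and the chunk size is an arbitrary choice.

```python
# Sketch only: chunk a file, record per-chunk MD5 digests for later auditor
# verification, and encrypt the metadata file. Fernet stands in for the
# paper's modified AES; RSA signing of digests is omitted.
import hashlib
import json
from cryptography.fernet import Fernet

CHUNK_SIZE = 1 << 20  # 1 MiB; the real chunk-generation policy is unspecified

def chunk_digests(path):
    digests = []
    with open(path, "rb") as fh:
        while chunk := fh.read(CHUNK_SIZE):
            digests.append(hashlib.md5(chunk).hexdigest())
    return digests

metadata = {"file": "payload.bin", "chunks": chunk_digests("payload.bin")}

key = Fernet.generate_key()  # held by the data owner / shared with the auditor
token = Fernet(key).encrypt(json.dumps(metadata).encode())
with open("metadata.enc", "wb") as out:
    out.write(token)
```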


2019 ◽  
Vol 5 (2) ◽  
pp. 140
Author(s):  
Rachmad Fitriyanto ◽  
Anton Yudhana ◽  
Sunardi Sunardi

Management of jpeg/exif file fingerprint with Brute Force string matching algorithm and Hash Function SHA256

The current method of securing jpeg/exif image files covers only the prevention aspect, not the detection of data-integrity violations. The Digital Signature Algorithm (DSA) is a cryptographic method used to verify data integrity using hash values. SHA256 is a hash function that produces a 256-bit hash value functioning as a file fingerprint. This study aimed at compiling file fingerprints from jpeg/exif files using SHA256 and the Brute Force string matching algorithm to verify the integrity of jpeg/exif files. The research was conducted in five steps. The first step was identifying the jpeg/exif file structure. The second step was acquiring the segment contents. The third step was calculating the hash values. The fourth step was a jpeg/exif file modification experiment. The fifth step was selecting the elements and compiling the file fingerprint. The results showed that a jpeg/exif file fingerprint is composed of three hash values. The hash value of the SOI (Start of Image) segment is used to detect file modification in the form of file type changes and objects added to the image content. The hash value of the APP1 segment is used to detect modification of the file's metadata. The hash value of the SOF0 segment is used to detect images modified by recoloring, resizing, and cropping techniques.
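
A minimal sketch of the segment-hashing idea, not the authors' code: locate a JPEG segment by its marker and hash its bytes with SHA256, shown here for APP1 (0xFFE1), which carries the Exif metadata; SOI (0xFFD8) and SOF0 (0xFFC0) are handled analogously. The marker scan is simplified and assumes a single, well-formed APP1 segment.

```python
# Sketch: SHA256 fingerprint of the APP1 (Exif) segment of a JPEG file.
# Simplified: assumes one APP1 segment with a well-formed length field.
import hashlib

def app1_sha256(path):
    data = open(path, "rb").read()
    i = data.find(b"\xff\xe1")  # APP1 marker
    if i < 0:
        return None             # no Exif segment found
    # The 2-byte big-endian length field counts itself but not the marker.
    length = int.from_bytes(data[i + 2 : i + 4], "big")
    segment = data[i : i + 2 + length]
    return hashlib.sha256(segment).hexdigest()

print(app1_sha256("photo.jpg"))  # changes whenever the file's metadata changes
```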

