file size — Recently Published Documents

Total documents: 246 (five years: 96)
H-index: 17 (five years: 2)

2021 · Vol 5 (6) · pp. 1099-1105
Author(s): Desta Yolanda, Mohammad Hafiz Hersyah, Eno Marozi

Security monitoring systems using face recognition can be applied to CCTV or IP cameras. This is intended to improve the security system and make it easier for users to track criminals such as thieves. The experiment was carried out by detecting human faces for 24 hours using different cameras: an HD camera active during the day and a night-vision camera active at night. An unsupervised learning method based on the concept of image clusters is applied to distinguish the faces of known and unknown people according to the dataset built on the Raspberry Pi 4. The user interface of this system is a web-based application built with Python Flask and Python MySQL. The application can be accessed anywhere through the domain provided by the IP-forwarding device. According to the test results on storage optimization, the system saves files only when a face is detected, with an average file size of approximately 2.28 MB per 24 hours of streaming, so the storage process is more efficient and economical than that of CCTV or IP cameras in general.
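The known/unknown decision described above — comparing a new face against clusters built from the known-faces dataset — can be pictured as a nearest-centroid check. This is only an illustrative sketch, not the paper's code; the function name `classify_face`, the 3-dimensional embeddings, and the 0.6 distance threshold are all assumptions for the example.

```python
import math

def euclidean(a, b):
    # Euclidean distance between two equal-length embedding vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_face(embedding, known_centroids, threshold=0.6):
    """Label an embedding 'known' if it falls within `threshold` of any
    cluster centroid built from the known-faces dataset, else 'unknown'."""
    if not known_centroids:
        return "unknown"
    nearest = min(euclidean(embedding, c) for c in known_centroids)
    return "known" if nearest <= threshold else "unknown"

# toy centroids standing in for clusters learned from the known-faces dataset
centroids = [[0.1, 0.2, 0.3], [0.8, 0.7, 0.9]]
print(classify_face([0.12, 0.21, 0.29], centroids))  # known
print(classify_face([0.5, 0.0, 0.99], centroids))    # unknown
```

A real deployment would obtain the embeddings from a face-recognition model and tune the threshold on held-out data.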


2021
Author(s): Nehal M. Atallah, Michael S. Toss, Clare Verrill, Manuel Salto-Tellez, David Snead, ...

Abstract: Using digitalized whole slide images (WSI) in routine histopathology practice is a revolutionary technology. This study aims to assess the clinical impact of WSI quality and their representation of the corresponding glass slides. 40,160 breast WSIs were examined and compared with their corresponding glass slides. The presence, frequency, location, tissue type, and clinical impact of missing tissue were assessed. Scanning time, specimen type, time to WSI implementation, and quality control (QC) measures were also considered. The frequency of missing tissue ranged from 2% to 19%, and the area of the missed tissue from 1% to 70%. In most cases (>75%), the missing tissue area was <10% and peripherally located. In all cases the missed tissue was fat, with or without small amounts of entrapped normal breast parenchyma. No missing tissue was identified in WSIs of core biopsy specimens. QC measures improved image quality and reduced WSI failure rates seven-fold. A negative linear correlation between the frequency of missing tissue and both the scanning time and the image file size was observed (p < 0.05). None of the WSIs with missing tissue resulted in a change in the final diagnosis. Missing tissue on breast WSIs is observed, but with variable frequency and little diagnostic consequence. A balance between WSI quality and scanning time/image file size should be considered, and pathology laboratories should undertake their own risk assessments and provide the relevant mitigations with the appropriate level of caution.


2021 · Vol 2021 · pp. 1-9
Author(s): Jing Qiu, Xiaoxu Geng, Guanglu Sun

Firmware formats vary from vendor to vendor, making it difficult to track which vendor or device a firmware image belongs to, or to identify the firmware used in an embedded device. Current firmware analysis tools mainly distinguish firmware by static signatures in the firmware binary code. However, extracting a signature often requires careful analysis by professionals and a significant investment of time and effort. In this paper, we use Doc2Vec to extract and process the character information in firmware; combine the file size, file entropy, and the arithmetic mean of bytes as firmware features; and implement the firmware classifier with the Extra Trees model. The evaluation is performed on 1,190 firmware files from 5 router vendors. The accuracy of the classifier is 97.18%, higher than that of current approaches. The results show that the proposed approach is feasible and effective.
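The scalar features named above — file size, file entropy, and the arithmetic mean of byte values — have standard definitions and can be computed from the raw bytes. A minimal sketch (the function name is ours, not the paper's):

```python
import math
from collections import Counter

def firmware_features(data: bytes):
    """File size, Shannon entropy in bits/byte, and arithmetic mean of
    byte values — the scalar features combined with the Doc2Vec vectors."""
    size = len(data)
    counts = Counter(data)  # frequency of each distinct byte value
    entropy = -sum((c / size) * math.log2(c / size) for c in counts.values())
    mean = sum(data) / size
    return size, entropy, mean

size, ent, mean = firmware_features(bytes(range(256)))
print(size, round(ent, 2), mean)  # 256 8.0 127.5 (uniform bytes: max entropy)
```

High entropy close to 8 bits/byte typically indicates compressed or encrypted firmware sections, which is why entropy is a useful discriminating feature here.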


Cryptography · 2021 · Vol 5 (4) · pp. 37
Author(s): Noha E. El-Attar, Doaa S. El-Morshedy, Wael A. Awad

The need for cloud storage grows day after day due to its reliable and scalable nature. The storage and maintenance of user data at a remote location are serious issues due to the difficulty of ensuring data privacy and confidentiality. Some security issues within current cloud systems are managed by a cloud third party (CTP), who may turn into an untrustworthy insider. This paper presents an automated Encryption/Decryption System for Cloud Data Storage (AEDS) based on hybrid cryptography algorithms to improve data security and ensure confidentiality without interference from the CTP. Three encryption approaches are implemented to achieve high performance and efficiency: Automated Sequential Cryptography (ASC), Automated Random Cryptography (ARC), and Improved Automated Random Cryptography (IARC) for data blocks. In the IARC approach, we present a novel encryption strategy by converting the static S-box in the AES algorithm into a dynamic S-box. Furthermore, the RSA and Twofish algorithms are used to encrypt the generated keys to address privacy issues. We evaluated our approaches against existing symmetric-key algorithms such as DES, 3DES, and RC2. Although the two proposed ARC and ASC approaches are more complicated, they process data in less time than DES, 3DES, and RC2 and achieve better data throughput and confidentiality. ARC outperformed all of the other algorithms in the comparison: its encryption time was 22.58 s for a 500 MB file, while DES, 3DES, and RC2 completed the encryption in 44.43, 135.65, and 66.91 s, respectively, for the same file size. Nevertheless, when the file size increased to 2.2 GB, ASC proved its efficiency by completing the encryption in less time.
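The reported timings translate directly into throughput, which is how the abstract's "better data throughput" claim can be checked. A quick sketch over the 500 MB measurements quoted above:

```python
def throughput_mb_s(file_mb, seconds):
    """Throughput implied by an encryption timing measurement, in MB/s."""
    return file_mb / seconds

# encryption times for a 500 MB file, as reported in the abstract
timings = {"ARC": 22.58, "DES": 44.43, "3DES": 135.65, "RC2": 66.91}
for algo, t in sorted(timings.items(), key=lambda kv: kv[1]):
    print(f"{algo}: {throughput_mb_s(500, t):.1f} MB/s")
```

On these figures ARC sustains roughly twice the throughput of DES and about six times that of 3DES.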


2021 · Vol 5 (2) · pp. 187-195
Author(s): Ayu Shafira Tubagus, Rizal Saepul Mahdi, Adhi Rizal, Aries Suharso, ...

Video applications are among the largest consumers of energy on the Internet, and the rising consumption of high-resolution, high-quality video content on electronic devices puts serious pressure on delivery infrastructure, demanding better video compression technologies. The focus of this paper is to evaluate the quality of the most recent codec, AV1, against its predecessor codecs. The comparison was made experimentally at two video resolutions (1080p and 720p) by sampling video frames at various CRF/CQP values and analyzing several parameters: encoding duration, compression ratio, bit rate, Mean Square Error (MSE), and Peak Signal-to-Noise Ratio (PSNR). The AV1 codec performs very well in terms of quality and file size, although it is slower in compression speed. The H.265/HEVC codec, on the other hand, beats the other codecs in compression ratio. In conclusion, the H.265/HEVC codec is recommended for obtaining a well-compressed video with a small file size in a short time.
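Two of the quality metrics used in this comparison, MSE and PSNR, have standard definitions that can be sketched directly. This is a pure-Python illustration over flattened 8-bit pixel values, not the authors' evaluation code:

```python
import math

def mse(frame_a, frame_b):
    """Mean squared error between two equal-length 8-bit pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)

def psnr(frame_a, frame_b, max_value=255):
    """Peak signal-to-noise ratio in dB; infinite for identical frames."""
    error = mse(frame_a, frame_b)
    return float("inf") if error == 0 else 10 * math.log10(max_value ** 2 / error)

# tiny 2x2 "frames": original pixels vs. pixels after lossy encoding
original = [52, 55, 61, 59]
decoded = [54, 55, 60, 59]
print(round(psnr(original, decoded), 2))  # ~47 dB: very mild distortion
```

Higher PSNR means the decoded frame is closer to the original, which is why the metric is reported alongside bit rate when comparing codecs at matched CRF/CQP settings.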


2021
Author(s): Johannes Hevler

crosslinking.m: Method to run cross-linking samples on a timsTOF Pro instrument. The method and energies are specifically optimized for the PhoX cross-linker reagent (by Richard Scheltema and Markus Lubeck (Bruker)). Standard_DDA_PASEF_1.1sec_cycletime_2segm_1st_15min_nospectra.m: Method for classical bottom-up proteomics experiments, optimized for LC systems that are operated without a trap and are equipped with a 5 µL sample loop (flow rate 0.400 µL/min). The method is segmented: during its first 15 min, no spectra are saved. To further reduce the file size, noise filtering for TIMS is turned on. (by Richard Scheltema and Markus Lubeck (Bruker))


Author(s): Dohyoen Lee, Giltae Song

Abstract — Motivation: Over the past decades, vast amounts of genome sequencing data have been produced, requiring an enormous level of storage capacity. The time and resources needed to store and transfer such data cause bottlenecks in genome sequencing analysis. To resolve this issue, various compression techniques have been proposed to reduce the size of original FASTQ raw sequencing data, but these remain suboptimal. Long-read sequencing has become dominant in genomics, whereas most existing compression methods focus on short-read sequencing only. Results: We designed a compression algorithm based on read reordering, using a novel scoring model to reduce FASTQ file size with no information loss. We integrated all data processing steps into a software package called FastqCLS and provide it as a Docker image for ease of installation and execution. We compared our method with existing major FASTQ compression tools using benchmark datasets, including new long-read sequencing data in the validation. As a result, FastqCLS outperformed the other tools in compression ratio for long-read sequencing data. Availability and implementation: FastqCLS can be downloaded from https://github.com/krlucete/FastqCLS. Supplementary information: Supplementary data are available at Bioinformatics online.
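The intuition behind compression by read reordering can be shown with a toy example: placing similar reads next to each other lets a dictionary coder find longer matches. This sketch uses plain lexicographic sorting and zlib as stand-ins; FastqCLS's actual scoring model and compressor differ.

```python
import random
import zlib

def compressed_size(reads):
    # size in bytes of the concatenated reads after DEFLATE compression
    return len(zlib.compress("".join(reads).encode(), 9))

# toy "reads": repeated copies of a few motifs, shuffled to mimic the
# arbitrary read order of a raw FASTQ file
random.seed(0)
reads = ["ACGTACGTAC", "TTGGCCAATT", "GGGAAACCCT"] * 40
shuffled = reads[:]
random.shuffle(shuffled)

# Sorting clusters identical/similar reads into runs, so the coder emits a
# few long matches instead of many short ones — the effect a reordering
# scoring model is designed to maximize.
print(compressed_size(shuffled), compressed_size(sorted(shuffled)))
```

On real FASTQ data the reordering must also keep quality strings and headers recoverable, which is why a dedicated scoring model rather than a plain sort is needed for lossless operation.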


2021 · Vol 2 (1) · pp. 1-14
Author(s): Ramesh Paudyal, Subarna Shakya

Due to rapid technological advancement, traditional e-government systems are becoming obsolete because of their inherent limitations in interoperability and accessibility to highly secured and flexible e-governance services. Migration of such systems into a highly secured cloud governance architecture is a long-term viable solution. However, the adoption of distributed cloud computing has created operational and security challenges. This research work aims to bridge the gap between traditional and cloud-based e-government systems in terms of data security, based on the confidentiality, interoperability, and mobility of data among distributed databases in cloud computing environments. In this work, we created two organization databases using AWS EC2 instances and classified the data based on the Risk Impact Level (RIL) of the data using a Metadata Attribute Value (MAV) function. To further enhance security on the classified data, we take appropriate security action based on the sensitivity of the data. For analysis purposes, we implemented different security algorithms, i.e., AES, DES, and RSA, for the mobility of data between the two distributed cloud databases. We measured the encryption and decryption time along with the file size of the data before and after classification. AES performed better with respect to encryption time and file size, but the overall performance of RSA was better for smaller file sizes. Finally, the performance of data mobility between the two distributed cloud databases was analyzed while maintaining the sensitivity level of the data.
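One way a metadata-driven classification like the MAV function could map records to a Risk Impact Level is sketched below. The attribute names, weights, and thresholds here are purely hypothetical illustrations, not the paper's actual function.

```python
# Hypothetical sketch: score a record's metadata attributes and map the
# total to a Risk Impact Level, which then selects the protection applied
# before the record moves between the two cloud databases.
SENSITIVE_ATTRIBUTES = {"citizen_id": 3, "health_record": 3, "address": 2, "name": 1}

def risk_impact_level(record_fields):
    score = sum(SENSITIVE_ATTRIBUTES.get(f, 0) for f in record_fields)
    if score >= 5:
        return "high"    # e.g. strongest cipher before transfer
    if score >= 2:
        return "medium"  # e.g. standard symmetric encryption
    return "low"         # e.g. transport-level security only

print(risk_impact_level(["citizen_id", "health_record"]))  # high
print(risk_impact_level(["name"]))                         # low
```

Classifying first and encrypting per level matches the abstract's point that security effort (and hence encryption time) should track data sensitivity rather than being uniform.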


2021 · Vol 10 (4) · pp. 2192-2200
Author(s): Theda Flare Ginoy Quilala, Rogel L. Quilala

This study analyzed and enhanced the modified Blowfish algorithm (MBA) encryption. The modification retained the original structure, process, and use of two S-boxes in the MBA but presented two derivation processes in the f-function, which was originally placed to prevent symmetry. The performance of each derivation case was analyzed using the avalanche effect and time efficiency. After comparing the first and second derivation processes presented in the MBA, the second derivation further improved the avalanche effect by 5.47%, thus improving security. The results also showed that the second modification is 39.48% faster in encryption time and 38.34% faster in decryption time. The first derivation case in the modified Blowfish was slower because of the difference in the placement of the shift rotation. The key generation time was found to be independent of the input size, while the encryption and decryption times were directly proportional to the file size. With this, the second modification is considered the better one.
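The avalanche effect used to compare the two derivation cases measures how many ciphertext bits flip when a single plaintext bit flips (ideally about 50%). A generic measurement sketch follows; SHA-256 stands in for the modified Blowfish, which this sketch does not reproduce.

```python
import hashlib

def avalanche_percent(cipher, plaintext: bytes, bit_index: int = 0):
    """Percentage of output bits that flip when one input bit flips.
    `cipher` is any bytes -> bytes function."""
    flipped = bytearray(plaintext)
    flipped[bit_index // 8] ^= 1 << (bit_index % 8)  # flip a single bit
    c1, c2 = cipher(plaintext), cipher(bytes(flipped))
    diff_bits = sum(bin(a ^ b).count("1") for a, b in zip(c1, c2))
    return 100 * diff_bits / (8 * len(c1))

# SHA-256 as a stand-in primitive with strong diffusion
digest = lambda m: hashlib.sha256(m).digest()
print(round(avalanche_percent(digest, b"example plaintext"), 2))
```

In a study like this one, the same measurement would be repeated over many plaintexts and bit positions for each derivation case, and the averages compared.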

