compressed storage
Recently Published Documents


TOTAL DOCUMENTS

38
(FIVE YEARS 5)

H-INDEX

6
(FIVE YEARS 0)

Author(s):  
B. Satheesh, et al.

Mining of frequent patterns in transaction databases, time-series databases, and many other kinds of databases has been studied extensively in data mining research. Most previous studies adopt an Apriori-like candidate-set generation-and-test approach. In this study, we propose a compact frequency tree, an extended prefix-tree structure for the compressed storage of crucial information about frequent patterns, and develop an efficient FP-tree-based mining method, FP-growth, which mines the complete set of frequent patterns by pattern-fragment growth. Three mining techniques work together: the large database is compressed into the compact FP-tree structure, which avoids costly repeated database scans; the FP-tree-based mining adopts a pattern-fragment growth method to avoid the expensive generation of a large number of candidate sets; and a partitioning-based, divide-and-conquer method decomposes the mining task into conditional databases for mining confined patterns, which substantially reduces the search space.
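The FP-tree construction and pattern-fragment growth described above can be sketched as follows. This is a minimal illustration of the general FP-growth technique, not the authors' implementation; the class name `FPNode` and the tie-breaking order are our own choices.

```python
from collections import defaultdict

class FPNode:
    """One node of the prefix tree: an item, a count, and child links."""
    def __init__(self, item, parent):
        self.item = item
        self.count = 0
        self.parent = parent
        self.children = {}

def build_tree(transactions, min_support):
    # Count item frequencies and drop infrequent items up front.
    freq = defaultdict(int)
    for t in transactions:
        for item in t:
            freq[item] += 1
    freq = {i: c for i, c in freq.items() if c >= min_support}
    root = FPNode(None, None)
    header = defaultdict(list)  # item -> all nodes carrying that item
    for t in transactions:
        # Sort by descending frequency so common prefixes are shared,
        # which is what makes the tree a compressed store.
        items = sorted((i for i in t if i in freq),
                       key=lambda i: (-freq[i], i))
        node = root
        for item in items:
            if item not in node.children:
                child = FPNode(item, node)
                node.children[item] = child
                header[item].append(child)
            node = node.children[item]
            node.count += 1
    return root, header, freq

def fp_growth(transactions, min_support, suffix=()):
    """Mine all frequent patterns by recursing on conditional databases."""
    _, header, freq = build_tree(transactions, min_support)
    patterns = {}
    for item in sorted(freq, key=lambda i: (freq[i], i)):
        support = sum(n.count for n in header[item])
        pattern = (item,) + suffix
        patterns[frozenset(pattern)] = support
        # Conditional pattern base: the prefix path above each node,
        # repeated once per occurrence (divide-and-conquer step).
        cond_db = []
        for node in header[item]:
            path, p = [], node.parent
            while p.item is not None:
                path.append(p.item)
                p = p.parent
            cond_db.extend([path] * node.count)
        patterns.update(fp_growth(cond_db, min_support, pattern))
    return patterns
```

No candidate sets are ever generated: each recursion mines a smaller conditional database restricted to one pattern fragment.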


2020 ◽  
Vol 198 ◽  
pp. 04032
Author(s):  
Changhu Zhou ◽  
Zhenyu Chen ◽  
Xiaotian Lv ◽  
Derun Gao ◽  
Mengxue Zhao

Garbage sorting bears on many issues, including the living environment, resource conservation, and social civilization. To address this problem, an intelligent sorting dustbin was designed on a mechanical structure of our own design, using the STM32F103ZET6 chip, the LD3320 speech-recognition module, and an ultrasonic module. The dustbin identifies garbage types, compresses recyclable garbage for storage, and seals bags automatically. The design not only protects the environment and reduces the risk of disease, but also makes resources recyclable, indirectly bringing additional benefits to humans.
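The control flow implied above (recognize a garbage type, route it to a compartment, compress or seal when full) can be sketched very loosely as follows. The real design runs in firmware on the STM32F103ZET6; the keyword table, compartment names, and fill-level threshold here are illustrative assumptions, not taken from the paper.

```python
# Hypothetical keyword-to-compartment table; the LD3320 module would
# supply the recognized keyword in the actual device.
KEYWORD_TO_COMPARTMENT = {
    "bottle": "recyclable",
    "paper": "recyclable",
    "battery": "hazardous",
    "food": "kitchen waste",
}

def route_garbage(recognized_keyword, default="other"):
    """Map a recognized keyword to a target compartment."""
    return KEYWORD_TO_COMPARTMENT.get(recognized_keyword, default)

def bin_full(distance_cm, threshold_cm=8.0):
    """Interpret an ultrasonic range reading: a short distance from a
    lid-mounted sensor to the garbage surface means the bin is full,
    which would trigger compression (recyclables) or bag sealing."""
    return distance_cm <= threshold_cm
```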


2018 ◽  
Vol 7 (2.19) ◽  
pp. 80
Author(s):  
G D.Kesavan ◽  
P N.Karthikayan

Using cache memory reduces the overall memory-access time needed to fetch data. Because cache use directly affects system performance, the caching process should take as little time as possible. Many cache-optimization techniques exist to speed it up, such as reducing the miss rate, reducing the miss penalty, and reducing the time to hit in the cache. Recent advances have paved the way for compressing data in the cache, exploiting recent data-use patterns, and so on. These techniques focus on increasing cache capacity or on replacement policies, resulting in a higher hit ratio; however, existing cache compression and optimization schemes address only capacity- and replacement-related issues. This paper deals with scheduling cache-memory requests according to the compressed cache organization, so that cache searching and indexing take considerably less time and requests are served faster. For capacity and replacement improvements, dictionary-sharing-based caching is used: multiple requests are foreseen by a prefetcher and searched according to the cache organization, promoting an easier indexing process. The benefit comes from both compressed storage and easier storage access.
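The dictionary-sharing idea mentioned above can be modeled in miniature: distinct values are stored once in a dictionary shared across cache lines, and each line holds only small indices into it. This is a toy software sketch of the general technique under our own assumptions, not the hardware organization the paper proposes.

```python
class DictSharedCache:
    """Toy model of dictionary-shared cache compression: each distinct
    word is stored once in a shared pool, and every cached line stores
    compact dictionary indices instead of full words."""

    def __init__(self):
        self.dictionary = []   # shared pool of distinct words
        self.index_of = {}     # word -> slot in the shared pool
        self.lines = {}        # line tag -> list of pool indices

    def store(self, tag, words):
        """Encode a line: repeated words across lines share one entry."""
        encoded = []
        for w in words:
            if w not in self.index_of:
                self.index_of[w] = len(self.dictionary)
                self.dictionary.append(w)
            encoded.append(self.index_of[w])
        self.lines[tag] = encoded

    def load(self, tag):
        """Decode a line back into its original words."""
        return [self.dictionary[i] for i in self.lines[tag]]
```

Because lines share the pool, the more value repetition there is across lines, the higher the effective capacity, which is the compression benefit the scheduling scheme then exploits for faster indexing.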


2017 ◽  
Vol 72 ◽  
pp. 179-204 ◽  
Author(s):  
Susana Ladra ◽  
José R. Paramá ◽  
Fernando Silva-Coira
