Weak Locality — Recently Published Documents

TOTAL DOCUMENTS: 5 (five years: 1)
H-INDEX: 1 (five years: 0)

2021 ◽  
pp. 1-36
Author(s):  
Khabat Soltanian ◽  
Ali Ebnenasir ◽  
Mohsen Afsharchi

Abstract: This paper presents a novel method, called Modular Grammatical Evolution (MGE), towards validating the hypothesis that restricting the solution space of NeuroEvolution to modular and simple neural networks enables the efficient generation of smaller and more structured neural networks while providing acceptable (and in some cases superior) accuracy on large data sets. MGE also enhances state-of-the-art Grammatical Evolution (GE) methods in two directions. First, MGE's representation is modular in that each individual has a set of genes, and each gene is mapped to a neuron by grammatical rules. Second, the proposed representation mitigates two important drawbacks of GE, namely low scalability and weak locality of representation, towards generating modular, multi-layer networks with a high number of neurons. We define and evaluate five different forms of structures, with and without modularity, using MGE, and find single-layer modules with no coupling to be the most productive. Our experiments demonstrate that modularity helps in finding better neural networks faster. We validate the proposed method on ten well-known classification benchmarks with different sizes, feature counts, and output class counts. Our experimental results indicate that MGE provides superior accuracy with respect to existing NeuroEvolution methods and returns classifiers that are significantly simpler than those generated by other machine learning methods. Finally, we empirically demonstrate that MGE outperforms other GE methods in terms of locality and scalability.
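To make the genotype-to-phenotype idea concrete, the following is a minimal sketch of the *standard* GE mapping that MGE builds on: integer codons select grammar productions via a modulo rule. This is an illustration of conventional GE, not the paper's modular representation (MGE's per-gene, per-neuron mapping differs in detail); the toy grammar and function names are assumptions for the example.

```python
# Minimal sketch of the standard Grammatical Evolution (GE)
# genotype-to-phenotype mapping. Toy grammar: arithmetic expressions.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["<var>"]],
    "<var>": [["x"], ["y"]],
}

def ge_map(codons, start="<expr>", max_wraps=2):
    """Map a list of integer codons to a derivation string.

    Each codon picks a production for the leftmost non-terminal via
    modulo over the number of available productions.
    """
    symbols = [start]   # sentential form, expanded leftmost-first
    out = []
    i, wraps = 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym not in GRAMMAR:          # terminal: emit it
            out.append(sym)
            continue
        if i >= len(codons):            # wrap the codon stream
            if wraps >= max_wraps:
                return None             # mapping failed
            i, wraps = 0, wraps + 1
        choices = GRAMMAR[sym]
        production = choices[codons[i] % len(choices)]
        i += 1
        symbols = production + symbols  # expand in place
    return "".join(out)
```

The modulo rule is the source of GE's weak locality that the abstract mentions: changing one early codon can redirect a production choice and rewrite the entire remaining derivation.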



2019 ◽  
Vol 09 (09) ◽  
pp. 1108-1113
Author(s):  
晓培 陈


2018 ◽  
Vol 1 (1) ◽  
pp. 23-26
Author(s):  
Arjun Singh Saud

The Least Recently Used (LRU) policy relies solely on recency, which makes it misbehave under workloads with weak locality. Taking the frequency of each page reference into account improves performance when the workload has weak locality, but a raw frequency count causes its own problems over a long reference stream because it cannot adapt to changes in locality. The reuse distance, or inter-reference recency (IRR), of a block is the number of distinct pages accessed between two consecutive (correlated) references to that block. Many recent variants of LRU use IRR rather than pure recency so that LRU can be made friendly to weak-locality workloads. This paper surveys LRU variants that use IRR to make page replacement decisions.
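The IRR definition above can be sketched directly: for each access, count the distinct pages touched between it and the previous access to the same page. This is a minimal illustration of the metric itself, not any particular replacement policy from the survey; the function name and trace format are assumptions for the example.

```python
# Minimal sketch: inter-reference recency (IRR) of each access in a
# page-reference stream. IRR = number of distinct pages accessed
# strictly between two consecutive references to the same page.
def irr_trace(references):
    """Return the IRR of each access (None on a page's first access)."""
    last_seen = {}   # page -> index of its previous access
    result = []
    for i, page in enumerate(references):
        if page in last_seen:
            # distinct pages touched between the two accesses
            between = set(references[last_seen[page] + 1 : i])
            result.append(len(between))
        else:
            result.append(None)   # first access: IRR undefined
        last_seen[page] = i
    return result
```

For the stream a, b, c, a the final access to `a` has IRR 2 (the distinct pages b and c lie between its two references); a policy like LIRS uses exactly this quantity, rather than recency alone, to decide which blocks to keep.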




