Characterization and cost-efficient selection of NoC topologies for general purpose CMPs

Author(s): Marta Ortín, Alexandra Ferrerón, Jorge Albericio, Darío Suárez, María Villarroya-Gaudó, ...
2011, Vol 76 (1), pp. 88-94
Author(s): Jamie S. Sanderlin, Nicole Lazar, Michael J. Conroy, Jaxk Reeves
2010
Author(s): Corné Hoogendoorn, Tristan Whitmarsh, Nicolas Duchateau, Federico M. Sukno, Mathieu De Craene, ...
2015, Vol 65 (10), pp. A122
Author(s): Joseph Gibbs, Carlos Calle-Muller, Matthew Cerasale, Tarun Jain, Sagger Mawri, ...
2021, Vol 12 (1)
Author(s): Weiwei Gu, Aditya Tandon, Yong-Yeol Ahn, Filippo Radicchi

Abstract
Network embedding is a general-purpose machine learning technique that encodes network structure in vector spaces with tunable dimension. Choosing an appropriate embedding dimension, small enough to be efficient and large enough to be effective, is challenging but necessary to generate embeddings applicable to a multitude of tasks. Existing strategies for selecting the embedding dimension rely on maximizing performance in downstream tasks. Here, we propose a principled method that parsimoniously encodes all structural information of a network. The method is validated on various embedding algorithms and a large corpus of real-world networks. The embedding dimensions selected by our method on real-world networks suggest that efficient encoding in low-dimensional spaces is usually possible.
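To make the idea of selecting an embedding dimension concrete, the sketch below is a rough illustration, not the authors' method: it uses a plain spectral embedding as a stand-in for any embedding algorithm, and picks the smallest dimension whose pairwise-distance structure already agrees closely with a higher-dimensional reference. The functions `spectral_embedding` and `select_dimension`, the correlation criterion, and the 0.95 threshold are all hypothetical choices for this example.

```python
import numpy as np

def spectral_embedding(A, d):
    # Illustrative embedding: the first d nontrivial eigenvectors of the
    # graph Laplacian (assumes A is the symmetric adjacency matrix of a
    # connected graph).
    deg = A.sum(axis=1)
    L = np.diag(deg) - A
    _, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    return vecs[:, 1:d + 1]            # skip the constant eigenvector

def pairwise_dists(X):
    # Upper-triangular vector of Euclidean distances between all node pairs.
    n = X.shape[0]
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return D[np.triu_indices(n, k=1)]

def select_dimension(A, d_max, tol=0.95):
    # Smallest d whose pairwise-distance structure correlates with the
    # d_max-dimensional reference at level tol or above (hypothetical
    # criterion, chosen only for illustration).
    ref = pairwise_dists(spectral_embedding(A, d_max))
    for d in range(1, d_max + 1):
        cur = pairwise_dists(spectral_embedding(A, d))
        if np.corrcoef(cur, ref)[0, 1] >= tol:
            return d
    return d_max
```

For example, on a small graph made of two 4-cliques joined by a single edge, `select_dimension(A, d_max=5)` returns a dimension between 1 and 5; the community structure is typically captured by the first few eigenvectors, so a low dimension suffices.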

