Throughput Maximization of Delay-Aware DNN Inference in Edge Computing by Exploring DNN Model Partitioning and Inference Parallelism
2020 ◽ Vol 140 (9) ◽ pp. 1030-1039