Deep Reinforcement Learning Based Resource Allocation with Remote Radio Head Grouping and Vehicle Clustering in 5G Vehicular Networks
With increasing data traffic demands in vehicular networks, vehicle-to-everything (V2X) communication has become imperative for improving road safety and guaranteeing reliable, low-latency services. However, V2X communication is highly susceptible to interference under the rapidly changing channel states of high-mobility vehicular environments. Effective interference management in such environments calls for deep reinforcement learning (DRL) to allocate communication resources. In addition, a vehicle clustering technique is required to improve system capacity and reduce the energy consumed by the traffic overhead of periodic messages. In this paper, a DRL-based resource allocation method with remote radio head grouping and vehicle clustering is proposed to maximize system energy efficiency while accounting for quality of service (QoS) and reliability. Simulations compare the proposed algorithm with three existing algorithms, and it outperforms all of them in terms of average signal-to-interference-plus-noise ratio (SINR), achievable data rate, and system energy efficiency.