Summary

Load balancing is critical to the overall performance of parallel and distributed computing systems. Although the relevant communication and computation problems have been well studied in data center environments, few works have considered these issues in IoT edge scenarios. In fact, load-balanced data processing is more challenging in the latter setting. The main reason is that, unlike a data center, both the data sources and the network infrastructure in an IoT edge system can be dynamic. Moreover, because IoT networks and edge servers impose different performance requirements, it is hard to characterize a performance model and to perform runtime optimization for the whole system. To tackle this problem, we propose a load-balancing-aware networking approach for efficient data processing in IoT edge systems. Specifically, we introduce a dynamic clustering solution for IoT networks based on deep reinforcement learning (DRL), which fulfills both the communication-balancing requirements of IoT networks and the computation-balancing requirements of edge servers. Moreover, we implement our system with a long short-term memory (LSTM) based Dueling Double Deep Q-Network (D3QN) model, and our experiments with real-world datasets collected from an autonomous vehicle demonstrate that our proposed method achieves significant performance improvements over a benchmark solution.
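
The summary names the model family but gives no architectural details. As a rough illustration only, the sketch below shows what an LSTM-based dueling double deep Q-network typically looks like in PyTorch: an LSTM encodes a history of system-state observations, dueling heads split the Q-value into a state value and per-action advantages, and the double-DQN target decouples action selection from action evaluation. All names, dimensions, and layer sizes (`LSTMDuelingDQN`, `state_dim`, `num_actions`, `hidden_dim`) are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class LSTMDuelingDQN(nn.Module):
    """Hypothetical LSTM-based dueling Q-network (not the paper's code).

    `state_dim`: per-step feature size of the observed IoT/edge state.
    `num_actions`: number of candidate clustering decisions.
    """

    def __init__(self, state_dim: int, num_actions: int, hidden_dim: int = 128):
        super().__init__()
        # LSTM encodes a sequence of recent system states (e.g., load readings).
        self.lstm = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        # Dueling heads: scalar state value V(s) and per-action advantages A(s, a).
        self.value_head = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )
        self.advantage_head = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, num_actions)
        )

    def forward(self, state_seq: torch.Tensor) -> torch.Tensor:
        # state_seq: (batch, seq_len, state_dim); keep only the final hidden state.
        _, (h_n, _) = self.lstm(state_seq)
        h = h_n[-1]                                     # (batch, hidden_dim)
        value = self.value_head(h)                      # (batch, 1)
        advantage = self.advantage_head(h)              # (batch, num_actions)
        # Standard dueling aggregation: Q(s, a) = V(s) + (A(s, a) - mean_a A(s, a)).
        return value + advantage - advantage.mean(dim=1, keepdim=True)


def double_dqn_target(online_net, target_net, next_seq, reward, done, gamma=0.99):
    """Double-DQN target: the online net picks the action, the target net scores it."""
    with torch.no_grad():
        next_actions = online_net(next_seq).argmax(dim=1, keepdim=True)
        next_q = target_net(next_seq).gather(1, next_actions).squeeze(1)
        return reward + gamma * (1.0 - done) * next_q
```

Under these assumptions, the design matches the motivation in the summary: the dueling and double-DQN components curb Q-value overestimation, while the LSTM lets the agent condition its clustering decisions on a history of observations, which suits the dynamic data sources and network infrastructure the summary emphasizes.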