Abstract: Task assignment in a distributed intrusion detection system (DIDS) deployed in an edge computing environment, where node performance is limited, is a typical resource-constrained project scheduling problem (RCPSP). To solve this problem, a task scheduling scheme based on the Markov Decision Process (MDP) is proposed. First, the performance of the DIDS detection engines and the load generated by incoming packets are quantitatively evaluated; then the state space and action space of the model are constructed. Finally, a state-action value function is established to determine the optimal strategy for keeping the DIDS in a low-load state. Following this strategy, the scheduler matches detection engines of different performance levels with data packets of different load levels. In addition, to address the increase in packet loss rate caused by sporadic traffic surges, a method is proposed to balance the two conflicting indicators of low load and packet loss rate. Experimental results show that the proposed scheme enables the DIDS to dynamically adjust its scheduling strategy as network conditions change, maintaining an overall low system load while security indicators are not significantly degraded compared with other algorithms.
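To make the scheduling idea concrete, the sketch below sets up a toy MDP whose states are discretized DIDS load levels and whose actions assign the next packet batch to a detection engine of a given performance level; value iteration then yields a state-action value function and a greedy low-load policy. This is a minimal illustration only, not the paper's implementation: the state/action definitions, transition probabilities, and reward weights are all assumptions.

```python
# Minimal illustrative sketch (assumed, not the paper's model): a toy MDP in which
# the scheduler chooses which detection-engine performance level handles the next
# packet batch, aiming to keep the overall DIDS load low.
import numpy as np

# States: discretized overall system load (0 = low, 1 = medium, 2 = high).
N_STATES = 3
# Actions: assign the incoming packet batch to an engine of a given
# performance level (0 = low-performance, 1 = medium, 2 = high-performance).
N_ACTIONS = 3

GAMMA = 0.9   # discount factor
THETA = 1e-6  # convergence threshold for value iteration

# Assumed transition model P[s, a, s']: assigning work to a stronger engine
# makes it more likely that the system load drops and less likely that it rises.
P = np.zeros((N_STATES, N_ACTIONS, N_STATES))
for s in range(N_STATES):
    for a in range(N_ACTIONS):
        p_down = 0.2 + 0.25 * a   # stronger engine -> load more likely to drop
        p_up   = 0.4 - 0.15 * a   # stronger engine -> load less likely to rise
        p_stay = 1.0 - p_down - p_up
        P[s, a, max(s - 1, 0)]          += p_down
        P[s, a, min(s + 1, N_STATES-1)] += p_up
        P[s, a, s]                      += p_stay

# Assumed reward: penalize high load and, more mildly, the cost of occupying
# a high-performance engine (the resource-constraint side of the trade-off).
R = np.zeros((N_STATES, N_ACTIONS))
for s in range(N_STATES):
    for a in range(N_ACTIONS):
        expected_next_load = P[s, a] @ np.arange(N_STATES)
        R[s, a] = -expected_next_load - 0.1 * a

# Value iteration: compute the state-action value function Q(s, a) and derive
# the greedy scheduling policy that keeps the DIDS in a low-load state.
V = np.zeros(N_STATES)
while True:
    Q = R + GAMMA * (P @ V)        # Q[s, a] = R[s, a] + gamma * sum_s' P[s,a,s'] V[s']
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < THETA:
        break
    V = V_new

policy = Q.argmax(axis=1)
for s, a in enumerate(policy):
    print(f"load level {s} -> assign next batch to engine performance level {a}")
```

Under these assumed parameters the greedy policy routes batches to stronger engines as the system load rises, which mirrors the low-load objective described in the abstract; handling sporadic traffic surges and the packet-loss trade-off would require the additional balancing method the paper proposes.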