Abstract In this work, LTE handover algorithms were implemented by modifying the time-to-trigger (TTT) with a random factor drawn from a Cauchy distribution. A fuzzy type-2 Q-learning optimization technique is used to solve the handover algorithms. The main objectives of the work are to minimize the average number of handovers, maximize the system throughput, and minimize the system delay. The suggested algorithm (fuzzy type-2 Q-learning optimization) was simulated in MATLAB and tested on LTE handover algorithms, while fuzzy type-2 logic optimization was applied to LTE-A handover algorithms and CoMP handover algorithms. The Q-learning technique shows great efficiency in finding an optimal solution for the well-known handover algorithms. Simulation results show that the Q-learning technique achieves fewer handovers and lower total system delay while maintaining higher total system throughput than previous work.
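The core idea the abstract describes — perturbing the TTT with a Cauchy-distributed random factor and letting Q-learning pick among candidate TTT values — can be illustrated with a minimal sketch. This is not the thesis implementation (which used MATLAB and fuzzy type-2 membership functions); the TTT candidate set, the reward function, and the scale of the Cauchy factor below are all illustrative assumptions.

```python
import math
import random

random.seed(0)  # for reproducibility of this sketch

# Candidate TTT values in ms (a 3GPP-style discrete set; assumed here)
TTT_VALUES = [40, 64, 80, 100, 128, 160, 256, 320]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

def cauchy_factor(scale=0.1):
    """Draw a random factor from a Cauchy distribution via inverse CDF."""
    return scale * math.tan(math.pi * (random.random() - 0.5))

def reward(effective_ttt):
    # Hypothetical reward: penalize very short TTT (ping-pong handovers)
    # and very long TTT (handover failure / added delay).
    return -abs(effective_ttt - 128.0) / 128.0

# Single-state (bandit-style) Q-table over the candidate TTT values
q = {ttt: 0.0 for ttt in TTT_VALUES}

for episode in range(1000):
    # Epsilon-greedy selection of a base TTT value
    if random.random() < EPSILON:
        ttt = random.choice(TTT_VALUES)
    else:
        ttt = max(q, key=q.get)
    # Perturb the triggering timer with the Cauchy random factor
    effective_ttt = ttt * (1.0 + cauchy_factor())
    r = reward(effective_ttt)
    # Q-learning update (single state, so max is over the same table)
    q[ttt] += ALPHA * (r + GAMMA * max(q.values()) - q[ttt])

best_ttt = max(q, key=q.get)
```

The heavy tails of the Cauchy distribution occasionally produce large perturbations of the TTT, which is what distinguishes it from a Gaussian perturbation; the learner must therefore be robust to outlier rewards.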