TY - GEN
T1 - Parameters optimization of deep learning models using Particle swarm optimization
AU - Qolomany, Basheer
AU - Maabreh, Majdi
AU - Al-Fuqaha, Ala
AU - Gupta, Ajay
AU - Benhaddou, Driss
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/7/19
Y1 - 2017/7/19
N2 - Deep learning has been successfully applied in several fields, such as machine translation, manufacturing, and pattern recognition. However, successful application of deep learning depends upon appropriately setting its parameters to achieve high-quality results. The number of hidden layers and the number of neurons in each layer of a deep machine learning network are two key parameters that have a major influence on the performance of the algorithm. Manual parameter setting and grid search approaches somewhat ease the users' tasks in setting these important parameters. Nonetheless, these two techniques can be very time-consuming. In this paper, we show that the particle swarm optimization (PSO) technique holds great potential to optimize parameter settings and thus save valuable computational resources during the tuning process of deep learning models. Specifically, we use a dataset collected from a campus Wi-Fi network to train deep learning models to predict the number of occupants and their locations. Our preliminary experiments indicate that, compared to the grid search method, PSO provides an efficient approach for tuning the optimal number of hidden layers and the number of neurons in each layer of the deep learning algorithm. Our experiments illustrate that the exploration of the configuration landscape required to find the optimal parameters is reduced by 77%-85%. In fact, PSO yields even better accuracy results.
AB - Deep learning has been successfully applied in several fields, such as machine translation, manufacturing, and pattern recognition. However, successful application of deep learning depends upon appropriately setting its parameters to achieve high-quality results. The number of hidden layers and the number of neurons in each layer of a deep machine learning network are two key parameters that have a major influence on the performance of the algorithm. Manual parameter setting and grid search approaches somewhat ease the users' tasks in setting these important parameters. Nonetheless, these two techniques can be very time-consuming. In this paper, we show that the particle swarm optimization (PSO) technique holds great potential to optimize parameter settings and thus save valuable computational resources during the tuning process of deep learning models. Specifically, we use a dataset collected from a campus Wi-Fi network to train deep learning models to predict the number of occupants and their locations. Our preliminary experiments indicate that, compared to the grid search method, PSO provides an efficient approach for tuning the optimal number of hidden layers and the number of neurons in each layer of the deep learning algorithm. Our experiments illustrate that the exploration of the configuration landscape required to find the optimal parameters is reduced by 77%-85%. In fact, PSO yields even better accuracy results.
KW - Deep machine learning
KW - Parameter optimization
KW - Particle swarm optimization
KW - Smart building services
UR - http://www.scopus.com/inward/record.url?scp=85027837223&partnerID=8YFLogxK
U2 - 10.1109/IWCMC.2017.7986470
DO - 10.1109/IWCMC.2017.7986470
M3 - Conference contribution
AN - SCOPUS:85027837223
T3 - 2017 13th International Wireless Communications and Mobile Computing Conference, IWCMC 2017
SP - 1285
EP - 1290
BT - 2017 13th International Wireless Communications and Mobile Computing Conference, IWCMC 2017
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th IEEE International Wireless Communications and Mobile Computing Conference, IWCMC 2017
Y2 - 26 June 2017 through 30 June 2017
ER -