Research Article

Training a Multilayer Perceptron with an Improved Atom Search Optimization Algorithm

Year 2021, Vol. 11, Issue 21, pp. 71–79, 30.06.2021

Abstract

This article presents a new hybrid algorithm, named iASO, developed by hybridizing the atom search optimization (ASO) and simulated annealing (SA) algorithms. The SA technique strengthens the search capability of the ASO algorithm. To assess the proposed hybrid algorithm's ability to optimize nonlinear systems, it was used as a trainer for a multilayer perceptron (MLP). Several datasets (Iris, Balloon, XOR, Breast Cancer, and Heart) were used, and the obtained results were compared with other MLP trainers built on competitive algorithms such as the original ASO, the sine cosine algorithm (SCA), particle swarm optimization (PSO), ant colony optimization (ACO), and grey wolf optimization (GWO). The results showed that the proposed approach achieved a lower mean and standard deviation of the mean squared error (MSE), clearly demonstrating its better performance.

References

  • P. Bansal, S. Kumar, S. Pasrija, and S. Singh, “A hybrid grasshopper and new cat swarm optimization algorithm for feature selection and optimization of multi-layer perceptron,” Soft Comput., pp. 1–27, 2020, doi: 10.1007/s00500-020-04877-w.
  • S. Haykin, Neural networks: A comprehensive foundation, 2nd ed. Prentice Hall PTR, 1999.
  • S. Gupta and K. Deep, “A novel hybrid sine cosine algorithm for global optimization and its application to train multilayer perceptrons,” Appl. Intell., vol. 50, no. 4, pp. 993–1026, 2020, doi: 10.1007/s10489-019-01570-w.
  • A. A. Suratgar, M. B. Tavakoli, and A. Hoseinabadi, “Modified Levenberg-Marquardt method for neural networks training,” World Acad Sci Eng Technol, vol. 6, no. 1, pp. 46–48, 2005.
  • E. Eker, M. Kayri, S. Ekinci, and D. Izci, “Training Multi-Layer Perceptron Using Harris Hawks Optimization,” in 2020 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), 2020, pp. 1–5, doi: 10.1109/HORA49412.2020.9152874.
  • H. Faris, S. Mirjalili, and I. Aljarah, “Automatic selection of hidden neurons and weights in neural networks using grey wolf optimizer based on a hybrid encoding scheme,” Int. J. Mach. Learn. Cybern., vol. 10, no. 10, pp. 2901–2920, 2019, doi: 10.1007/s13042-018-00913-2.
  • A. A. Heidari, H. Faris, S. Mirjalili, I. Aljarah, and M. Mafarja, “Ant lion optimizer: Theory, literature review, and application in multi-layer perceptron neural networks,” in Studies in Computational Intelligence, vol. 811, S. Mirjalili, J. Song Dong, and A. Lewis, Eds. Cham: Springer International Publishing, 2020, pp. 23–46.
  • M. Khishe and M. R. Mosavi, “Classification of underwater acoustical dataset using neural network trained by Chimp Optimization Algorithm,” Appl. Acoust., vol. 157, p. 107005, 2020, doi: 10.1016/j.apacoust.2019.107005.
  • A. A. Heidari, H. Faris, I. Aljarah, and S. Mirjalili, “An efficient hybrid multilayer perceptron neural network with grasshopper optimization,” Soft Comput., vol. 23, no. 17, pp. 7941–7958, 2019, doi: 10.1007/s00500-018-3424-2.
  • M. Khishe and H. Mohammadi, “Passive sonar target classification using multi-layer perceptron trained by salp swarm algorithm,” Ocean Eng., vol. 181, pp. 98–108, 2019, doi: 10.1016/j.oceaneng.2019.04.013.
  • D. Bairathi and D. Gopalani, “Numerical optimization and feed-forward neural networks training using an improved optimization algorithm: multiple leader salp swarm algorithm,” Evol. Intell., pp. 1–17, 2019, doi: 10.1007/s12065-019-00269-8.
  • Y. Yin, Q. Tu, and X. Chen, “Enhanced Salp Swarm Algorithm based on random walk and its application to training feedforward neural networks,” Soft Comput., vol. 24, no. 19, pp. 14791–14807, 2020, doi: 10.1007/s00500-020-04832-9.
  • R. García-Ródenas, L. J. Linares, and J. A. López-Gómez, “Memetic algorithms for training feedforward neural networks: an approach based on gravitational search algorithm,” Neural Comput. Appl., 2020, doi: 10.1007/s00521-020-05131-y.
  • S. Mirjalili and A. S. Sadiq, “Magnetic Optimization Algorithm for training Multi Layer Perceptron,” in 2011 IEEE 3rd International Conference on Communication Software and Networks, 2011, pp. 42–46, doi: 10.1109/ICCSN.2011.6014845.
  • B. Turkoglu and E. Kaya, “Training multi-layer perceptron with artificial algae algorithm,” Eng. Sci. Technol. an Int. J., 2020, doi: 10.1016/j.jestch.2020.07.001.
  • A. C. Cinar, “Training Feed-Forward Multi-Layer Perceptron Artificial Neural Networks with a Tree-Seed Algorithm,” Arab. J. Sci. Eng., 2020, doi: 10.1007/s13369-020-04872-1.
  • W. Zhao, L. Wang, and Z. Zhang, “Atom search optimization and its application to solve a hydrogeologic parameter estimation problem,” Knowledge-Based Syst., vol. 163, pp. 283–304, 2019, doi: 10.1016/j.knosys.2018.08.030.
  • P. Sun, Y. Zhang, J. Liu, and J. Bi, “An Improved Atom Search Optimization with Cellular Automata, a Lévy Flight and an Adaptive Weight Strategy,” IEEE Access, vol. 8, pp. 49137–49159, 2020, doi: 10.1109/ACCESS.2020.2979921.
  • P. Sun, H. Liu, Y. Zhang, L. Tu, and Q. Meng, “An intensify atom search optimization for engineering design problems,” Appl. Math. Model., vol. 89, pp. 837–859, 2021, doi: 10.1016/j.apm.2020.07.052.
  • B. Hekimoğlu, “Optimal Tuning of Fractional Order PID Controller for DC Motor Speed Control via Chaotic Atom Search Optimization Algorithm,” IEEE Access, vol. 7, pp. 38100–38114, 2019, doi: 10.1109/ACCESS.2019.2905961.
  • S. Ekinci, A. Demiroren, H. Zeynelgil, and B. Hekimoğlu, “An opposition-based atom search optimization algorithm for automatic voltage regulator system,” J. Fac. Eng. Archit. Gazi Univ., vol. 35, pp. 1141–1158, Apr. 2020, doi: 10.17341/gazimmfd.598576.
  • X. Pan, L. Xue, Y. Lu, and N. Sun, “Hybrid particle swarm optimization with simulated annealing,” Multimed. Tools Appl., vol. 78, no. 21, pp. 29921–29936, 2019, doi: 10.1007/s11042-018-6602-4.
  • F. Javidrad and M. Nazari, “A new hybrid particle swarm and simulated annealing stochastic optimization method,” Appl. Soft Comput. J., vol. 60, pp. 634–654, 2017, doi: 10.1016/j.asoc.2017.07.023.
  • C. L. Blake and C. J. Merz, “UCI Repository of machine learning databases,” University of California, 1998. http://archive.ics.uci.edu/ml/.
  • W. Zhao, L. Wang, and Z. Zhang, “A novel atom search optimization for dispersion coefficient estimation in groundwater,” Futur. Gener. Comput. Syst., vol. 91, pp. 601–610, 2019, doi: 10.1016/j.future.2018.05.037.
  • S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, “Optimization by simulated annealing,” Science, vol. 220, no. 4598, pp. 671–680, 1983, doi: 10.1126/science.220.4598.671.
  • B. Hekimoğlu and S. Ekinci, “Optimally designed PID controller for a DC-DC buck converter via a hybrid whale optimization algorithm with simulated annealing,” Electrica, vol. 20, no. 1, pp. 19–27, 2020, doi: 10.5152/ELECTRICA.2020.19034.
  • T. Şengüler, E. Karatoprak, and S. Şeker, “A new MLP approach for the detection of the incipient bearing damage,” Adv. Electr. Comput. Eng., vol. 10, no. 3, pp. 34–39, 2010, doi: 10.4316/aece.2010.03006.
  • J. R. Zhang, J. Zhang, T. M. Lok, and M. R. Lyu, “A hybrid particle swarm optimization-back-propagation algorithm for feedforward neural network training,” Appl. Math. Comput., vol. 185, no. 2, pp. 1026–1037, 2007, doi: 10.1016/j.amc.2006.07.025.
  • S. Mirjalili, “How effective is the Grey Wolf optimizer in training multi-layer perceptrons,” Appl. Intell., vol. 43, no. 1, pp. 150–161, 2015, doi: 10.1007/s10489-014-0645-7.

A Novel Improved Atom Search Optimization Algorithm for Training Multilayer Perceptron


Abstract

A novel hybrid algorithm developed by merging the atom search optimization (ASO) and simulated annealing (SA) algorithms is presented. The SA algorithm strengthens the search capability of ASO. The proposed hybrid algorithm, named iASO, was used to train a multilayer perceptron (MLP) in order to observe its ability to optimize nonlinear systems. Several datasets (Iris, Balloon, XOR, Breast Cancer, and Heart) were used, and the obtained results were compared with recent competitive algorithms such as the original ASO, the sine cosine algorithm (SCA), particle swarm optimization (PSO), ant colony optimization (ACO), and grey wolf optimization (GWO). The results clearly indicated better performance of the proposed algorithm, as it achieved a lower average and standard deviation of the mean squared error (MSE).
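The general scheme the abstract describes — a population-based search whose best solution is refined by SA acceptance while minimizing an MLP's MSE over a flattened weight vector — can be sketched as follows. This is an illustrative sketch only, not the paper's iASO: a simple Gaussian-perturbation population stands in for ASO's physics-inspired atom dynamics, and the network size (a 2-4-1 sigmoid MLP on XOR), step sizes, and cooling schedule are assumptions chosen for demonstration.

```python
# Sketch: SA-refined population search training a tiny MLP on XOR.
import math
import random

random.seed(1)

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
H = 4                        # hidden neurons
DIM = 2 * H + H + H + 1      # input->hidden weights, hidden biases, hidden->out weights, out bias


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


def mse(w):
    """Mean squared error of a 2-H-1 sigmoid MLP whose weights are the flat vector w."""
    err = 0.0
    for (x1, x2), t in XOR:
        h = [sigmoid(w[2 * j] * x1 + w[2 * j + 1] * x2 + w[2 * H + j]) for j in range(H)]
        y = sigmoid(sum(w[3 * H + j] * h[j] for j in range(H)) + w[4 * H])
        err += (y - t) ** 2
    return err / len(XOR)


def sa_refine(w, fit, T0=1.0, steps=30):
    """SA local refinement: perturb one weight, accept worse moves with
    probability exp(-delta/T) as the temperature T cools to zero."""
    best_w, best_f = list(w), fit
    cur_w, cur_f = list(w), fit
    for k in range(steps):
        T = T0 * (1 - k / steps) + 1e-9
        cand = list(cur_w)
        cand[random.randrange(DIM)] += random.gauss(0, 0.5)
        f = mse(cand)
        if f < cur_f or random.random() < math.exp((cur_f - f) / T):
            cur_w, cur_f = cand, f
            if f < best_f:
                best_w, best_f = cand, f
    return best_w, best_f


# Population search standing in for ASO's atom population; the SA step
# refines the global best each iteration, mirroring the hybrid idea.
pop = [[random.uniform(-2, 2) for _ in range(DIM)] for _ in range(20)]
best = min(pop, key=mse)
best_f = mse(best)
for it in range(100):
    pop = [[b + random.gauss(0, 0.3) for b in best] for _ in range(20)]
    cand = min(pop, key=mse)
    if mse(cand) < best_f:
        best, best_f = cand, mse(cand)
    best, best_f = sa_refine(best, best_f)

print(f"final MSE: {best_f:.4f}")
```

Because the returned best-so-far is tracked separately from the SA walker, the reported MSE is monotonically non-increasing across iterations even though the SA step may temporarily accept worse weights.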

There are 30 references in total.

Details

Primary Language Turkish
Subjects Engineering
Section Academic and/or technological scientific article
Authors

Davut İzci

Publication Date 30 June 2021
Submission Date 18 December 2020
Published in Issue Year 2021, Vol. 11, Issue 21

How to Cite

APA İzci, D. (2021). Geliştirilmiş Atom Arama Optimizasyon Algoritması ile Çok Katmanlı Algılayıcı Eğitimi. EMO Bilimsel Dergi, 11(21), 71-79.
AMA İzci D. Geliştirilmiş Atom Arama Optimizasyon Algoritması ile Çok Katmanlı Algılayıcı Eğitimi. EMO Bilimsel Dergi. Haziran 2021;11(21):71-79.
Chicago İzci, Davut. “Geliştirilmiş Atom Arama Optimizasyon Algoritması Ile Çok Katmanlı Algılayıcı Eğitimi”. EMO Bilimsel Dergi 11, sy. 21 (Haziran 2021): 71-79.
EndNote İzci D (01 Haziran 2021) Geliştirilmiş Atom Arama Optimizasyon Algoritması ile Çok Katmanlı Algılayıcı Eğitimi. EMO Bilimsel Dergi 11 21 71–79.
IEEE D. İzci, “Geliştirilmiş Atom Arama Optimizasyon Algoritması ile Çok Katmanlı Algılayıcı Eğitimi”, EMO Bilimsel Dergi, c. 11, sy. 21, ss. 71–79, 2021.
ISNAD İzci, Davut. “Geliştirilmiş Atom Arama Optimizasyon Algoritması Ile Çok Katmanlı Algılayıcı Eğitimi”. EMO Bilimsel Dergi 11/21 (Haziran 2021), 71-79.
JAMA İzci D. Geliştirilmiş Atom Arama Optimizasyon Algoritması ile Çok Katmanlı Algılayıcı Eğitimi. EMO Bilimsel Dergi. 2021;11:71–79.
MLA İzci, Davut. “Geliştirilmiş Atom Arama Optimizasyon Algoritması Ile Çok Katmanlı Algılayıcı Eğitimi”. EMO Bilimsel Dergi, c. 11, sy. 21, 2021, ss. 71-79.
Vancouver İzci D. Geliştirilmiş Atom Arama Optimizasyon Algoritması ile Çok Katmanlı Algılayıcı Eğitimi. EMO Bilimsel Dergi. 2021;11(21):71-9.

EMO BİLİMSEL DERGİ
Peer-Reviewed Scientific Journal of Electrical, Electronics, Computer, Biomedical, and Control Engineering
TMMOB ELEKTRİK MÜHENDİSLERİ ODASI (Chamber of Electrical Engineers)
IHLAMUR SOKAK NO:10 KIZILAY/ANKARA
TEL: +90 (312) 425 32 72 (PBX) - FAX: +90 (312) 417 38 18
bilimseldergi@emo.org.tr