Authors: Gölcük, İlker; Özsoydan, Fehmi Burçin; Durmaz, Esra Duygu
Date Accessioned: 2023-03-22
Date Available: 2023-03-22
Date Issued: 2023
ISSN: 0950-7051; 1872-7409
DOI: https://doi.org/10.1016/j.knosys.2023.110274
Handle: https://hdl.handle.net/20.500.14034/637
Abstract: This paper proposes an improved Arithmetic Optimization Algorithm (AOA) for training artificial neural networks (ANNs) under dynamic environments. Despite many successful applications of metaheuristic training of ANNs, these studies assume static environments, which may be unrealistic for real-world nonstationary processes. In this study, the training of ANNs is modeled as a dynamic optimization problem, and the proposed AOA is used to optimize the connection weights and biases of the ANN in the presence of concept drift. The proposed method is designed for classification tasks. The performance of the proposed algorithm has been tested on twelve dynamic classification problems, with comparative analysis against state-of-the-art metaheuristic optimization algorithms. The significance of the differences among the compared algorithms has been verified using nonparametric statistical tests. The results show that the improved AOA outperforms the compared algorithms in training ANNs under dynamic environments. The findings demonstrate the potential of the improved AOA for dynamic data-driven applications.
Rights: (c) 2023 Elsevier B.V. All rights reserved.
Language: en
Access Rights: info:eu-repo/semantics/closedAccess
Keywords: Arithmetic optimization algorithm; Artificial neural networks; Concept drift; Dynamic optimization; Differential Evolution Algorithm; Design; Drift
Title: An improved arithmetic optimization algorithm for training feedforward neural networks under dynamic environments
Type: Article
DOI: 10.1016/j.knosys.2023.110274
Volume: 263
Issue: N/A
WOS ID: WOS:000925673200001
Scopus ID: 2-s2.0-85149730689
Quartile: Q1
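The abstract describes encoding an ANN's connection weights and biases as a vector and optimizing that vector with a metaheuristic for a classification task. A minimal illustrative sketch of this general idea is given below; it uses plain random-search hill climbing as a stand-in for the paper's improved AOA, and the toy network and data are assumptions for illustration only, not taken from the paper:

```python
import numpy as np

# Sketch: metaheuristic training of a feedforward ANN by treating the
# flattened weights and biases as the decision vector. Random search
# stands in for the paper's improved AOA; the 2-4-1 network and the toy
# data are hypothetical, chosen only to make the example self-contained.

rng = np.random.default_rng(0)

# Toy two-class data (illustrative, linearly separable)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def forward(params, X):
    """One-hidden-layer (2-4-1) network; params is a flat 17-dim vector."""
    W1 = params[:8].reshape(2, 4)
    b1 = params[8:12]
    W2 = params[12:16].reshape(4, 1)
    b2 = params[16]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return out.ravel()

def fitness(params):
    """Classification error rate on the current data (lower is better)."""
    pred = (forward(params, X) > 0.5).astype(int)
    return float(np.mean(pred != y))

# Search loop: perturb the incumbent weight vector and keep improvements.
# A population-based AOA would instead update a population of such
# vectors with its arithmetic-operator-based rules.
best = rng.normal(size=17)
best_fit = fitness(best)
for _ in range(500):
    cand = best + rng.normal(scale=0.5, size=17)
    f = fitness(cand)
    if f < best_fit:
        best, best_fit = cand, f

print(f"best error rate: {best_fit:.2f}")
```

Under concept drift, as studied in the paper, the data distribution (here `X`, `y`) would change over time and the optimizer would have to re-track good weight vectors rather than converge once.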