
Browsing by Author "Tor, Ali Hakan"

Now showing 1 - 3 of 3
  • Article
    Comparative assessment of smooth and non-smooth optimization solvers in HANSO software
    (Balikesir University, 2022) Tor, Ali Hakan; AGÜ, Faculty of Engineering, Department of Engineering Sciences
    The aim of this study is to compare the performance of the smooth and nonsmooth optimization solvers from HANSO (Hybrid Algorithm for Nonsmooth Optimization) software. The smooth optimization solver is an implementation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, and the nonsmooth optimization solver is the Hybrid Algorithm for Nonsmooth Optimization. More precisely, the nonsmooth optimization algorithm is a combination of the BFGS method and the Gradient Sampling Algorithm (GSA). We use a well-known collection of academic test problems for nonsmooth optimization containing both convex and nonconvex problems. The motivation for this research is the importance of a comparative assessment of smooth optimization methods for solving nonsmooth optimization problems. This assessment demonstrates how successful the BFGS method is at solving nonsmooth optimization problems in comparison with the nonsmooth optimization solver from HANSO. Performance profiles based on the number of iterations, the number of function evaluations, and the number of subgradient evaluations are used to compare the solvers. © 2021 Balikesir University. All rights reserved.
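HANSO itself is MATLAB software, so the snippet below is only a minimal Python sketch of the comparison idea: it applies a quasi-Newton (BFGS) solver to a MAXQ-style nonsmooth convex test function and reports the counts from which performance profiles are built. The test function, starting point, and use of SciPy are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative nonsmooth convex test problem (MAXQ-style): f(x) = max_i x_i^2.
# The minimum value is 0, attained at the origin; f is nondifferentiable
# wherever the maximizing index is not unique.
def maxq(x):
    return np.max(x ** 2)

x0 = np.array([1.0, -2.0, 3.0])

# Smooth solver: BFGS with finite-difference gradients. At kinks the
# finite-difference estimate is only a subgradient-like surrogate, which is
# exactly why comparing smooth and nonsmooth solvers on such problems matters.
res = minimize(maxq, x0, method="BFGS")

# The quantities a performance profile would record for this solver/problem pair:
print("final objective:", res.fun)
print("iterations:", res.nit, "function evaluations:", res.nfev)
```

A full comparison in the spirit of the paper would run both a plain BFGS solver and a BFGS + Gradient Sampling hybrid over the whole test collection and turn these counts into performance profiles.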
  • Article
    An introduction to non-smooth convex analysis via multiplicative derivative
    (Taylor & Francis, 2019) Tor, Ali Hakan; AGÜ, Faculty of Engineering, Department of Computer Engineering
    In this study, the *-directional derivative and *-subgradient are defined using the multiplicative derivative, making a new contribution to non-Newtonian calculus for use in non-smooth analysis. As with the directional derivative and subgradient used in non-smooth optimization theory, basic definitions and preliminary facts related to optimization theory are stated and proved, and the *-subgradient concept is illustrated with examples such as the absolute value and exponential functions. In addition, necessary and sufficient optimality conditions are obtained for convex problems.
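For context on the multiplicative derivative the abstract builds on: in multiplicative (non-Newtonian) calculus, the *-derivative of a positive function is defined as a limit of ratios raised to the power 1/h, and for differentiable positive f it equals exp(f'(x)/f(x)). The sketch below checks this numerically; the function name and step size are illustrative choices, not from the paper.

```python
import math

# Numerical *-derivative: f*(x) = lim_{h->0} (f(x+h) / f(x))^(1/h).
# For positive differentiable f this limit equals exp(f'(x) / f(x)).
def star_derivative(f, x, h=1e-6):
    return (f(x + h) / f(x)) ** (1.0 / h)

# Sanity check: for f(x) = exp(x), f'(x)/f(x) = 1, so f*(x) = e at every x.
print(star_derivative(math.exp, 2.0))  # ≈ e ≈ 2.71828
```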
  • Article
    A Modified Multiple Shooting Algorithm for Parameter Estimation in ODEs Using Adjoint Sensitivity Analysis
    (Elsevier, 2021) Aydogmus, Ozgur; Tor, Ali Hakan; 0000-0003-3193-5004; 0000-0002-9463-7197; AGÜ, Faculty of Computer Sciences
    To increase the predictive power of a model, one needs to estimate its unknown parameters. Almost all parameter estimation techniques for ordinary differential equation models suffer from either a small convergence region or an enormous computational cost. The method of multiple shooting sits between these two extremes. The computational cost of the algorithm is mostly due to the calculation of directional derivatives of the objective and constraint functions. Here we modify the multiple shooting algorithm to use the adjoint method to calculate these derivatives. In the literature, this method is known to be a more stable and computationally efficient way of computing gradients of scalar functions. A predator-prey system is used to demonstrate the performance of the method and to supply all the information necessary for a successful and efficient implementation. (C) 2020 Elsevier Inc. All rights reserved.