Tor, Ali Hakan

Name Variants: Ali Hakan TOR; Tor, Ali Hakan
Job Title: Assistant Professor (Dr. Öğr. Üyesi)
Email Address: hakan.tor@agu.edu.tr
Main Affiliation: 02.01 Engineering Sciences (Mühendislik Bilimleri)
Status: Current Staff

Sustainable Development Goals

SDG 15 - Life on Land: 1

Research Products

Documents: 8
Citations: 35
h-index: 3
Documents: 5
Citations: 27
Scholarly Output: 3
Articles: 3
Views / Downloads: 14 / 0
Supervised MSc Theses: 0
Supervised PhD Theses: 0
WoS Citation Count: 12
Scopus Citation Count: 13
WoS h-index: 2
Scopus h-index: 2
Patents: 0
Projects: 0
WoS Citations per Publication: 4.00
Scopus Citations per Publication: 4.33
Open Access Source: 3
Supervised Theses: 0


Journals (publication count)

Applied Mathematics and Computation: 1
International Journal of Optimization and Control - Theories & Applications (IJOCTA): 1
Journal of Taibah University for Science: 1


Scholarly Output Search Results

Now showing 1 - 3 of 3
  • Article
    Comparative Assessment of Smooth and Non-Smooth Optimization Solvers in Hanso Software
    (Ramazan Yaman, 2022) Tor, Ali Hakan
    The aim of this study is to compare the performance of the smooth and nonsmooth optimization solvers in the HANSO (Hybrid Algorithm for Non-Smooth Optimization) software. The smooth optimization solver is an implementation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, and the nonsmooth optimization solver is the Hybrid Algorithm for Nonsmooth Optimization; more precisely, the nonsmooth algorithm is a combination of BFGS and the Gradient Sampling Algorithm (GSA). We use a well-known collection of academic test problems for nonsmooth optimization containing both convex and nonconvex problems. The motivation for this research is the importance of comparatively assessing smooth optimization methods for solving nonsmooth optimization problems: the assessment demonstrates how successful the BFGS method is at solving nonsmooth optimization problems in comparison with the nonsmooth optimization solver from HANSO. Performance profiles based on the number of iterations, the number of function evaluations, and the number of subgradient evaluations are used to compare the solvers. (A sketch of how such performance profiles are computed follows the publication list.)
  • Article
    Citation - WoS: 10
    Citation - Scopus: 11
    A Modified Multiple Shooting Algorithm for Parameter Estimation in ODEs Using Adjoint Sensitivity Analysis
    (Elsevier Science Inc., 2021) Aydogmus, Ozgur; Tor, Ali Hakan
    To increase the predictive power of a model, one needs to estimate its unknown parameters. Almost all parameter estimation techniques for ordinary differential equation models suffer from either a small convergence region or an enormous computational cost. The method of multiple shooting sits between these two extremes. The computational cost of the algorithm is mostly due to the calculation of directional derivatives of the objective and constraint functions. Here we modify the multiple shooting algorithm to use the adjoint method in calculating these derivatives; in the literature, this method is known to be a more stable and computationally efficient way of computing gradients of scalar functions. A predator-prey system is used to show the performance of the method and to supply all the information necessary for a successful and efficient implementation. (C) 2020 Elsevier Inc. All rights reserved. (A toy multiple-shooting setup for a predator-prey system is sketched after the publication list.)
  • Article
    Citation - WoS: 2
    Citation - Scopus: 2
    An Introduction to Non-Smooth Convex Analysis via Multiplicative Derivative
    (Taylor & Francis Ltd, 2019) Tor, Ali Hakan
    In this study, *-directional derivative and *-subgradient are defined using the multiplicative derivative, making a new contribution to non-Newtonian calculus for use in non-smooth analysis. As for directional derivative and subgradient, which are used in the non-smooth optimization theory, basic definitions and preliminary facts related to optimization theory are stated and proved, and the *-subgradient concept is illustrated by providing some examples, such as absolute value and exponential functions. In addition, necessary and sufficient optimality conditions are obtained for convex problems.