Computer Engineering Department Collection

Permanent URI for this collection: https://hdl.handle.net/20.500.12573/203


Recent Submissions

Now showing 1 - 20 of 305
  • Article
    An effective colorectal polyp classification for histopathological images based on supervised contrastive learning
    (Elsevier, 2024) Yengec-Tasdemir,Sena Busra; Aydin,Zafer; Akay,Ebru; Doğan,Serkan; Yilmaz,Bulent; 0000-0001-7686-6298; AGÜ, Fen Bilimleri Enstitüsü, Elektrik ve Bilgisayar Mühendisliği Ana Bilim Dalı; Aydın, Zafer; Yilmaz, Bulent
    Early detection of colon adenomatous polyps is pivotal in reducing colon cancer risk. In this context, accurately distinguishing adenomatous polyp subtypes, especially tubular and tubulovillous, from hyperplastic variants is crucial. This study introduces a cutting-edge computer-aided diagnosis system optimized for this task. Our system employs advanced supervised contrastive learning to ensure precise classification of colon histopathology images. Significantly, we have integrated the Big Transfer model, which has gained prominence for its exemplary adaptability to visual tasks in medical imaging. Our novel approach discerns between in-class and out-of-class images, thereby elevating its discriminatory power for polyp subtypes. We validated our system using two datasets: a specially curated one and the publicly accessible UniToPatho dataset. The results reveal that our model markedly surpasses traditional deep convolutional neural networks, registering classification accuracies of 87.1% and 70.3% for the custom and UniToPatho datasets, respectively. Such results emphasize the transformative potential of our model in polyp classification endeavors.
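    A minimal numerical sketch of the supervised contrastive idea referenced above: embeddings of same-class samples are pulled together against all other samples in the batch. The toy embeddings, labels, and temperature are illustrative assumptions, not the paper's implementation.

      import numpy as np

      def supcon_loss(embeddings, labels, temperature=0.1):
          """Supervised contrastive loss over L2-normalized embeddings (one view per image)."""
          z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
          sim = z @ z.T / temperature                      # pairwise scaled cosine similarities
          n = len(labels)
          mask_self = np.eye(n, dtype=bool)
          sim_shift = sim - sim.max(axis=1, keepdims=True) # numerical stability
          sim_exp = np.exp(sim_shift)
          sim_exp[mask_self] = 0.0                         # exclude the anchor itself
          log_prob = sim_shift - np.log(sim_exp.sum(axis=1, keepdims=True))
          loss, count = 0.0, 0
          for i in range(n):
              positives = (labels == labels[i]) & ~mask_self[i]   # same-class anchors pull together
              if positives.any():
                  loss += -log_prob[i, positives].mean()
                  count += 1
          return loss / max(count, 1)

      rng = np.random.default_rng(0)
      emb = rng.normal(size=(8, 16))                       # e.g. 8 patch embeddings (toy)
      lab = np.array([0, 0, 1, 1, 2, 2, 0, 1])             # polyp subtype labels (toy)
      print(supcon_loss(emb, lab))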
  • Article
    Integrating Biological Domain Knowledge with Machine Learning for Identifying Colorectal-Cancer-Associated Microbial Enzymes in Metagenomic Data
    (MDPI, 2025) Bakir-Gungor, Burcu; Ersoz, Nur Sebnem; Yousef, Malik; 0000-0002-2272-6270; AGÜ, Yaşam ve Doğa Bilimleri Fakültesi, Moleküler Biyoloji ve Genetik Bölümü; Bakir-Gungor, Burcu; Ersoz, Nur Sebnem
    Advances in metagenomics have revolutionized our ability to elucidate links between the microbiome and human diseases. Colorectal cancer (CRC), a leading cause of cancer-related mortality worldwide, has been associated with dysbiosis of the gut microbiome. This study aims to develop a method for identifying CRC-associated microbial enzymes by incorporating biological domain knowledge into the feature selection process. Conventional feature selection techniques often evaluate features individually and fail to leverage biological knowledge during metagenomic data analysis. To address this gap, we propose the enzyme commission (EC)-nomenclature-based Grouping-Scoring-Modeling (G-S-M) method, which integrates biological domain knowledge into feature grouping and selection. The proposed method was tested on a CRC-associated metagenomic dataset collected from eight different countries. Community-level relative abundance values of enzymes were considered as features and grouped based on their EC categories to provide biologically informed groupings. Our findings in randomized 10-fold cross-validation experiments imply that glycosidases, CoA-transferases, hydro-lyases, oligo-1,6-glucosidase, crotonobetainyl-CoA hydratase, and citrate CoA-transferase enzymes can be associated with CRC development as part of different molecular pathways. These enzymes are mostly synthesized by Escherichia coli, Salmonella enterica, Klebsiella pneumoniae, Staphylococcus aureus, Streptococcus pneumoniae, and Clostridioides difficile. Comparative evaluation experiments showed that the proposed model consistently outperforms traditional feature selection methods paired with various classifiers.
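    An illustrative sketch of the grouping-scoring idea described above: enzyme abundance features are grouped by their EC category and each group is scored as a unit. The EC numbers, abundance table, and scoring rule below are made-up assumptions; the paper's G-S-M implementation may differ.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      # toy relative-abundance table: samples x enzymes (columns named by EC number)
      X = pd.DataFrame(rng.random((40, 4)),
                       columns=["EC:3.2.1.10", "EC:3.2.1.20", "EC:2.8.3.8", "EC:4.2.1.-"])
      y = rng.integers(0, 2, size=40)               # 0 = control, 1 = CRC (toy labels)

      # Grouping: map each enzyme feature to its EC sub-class (first two fields)
      groups = X.columns.to_series().str.extract(r"EC:(\d+\.\d+)")[0]

      # Scoring: cross-validated accuracy of a small model trained on each group alone
      scores = {}
      for g, cols in X.columns.to_series().groupby(groups):
          scores[g] = cross_val_score(LogisticRegression(max_iter=1000),
                                      X[cols], y, cv=5).mean()
      print(sorted(scores.items(), key=lambda kv: -kv[1]))   # highest-scoring EC groups first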
  • Article
    CompreCity: Accelerating the Traveling Salesman Problem on GPU with data compression
    (ELSEVIER, 2025) Yalcin, Salih; Usul, Hamdi Burak; Yalcin, Gulay; 0000-0001-6476-4542; 0000-0003-3929-8126; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Yalcin, Salih; Yalcin, Gulay
    The Traveling Salesman Problem (TSP), which seeks the shortest route for a salesman who must visit a set of cities, is one of the significant problems in computer science and arises in many computing applications such as networking, genome analysis, and logistics. Using parallel execution paradigms, especially GPUs, is appealing in order to reduce the problem-solving time of TSP. One of the main issues with GPUs is their limited memory, which may not be enough to hold the entire data; transferring data from the host device therefore reduces execution-time performance. In this study, we apply three data compression methodologies to represent the cities in the TSP: (1) using the greatest common divisor, (2) shifting cities to the origin, and (3) splitting the surface into grids. As a result, we fit more cities into GPU memory and reduce the number of data transfers from the host device. We implement our methodology in the Iterated Local Search (ILS) algorithm with 2-opt and the Lin-Kernighan-Helsgaun (LKH) algorithm. We show that our implementation provides more than 25% performance improvement for both algorithms.
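    A small sketch of two of the coordinate-compression ideas named above (shift to the origin, divide by the greatest common divisor); grid splitting and the GPU/ILS/LKH parts are omitted, and the city coordinates are made up.

      from math import gcd
      from functools import reduce

      cities = [(1200, 2400), (1260, 2460), (1320, 2400), (1200, 2520)]

      # 1) Shift cities to the origin: store small offsets instead of large absolute coordinates
      min_x = min(x for x, _ in cities)
      min_y = min(y for _, y in cities)
      shifted = [(x - min_x, y - min_y) for x, y in cities]

      # 2) Divide by the GCD of all offsets so each coordinate fits in fewer bits
      g = reduce(gcd, [v for xy in shifted for v in xy if v] or [1])
      compressed = [(x // g, y // g) for x, y in shifted]

      print(shifted)        # [(0, 0), (60, 60), (120, 0), (0, 120)]
      print(g, compressed)  # 60 [(0, 0), (1, 1), (2, 0), (0, 2)]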
  • Article
    Network intrusion detection based on machine learning strategies: performance comparisons on imbalanced wired, wireless, and software-defined networking (SDN) network traffics
    (TÜBİTAK Academic Journals, 2024) Hacilar, Hilal; Aydin, Zafer; Gungor, Vehbi Cagri; 0000-0025-8116-722; 0000-0001-7686-6298; 0000-0003-0803-8372; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Hacilar, Hilal; Aydin, Zafer; Gungor, Vehbi Cagri
    The rapid growth of computer networks emphasizes the urgency of addressing security issues. Organizations rely on network intrusion detection systems (NIDSs) to protect sensitive data from unauthorized access and theft. These systems analyze network traffic to detect suspicious activities, such as attempted breaches or cyberattacks. However, existing studies lack a thorough assessment of class imbalances and classification performance for different types of network intrusions: wired, wireless, and software-defined networking (SDN). This research aims to fill this gap by examining these networks' imbalances, feature selection, and binary classification to enhance intrusion detection system efficiency. Various techniques such as SMOTE, ROS, ADASYN, and SMOTETomek are used to handle imbalanced datasets. Additionally, eXtreme Gradient Boosting (XGBoost) identifies key features, and an autoencoder (AE) assists in feature extraction for the classification task. The study evaluates datasets such as AWID, UNSW, and InSDN, yielding the best results with different numbers of selected features. Bayesian optimization fine-tunes parameters, and diverse machine learning algorithms (SVM, kNN, XGBoost, random forest, ensemble classifiers, and autoencoders) are employed. The optimal results, considering F1-measure, overall accuracy, detection rate, and false alarm rate, have been achieved for the UNSW-NB15, preprocessed AWID, and InSDN datasets, with values of [0.9356, 0.9289, 0.9328, 0.07597], [0.997, 0.9995, 0.9999, 0.0171], and [0.9998, 0.9996, 0.9998, 0.0012], respectively. These findings demonstrate that combining Bayesian optimization with oversampling techniques significantly enhances classification performance across wired, wireless, and SDN networks when compared to previous research conducted on these datasets.
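    A hedged sketch of the kind of pipeline the abstract describes: oversample the minority class with SMOTE, rank features with XGBoost, then classify on the top-ranked subset. The synthetic dataset, feature counts, and model settings are assumptions, not the paper's configuration.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from imblearn.over_sampling import SMOTE
      from xgboost import XGBClassifier

      X, y = make_classification(n_samples=2000, n_features=30, weights=[0.95, 0.05],
                                 random_state=0)                 # imbalanced toy traffic data
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # balance the classes

      ranker = XGBClassifier(n_estimators=100, eval_metric="logloss").fit(X_bal, y_bal)
      top = np.argsort(ranker.feature_importances_)[::-1][:10]        # keep the 10 best features

      clf = XGBClassifier(n_estimators=200, eval_metric="logloss").fit(X_bal[:, top], y_bal)
      print("test accuracy:", clf.score(X_te[:, top], y_te))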
  • Article
    CSA-DE-LR: enhancing cardiovascular disease diagnosis with a novel hybrid machine learning approach
    (PEERJ INC, 2024) Dedeturk, Beyhan Adanur; Dedeturk, Bilge Kagan; Bakir-Gungor, Burcu; 0000-0003-4983-2417; 0000-0002-8026-5003; 0000-0002-2272-6270; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Beyhan Adanur, Dedeturk; Bakir-Gungor, Burcu
    Cardiovascular diseases (CVD) are a leading cause of mortality globally, necessitating the development of efficient diagnostic tools. Machine learning (ML) and metaheuristic algorithms have become prevalent in addressing these challenges, providing promising solutions in medical diagnostics. However, traditional ML approaches often fall short in feature selection and optimization, leading to suboptimal performance in complex diagnostic tasks. To overcome these limitations, this study introduces a new hybrid method called CSA-DE-LR, which combines the clonal selection algorithm (CSA) and differential evolution (DE) with logistic regression. This integration is designed to optimize logistic regression weights efficiently for the accurate classification of CVD. The methodology employs three optimization strategies based on the F1 score, the Matthews correlation coefficient (MCC), and the mean absolute error (MAE). Extensive evaluations on benchmark datasets, namely Cleveland and Statlog, reveal that CSA-DE-LR outperforms state-of-the-art ML methods. In addition, generalization is evaluated using the Breast Cancer Wisconsin Original (WBCO) and Breast Cancer Wisconsin Diagnostic (WBCD) datasets. Significantly, the proposed model demonstrates superior efficacy compared to previous research studies in this domain. This study's findings highlight the potential of hybrid machine learning approaches for improving diagnostic accuracy, offering a significant advancement in the fields of medical data analysis and CVD diagnosis.
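    A minimal sketch of the underlying idea of tuning logistic-regression weights with a metaheuristic against an F1-score objective. SciPy's differential evolution stands in for the paper's CSA+DE hybrid, and the dataset, bounds, and iteration budget are assumptions.

      import numpy as np
      from scipy.optimize import differential_evolution
      from sklearn.datasets import load_breast_cancer
      from sklearn.metrics import f1_score
      from sklearn.preprocessing import StandardScaler

      X, y = load_breast_cancer(return_X_y=True)
      X = StandardScaler().fit_transform(X)
      X = np.hstack([X, np.ones((len(X), 1))])          # append a bias term

      def neg_f1(w):
          z = np.clip(X @ w, -30, 30)                   # logistic model with clipped logits
          p = 1.0 / (1.0 + np.exp(-z))
          return -f1_score(y, (p >= 0.5).astype(int), zero_division=0)

      bounds = [(-3, 3)] * X.shape[1]                   # search box for each weight
      result = differential_evolution(neg_f1, bounds, maxiter=50, seed=0, polish=False)
      print("best F1 found:", -result.fun)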
  • Article
    Can artificial intelligence algorithms recognize knee arthroplasty implants from X-ray radiographs?
    (MediHealth Academy, 2023) Gölgelioğlu, Fatih; Aşkın, Aydoğan; Gündoğdu, Mehmet Cihat; Uzun, Mehmet Fatih; Dedeturk, Bilge Kagan; Yalın, Mustafa; 0000-0002-8026-5003; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Dedetürk, Bilge Kağan
    Aims: This study aimed to investigate the use of a convolutional neural network (CNN) deep learning approach to accurately identify total knee arthroplasty (TKA) implants from X-ray radiographs. Methods: This retrospective study employed a deep learning CNN system to analyze pre-revision and post-operative knee X-rays from TKA patients. We excluded cases involving unicondylar and revision knee replacements, as well as low-quality or unavailable X-ray images and those with other implants. Ten cruciate-retaining TKA replacement models were assessed from various manufacturers. The training set comprised 69% of the data, with the remaining 31% in the test set, augmented due to limited images. Evaluation metrics included accuracy and F1 score, and we developed the software in Python using the TensorFlow library for the CNN method. A computer scientist with AI expertise managed data processing and testing, calculating specificity, sensitivity, and accuracy to assess CNN performance. Results: In this study, a total of 282 AP and lateral X-rays from 141 patients were examined, encompassing 10 distinct knee prosthesis models from various manufacturers, each with varying X-ray counts. The CNN technique exhibited flawless accuracy, achieving a 100% identification rate for both the manufacturer and model of TKA across all 10 different models. Furthermore, the CNN method demonstrated exceptional specificity and sensitivity, consistently reaching 100% for each individual implant model. Conclusion: This study underscores the impressive capacity of deep learning AI algorithms to precisely identify knee arthroplasty implants from X-ray radiographs. It highlights AI’s ability to detect subtle changes imperceptible to humans, execute precise computations, and handle extensive data. The accurate recognition of knee replacement implants using AI algorithms prior to revision surgeries promises to enhance procedure efficiency and outcomes.
  • Article
    Machine Learning based Network Intrusion Detection with Hybrid Frequent Item Set Mining
    (GAZİ ÜNİVERSİTESİ, 2024) Fırat, Murat; Bakal, Gokhan; Akbas, Ayhan; 0000-0003-2897-3894; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Bakal, Gokhan
    With the development and expansion of computer networks day by day and the diversity of software developed, the damage that possible attacks can cause is increasing beyond predictions. Intrusion Detection Systems (IDS) are one of the practical defense tools against these potential attacks, which are constantly growing and diversifying. Thus, one of the emerging methods among researchers is to train these systems with various artificial intelligence methods to detect subsequent attacks in real time and take the necessary precautions. The ultimate goal of this study is to propose a hybrid feature selection approach to improve classification performance. The raw dataset originally contained 85 descriptive features (attributes) for classification. These attributes were extracted with CICFlowMeter from a PCAP file in which network traffic was recorded for data curation. In this study, classical feature selection methods and frequent item set mining approaches were employed in feature selection for constructing a hybrid model. We aimed to examine the effect of the proposed hybrid feature selection approach on the classification task for network traffic data containing ordinary and attack records. The outcomes demonstrate that the proposed method gained nearly a 3% improvement when applied with the Logistic Regression algorithm in classifying more than 225,000 records.
  • Article
    Review of feature selection approaches based on grouping of features
    (PeerJ, 2023) Kuzudisli,Cihan; Gungor-Bakır, Burcu; Bulut, Nurten; Qaqish, Behjat; Yousef, Malik; 0000-0002-1895-8749; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Gungor-Bakır, Burcu; Bulut, Nurten
    With the rapid development in technology, large amounts of high-dimensional data have been generated. This high dimensionality including redundancy and irrelevancy poses a great challenge in data analysis and decision making. Feature selection (FS) is an effective way to reduce dimensionality by eliminating redundant and irrelevant data. Most traditional FS approaches score and rank each feature individually and then perform FS either by eliminating lower ranked features or by retaining highly ranked features. In this review, we discuss an emerging approach to FS that is based on initially grouping features, then scoring groups of features rather than scoring individual features. Despite the presence of reviews on clustering and FS algorithms, to the best of our knowledge, this is the first review focusing on FS techniques based on grouping. The typical idea behind FS through grouping is to generate groups of similar features with dissimilarity between groups, then select representative features from each cluster. Approaches under supervised, unsupervised, semi-supervised and integrative frameworks are explored. The comparison of experimental results indicates the effectiveness of sequential, optimization-based (i.e., fuzzy or evolutionary), hybrid and multi-method approaches. When it comes to biological data, the involvement of external biological sources can improve analysis results. We hope this work's findings can guide effective design of new FS approaches using feature grouping.
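    A toy sketch of the "group, then pick representatives" idea the review surveys: features are clustered by correlation and one feature is kept per cluster. The data, distance threshold, and the variance-based choice of representative are illustrative assumptions.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 12))
      X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=200)    # make some features redundant
      X[:, 5] = X[:, 4] + 0.05 * rng.normal(size=200)

      corr = np.corrcoef(X, rowvar=False)
      dist = 1.0 - np.abs(corr)                          # dissimilarity between features
      np.fill_diagonal(dist, 0.0)
      Z = linkage(squareform(dist, checks=False), method="average")
      clusters = fcluster(Z, t=0.3, criterion="distance")

      # representative = feature with the highest variance inside each cluster
      selected = [int(np.flatnonzero(clusters == c)[np.argmax(X[:, clusters == c].var(axis=0))])
                  for c in np.unique(clusters)]
      print("clusters:", clusters, "selected features:", sorted(selected))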
  • Conference Object
    Graph-based Biomedical Knowledge Discovery
    (IEEE, 2024) Altuner, Osman; Bakir-Gungor, Burcu; Bakal, Gokhan; 0000-0003-2897-3894; 0000-0002-2272-6270; AGÜ, Mühendislik Fakültesi, Elektrik - Elektronik Mühendisliği Bölümü; Altuner, Osman; Bakir-Gungor, Burcu; Bakal, Gökhan
    The digitalization process is advancing at a very high pace all over the world. While this brings many conveniences to daily life, it also creates the problem of analyzing and processing the enormous amount of digital data being generated. The same holds for published academic studies: evaluating every individual study in order to reach the novel information it contains is a highly laborious process. For this reason, in this study, publications retrieved for selected target diseases were analyzed with text-analysis pipelines and converted into a graph structure that links meaningful terms through biomedical relations. On the resulting dense graph, pairs of biomedical entities connected by important relations such as treats, causes, and associated_with were queried. The entity pairs returned by these queries were also confirmed by manual search and shown to be genuine relations. With this study, obtaining known biomedical entities through the proposed approach is intended to solve the time-consuming manual search problem. Moreover, by combining multiple pairwise relation patterns, the approach also has the potential to uncover unknown or undiscovered new relations (treats, causes, associated_with, etc.).
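    A small illustration of the kind of typed-edge query described above, using networkx and invented entity names; the paper's graph, extraction pipeline, and relation inventory are not reproduced here.

      import networkx as nx

      g = nx.MultiDiGraph()
      g.add_edge("metformin", "type 2 diabetes", relation="treats")
      g.add_edge("smoking", "lung cancer", relation="causes")
      g.add_edge("gene X", "disease Y", relation="associated_with")

      def pairs_with(graph, relation):
          """Return all (head, tail) entity pairs connected by the given relation type."""
          return [(u, v) for u, v, d in graph.edges(data=True) if d.get("relation") == relation]

      print(pairs_with(g, "treats"))           # [('metformin', 'type 2 diabetes')]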
  • Article
    Aguhyper: a hyperledger-based electronic health record management framework
    (PEERJ INC, 2024) Dedeturk, Beyhan Adanur; Bakir-Gungor, Burcu; 0000-0003-4983-2417; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Dedeturk, Beyhan Adanur; Bakir-Gungor, Burcu
    The increasing importance of healthcare records, particularly given the emergence of new diseases, emphasizes the need for secure electronic storage and dissemination. With these records dispersed across diverse healthcare entities, their physical maintenance proves to be excessively time-consuming. The prevalent management of electronic healthcare records (EHRs) presents inherent security vulnerabilities, including susceptibility to attacks and potential breaches orchestrated by malicious actors. To tackle these challenges, this article introduces AguHyper, a secure storage and sharing solution for EHRs built on a permissioned blockchain framework. AguHyper utilizes Hyperledger Fabric and the InterPlanetary File System (IPFS). Hyperledger Fabric establishes the blockchain network, while IPFS manages the off-chain storage of encrypted data, with hash values securely stored within the blockchain. Focusing on security, privacy, scalability, and data integrity, AguHyper’s decentralized architecture eliminates single points of failure and ensures transparency for all network participants. The study develops a prototype to address gaps identified in prior research, providing insights into blockchain technology applications in healthcare. Detailed analyses of system architecture, AguHyper’s implementation configurations, and performance assessments with diverse datasets are provided. The experimental setup incorporates CouchDB and the Raft consensus mechanism, enabling a thorough comparison of system performance against existing studies in terms of throughput and latency. This contributes significantly to a comprehensive evaluation of the proposed solution and offers a unique perspective on existing literature in the field.
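    A minimal illustration of the off-chain/on-chain split described above: the encrypted record sits in off-chain storage (IPFS in the paper; a plain dict here), while only its hash is kept on the ledger. The names and the toy "ledger" are illustrative assumptions, not Hyperledger Fabric or IPFS APIs.

      import hashlib, json

      off_chain_store = {}        # stands in for IPFS
      ledger = []                 # stands in for the on-chain world state

      def store_record(patient_id, encrypted_record: bytes):
          digest = hashlib.sha256(encrypted_record).hexdigest()
          off_chain_store[digest] = encrypted_record              # off-chain: bulky ciphertext
          ledger.append({"patient": patient_id, "hash": digest})  # on-chain: hash only
          return digest

      def verify_record(digest):
          data = off_chain_store[digest]
          return hashlib.sha256(data).hexdigest() == digest       # integrity check against the chain

      h = store_record("patient-42", b"...ciphertext bytes...")
      print(verify_record(h), json.dumps(ledger))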
  • Article
    Document Classification with Contextually Enriched Word Embeddings
    (Bajece (İstanbul Teknik Ünv), 2024) Mahmood, Raad Saadi; Bakal, Gokhan; Akbas, Ayhan; 0000-0003-2897-3894; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Bakal, Mehmet Gokhan
    The text classification task has a wide range of application domains for distinct purposes, such as the classification of articles, social media posts, and sentiments. As a natural language processing application, machine learning and deep learning techniques are intensively utilized in solving such challenges. One common approach is to employ discriminative word features comprising Bag-of-Words and n-grams to conduct text classification experiments. Another powerful approach is to exploit neural network-based models (specifically deep learning models) at the sentence, word, or character level. In this study, we propose a novel approach to classify documents with contextually enriched word embeddings powered by the neighbor words accessible through the trigram word series. In the experiments, the well-known Web of Science dataset is used to evaluate the models. We built various models with and without the proposed approach to monitor their performance. The experimental results showed that the proposed neighborhood-based word embedding enrichment has decent potential for use in further studies.
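    A sketch of the neighborhood-based enrichment idea: each word's vector is blended with the vectors of its left and right neighbors in the trigram window. The toy vectors and the 0.5 weighting are assumptions, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(0)
      vocab = ["the", "network", "detects", "malicious", "traffic"]
      emb = {w: rng.normal(size=8) for w in vocab}        # stand-in word embeddings

      def enrich(tokens, alpha=0.5):
          """Return context-enriched vectors: own vector plus weighted trigram neighbors."""
          out = []
          for i, w in enumerate(tokens):
              v = emb[w].copy()
              neighbors = [emb[tokens[j]] for j in (i - 1, i + 1) if 0 <= j < len(tokens)]
              if neighbors:
                  v += alpha * np.mean(neighbors, axis=0)
              out.append(v)
          return np.vstack(out)

      doc = ["the", "network", "detects", "malicious", "traffic"]
      print(enrich(doc).shape)        # (5, 8) -- one enriched vector per token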
  • Article
    A Comparative Analysis of Convolutional Neural Network Architectures for Binary Image Classification: A Case Study in Skin Cancer Detection
    (Giresun Üniversitesi, 2024) Korkut, Şerife Gül; Kocabaş, Hatice; Kurban, Rifat; 0009-0007-1398-0924; 0009-0003-1760-0555; 0000-0002-0277-2210; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Korkut, Şerife Gül; Kocabaş, Hatice; Kurban, Rifat
    In this study, a comprehensive comparative analysis of Convolutional Neural Network (CNN) architectures for binary image classification is presented with a particular focus on the benefits of transfer learning. The performance and accuracy of prominent CNN models, including MobileNetV3, VGG19, ResNet50, and EfficientNetB0, in classifying skin cancer from binary images are evaluated. Using a pre-trained approach, the impact of transfer learning on the effectiveness of these architectures is investigated, and their strengths and weaknesses within the context of binary image classification are identified. This paper aims to provide valuable insights for selecting the optimal CNN architecture and leveraging transfer learning to achieve superior performance in binary image classification applications, particularly those related to medical image analysis.
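    A hedged sketch of the transfer-learning setup the comparison describes: a pre-trained backbone is frozen and a small binary classification head is added on top. The input size, backbone choice, and head layout are assumptions for illustration, not the paper's configuration.

      import tensorflow as tf

      base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                            input_shape=(224, 224, 3))
      base.trainable = False                                   # keep the pre-trained features frozen

      model = tf.keras.Sequential([
          base,
          tf.keras.layers.GlobalAveragePooling2D(),
          tf.keras.layers.Dropout(0.3),
          tf.keras.layers.Dense(1, activation="sigmoid"),      # benign vs. malignant (binary head)
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
      model.summary()
      # model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: hypothetical image datasets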
  • Article
    MULTILEVEL THRESHOLDING FOR BRAIN MR IMAGE SEGMENTATION USING SWARM-BASED OPTIMIZATION ALGORITHMS
    (Kahramanmaraş Sütçü İmam Üniversitesi, 2024) Toprak, Ahmet Nusret; Şahin, Ömür; Kurban, Rifat; 0000-0002-0277-2210; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Kurban, Rifat
    Image segmentation, the process of dividing an image into various sets of pixels called segments, is an essential technique in image processing. Image segmentation reduces the complexity of the image and makes it easier to analyze by dividing the image into segments. One of the simplest yet powerful ways of image segmentation is multilevel thresholding, in which pixels are segmented into multiple regions according to their intensities. This study aims to explore and compare the performance of the well-known swarm-based optimization algorithms on the multilevel thresholding-based image segmentation task using brain MR images. Seven swarm-based optimization algorithms, namely Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), Gray Wolf Optimizer (GWO), Moth-Flame Optimization (MFO), Ant Lion Optimization (ALO), Whale Optimization Algorithm (WOA), and Jellyfish Search Optimizer (JS), are compared by applying them to brain MR images to determine threshold levels. In the experiments carried out with the mentioned algorithms, minimum cross-entropy and between-class variance objective functions were employed. Extensive experiments show that JS, ABC, and PSO algorithms outperform others.
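    A compact sketch of the multilevel-thresholding objective mentioned above: given candidate thresholds, the between-class variance of the grey-level histogram is scored; swarm optimizers such as PSO or ABC would search this objective, and a plain random search stands in for them here. The synthetic "image" is an assumption.

      import numpy as np

      rng = np.random.default_rng(0)
      image = np.concatenate([rng.normal(60, 10, 4000),
                              rng.normal(130, 12, 4000),
                              rng.normal(200, 8, 4000)]).clip(0, 255).astype(np.uint8)
      hist = np.bincount(image, minlength=256) / image.size
      levels = np.arange(256)

      def between_class_variance(thresholds):
          edges = [0, *sorted(int(t) for t in thresholds), 256]
          mu_total = (hist * levels).sum()
          var = 0.0
          for lo, hi in zip(edges[:-1], edges[1:]):
              w = hist[lo:hi].sum()
              if w > 0:
                  mu = (hist[lo:hi] * levels[lo:hi]).sum() / w
                  var += w * (mu - mu_total) ** 2
          return var

      # random search stands in for a swarm optimizer over two threshold levels
      best = max((rng.integers(1, 255, size=2) for _ in range(2000)),
                 key=between_class_variance)
      print("thresholds:", sorted(best), "variance:", between_class_variance(best))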
  • Article
    ENHANCING DEEP LEARNING PERFORMANCE THROUGH A GENETIC ALGORITHM-ENHANCED APPROACH: FOCUSING ON LSTM
    (Kahramanmaraş Sütçü İmam Üniversitesi, 2024) Şen, Tarık Üveys; Bakal, Gokhan; 0009-0000-0297-6064; 0000-0003-2897-3894; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Şen, Tarık Üveys; Bakal, Gokhan
    Deep learning has shown remarkable success in various applications, such as image classification, natural language processing, and speech recognition. However, training deep neural networks is challenging due to their complex architecture and the number of parameters required. Genetic algorithms have been proposed as an alternative optimization technique for deep learning, offering an efficient way to find an optimal set of network parameters that minimize the objective function. In this paper, we propose a novel approach integrating genetic algorithms with deep learning, specifically LSTM models, to enhance performance. Our method optimizes crucial hyper-parameters including learning rate, batch size, neuron count per layer, and layer depth through genetic algorithms. Additionally, we conduct a comprehensive analysis of how genetic algorithm parameters influence the optimization process and illustrate their significant impact on improving LSTM model performance. Overall, the presented method provides a powerful mechanism for improving the performance of deep neural networks, and thus we believe that it has significant potential for future applications in the artificial intelligence discipline.
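    A bare-bones genetic algorithm over the hyper-parameters named in the abstract (learning rate, batch size, neurons per layer, layer count). The fitness function below is a placeholder; in the actual study it would be the validation score of an LSTM trained with the candidate settings, and all ranges are assumptions.

      import random

      random.seed(0)
      SPACE = {"lr": [1e-4, 3e-4, 1e-3, 3e-3], "batch": [16, 32, 64, 128],
               "units": [32, 64, 128, 256], "layers": [1, 2, 3]}

      def random_individual():
          return {k: random.choice(v) for k, v in SPACE.items()}

      def fitness(ind):         # placeholder: replace with the LSTM validation score
          return -abs(ind["lr"] - 1e-3) - 0.001 * abs(ind["units"] - 128) - 0.01 * abs(ind["layers"] - 2)

      def crossover(a, b):
          return {k: random.choice([a[k], b[k]]) for k in SPACE}

      def mutate(ind, rate=0.2):
          return {k: (random.choice(SPACE[k]) if random.random() < rate else v) for k, v in ind.items()}

      pop = [random_individual() for _ in range(20)]
      for _ in range(15):       # generations: keep the best half, refill with mutated offspring
          pop.sort(key=fitness, reverse=True)
          parents = pop[:10]
          pop = parents + [mutate(crossover(*random.sample(parents, 2))) for _ in range(10)]
      print(max(pop, key=fitness))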
  • Article
    Network anomaly detection using Deep Autoencoder and parallel Artificial Bee Colony algorithm-trained neural network
    (PEERJ INC, 2024) Hacilar, Hilal; Dedeturk, Bilge Kagan; Bakir-Gungor, Burcu; Gungor, Vehbi Cagri; 0000-0002-5811-6722; 0000-0002-2272-6270; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Hacilar, Hilal; Bakir-Gungor, Burcu
    Cyberattacks are increasingly becoming more complex, which makes intrusion detection extremely difficult. Several intrusion detection approaches have been developed in the literature and utilized to tackle computer security intrusions. Implementing machine learning and deep learning models for network intrusion detection has been a topic of active research in cybersecurity. In this study, artificial neural networks (ANNs), a type of machine learning algorithm, are employed to determine optimal network weight sets during the training phase. Conventional training algorithms, such as backpropagation, may encounter challenges in optimization due to being entrapped within local minima during the iterative optimization process; global search strategies can be slow at locating global minima, and they may suffer from a low detection rate. In the ANN training, the Artificial Bee Colony (ABC) algorithm enables the avoidance of local minimum solutions by conducting a high-performance search in the solution space, but it needs some modifications. To address these challenges, this work suggests a Deep Autoencoder (DAE)-based, vectorized, and parallelized ABC algorithm for training feed-forward artificial neural networks, which is tested on the UNSW-NB15 and NF-UNSW-NB15-v2 datasets. Our experimental results demonstrate that the proposed DAE-based parallel ABC-ANN outperforms existing metaheuristics, showing notable improvements in network intrusion detection. The experimental results reveal a notable improvement in network intrusion detection through this proposed approach, exhibiting an increase in detection rate (DR) from 0.76 to 0.81 and a reduction in false alarm rate (FAR) from 0.016 to 0.005 compared to the ANN-BP algorithm on the UNSW-NB15 dataset. Furthermore, there is a reduction in FAR from 0.006 to 0.0003 compared to the ANN-BP algorithm on the NF-UNSW-NB15-v2 dataset. These findings underscore the effectiveness of our proposed approach in enhancing network security against network intrusions.
  • Article
    An optimal concentric circular antenna array design using atomic orbital search for communication systems
    (Walter de Gruyter GmbH, 2024) Durmus, Ali; Yildirim, Zafer; Kurban, Rifat; Karakose, Ercan; 0000-0002-0277-2210; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Kurban, Rifat
    In this study, optimum radiation patterns of Concentric Circular Antenna Arrays (CCAAs) are obtained by using the Atomic Orbital Search (AOS) algorithm for communication systems. Such systems stand as a nascent technological innovation poised to revolutionize the landscape of wireless communication, distinguished by hallmark features such as an exceptionally high data transmission rate, expanded network capacity, minimal latency, and a commendable quality of service. The most important issue in wireless communication is precise antenna array design. The success of this design depends on suppressing the maximum sidelobe level (MSL) values of the antenna in the far-field radiation region as much as possible. The AOS, a rapid and flexible search method, is a novel physics-based algorithm. The amplitudes and inter-element spacing of CCAAs are optimally determined by utilizing AOS to reduce the MSLs. In this study, CCAAs with three and four rings are considered. The numbers of elements of these CCAAs are 4-6-8, 8-10-12 and 6-12-18-24. The radiation patterns obtained with AOS are compared with results available in the literature, and the AOS method is seen to give better results.
  • Article
    Ensemble Feature Selection for Clustering Damage Modes in Carbon Fiber-Reinforced Polymer Sandwich Composites Using Acoustic Emission
    (John Wiley and Sons Inc, 2024) Gulsen, Abdulkadir; Kolukisa, Burak; Caliskan, Umut; Bakir-Gungor, Burcu; Gungor, Vehbi Cagri; 0000-0002-4250-2880; 0000-0003-0423-4595; 0000-0002-2272-6270; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Gulsen, Abdulkadir; Kolukisa, Burak; Bakir-Gungor, Burcu
    Acoustic emission (AE) serves as a noninvasive technique for real-time structural health monitoring, capturing the stress waves produced by the formation and growth of cracks within a material. This study presents a novel ensemble feature selection methodology to rank features highly relevant to damage modes in AE signals gathered from edgewise compression tests on honeycomb-core carbon fiber-reinforced polymer. Two distinct features, amplitude and peak frequency, are selected for labeling the AE signals. An ensemble-supervised feature selection method ranks feature importance according to these labels. Using the ranking list, unsupervised clustering models are then applied to identify damage modes. The comparative results reveal a robust correlation between the damage modes and the features of counts and energy when amplitude is selected. Similarly, when peak frequency is chosen, a significant association is observed between the damage modes and the features of partial powers 1 and 2. These findings demonstrate that, in addition to the commonly used features, other features, such as partial powers, exhibit a correlation with damage modes.
  • Article
    CCPred: Global and population-specific colorectal cancer prediction and metagenomic biomarker identification at different molecular levels using machine learning techniques
    (ELSEVIER, 2024) Bakir-Gungor, Burcu; Temiz, Mustafa; Inal, Yasin; Cicekyurt, Emre; Yousef, Malik; 0000-0002-2272-6270; 0000-0002-2839-1424; 0009-0002-4373-8526; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Bakir-Gungor, Burcu; Temiz, Mustafa; Inal, Yasin; Cicekyurt, Emre
    Colorectal cancer (CRC) ranks as the third most common cancer globally and the second leading cause of cancer-related deaths. Recent research highlights the pivotal role of the gut microbiota in CRC development and progression. Understanding the complex interplay between disease development and metagenomic data is essential for CRC diagnosis and treatment. Current computational models employ machine learning to identify metagenomic biomarkers associated with CRC, yet there is a need to improve their accuracy through a holistic biological knowledge perspective. This study aims to evaluate CRC-associated metagenomic data at the species, enzyme, and pathway levels by conducting global and population-specific analyses. These analyses utilize relative abundance values from human gut microbiome sequencing data, and robust classification models are built for disease prediction and biomarker identification. For global CRC prediction and biomarker identification, the features that are identified by SelectKBest (SKB), Information Gain (IG), and Extreme Gradient Boosting (XGBoost) methods are combined. Population-based analysis includes within-population, leave-one-dataset-out (LODO) and cross-population approaches. Four classification algorithms are employed for CRC classification. Random Forest achieved an AUC of 0.83 for species data, 0.78 for enzyme data and 0.76 for pathway data globally. On the global scale, potential taxonomic biomarkers include Ruthenibacterium lactatiformans; enzyme biomarkers include RNA 2′ 3′ cyclic 3′ phosphodiesterase; and pathway biomarkers include the pyruvate fermentation to acetone pathway. This study underscores the potential of machine learning models trained on metagenomic data for improved disease prediction and biomarker discovery. The proposed model and associated files are available at https://github.com/TemizMus/CCPRED.
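    A sketch of combining the three selectors named above (SelectKBest, information gain via mutual information, and XGBoost importances) by taking the union of their top-k features. The toy data, the value of k, and the union rule are assumptions; the paper's combination scheme may differ.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
      from xgboost import XGBClassifier

      X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)
      k = 10

      skb = set(SelectKBest(f_classif, k=k).fit(X, y).get_support(indices=True))
      ig = set(np.argsort(mutual_info_classif(X, y, random_state=0))[::-1][:k])
      xgb = set(np.argsort(XGBClassifier(n_estimators=100, eval_metric="logloss")
                           .fit(X, y).feature_importances_)[::-1][:k])

      combined = sorted(skb | ig | xgb)          # union of the three top-k lists
      print(f"{len(combined)} features selected:", combined)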
  • Article
    Lifetime maximization of IoT-enabled smart grid applications using error control strategies
    (ELSEVIER, 2024) Tekin, Nazli; Dedeturk, Bilge Kagan; Gungor, Vehbi Cagri; 0000-0003-0803-8372; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Gungor, Vehbi Cagri
    Recently, with the advancement of Internet of Things (IoT) technology, IoT-enabled Smart Grid (SG) applications have gained tremendous popularity. Ensuring reliable communication in IoT-based SG applications is challenging due to the harsh channel environment often encountered in the power grid. Error Control (EC) techniques have emerged as a promising solution to enhance reliability. Nevertheless, ensuring network reliability requires a substantial amount of energy consumption. In this paper, we formulate a Mixed Integer Programming (MIP) model that considers the energy dissipation of EC techniques to maximize IoT network lifetime while ensuring the desired level of IoT network reliability. We develop meta-heuristic approaches such as Artificial Bee Colony (ABC) and Particle Swarm Optimization (PSO) to address the high computational complexity of large-scale IoT networks. Performance evaluations indicate that the EC-Node strategy, where each IoT node employs the most energy-efficient EC technique, extends network lifetime by at least 8.9% compared to the EC-Net strategy, in which all IoT nodes employ the same EC method for communication. Moreover, the PSO algorithm reduces the computational time by 77% while exhibiting a 2.69% network lifetime decrease compared to the optimal solution.
  • Article
    EdgeBus: Co-Simulation based resource management for heterogeneous mobile edge computing environments
    (ELSEVIER, 2024) Ali, Babar; Golec, Muhammed; Gill, Sukhpal Singh; Wu, Huaming; Cuadrado, Felix; Uhlig, Steve; AGÜ, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü; Golec, Muhammed
    Kubernetes has revolutionized traditional monolithic Internet of Things (IoT) applications into lightweight, decentralized, and independent microservices, thus becoming the de facto standard in the realm of container orchestration. Intelligent and efficient container placement in Mobile Edge Computing (MEC) is challenging owing to user mobility and surplus but heterogeneous computing resources. One solution to constantly changing user locations is to relocate containers closer to the user; however, this leads to additional underutilized active nodes and increases the computational overhead of migration. Conversely, performing few or no migrations leads to higher latency, thus degrading the Quality of Service (QoS). To tackle these challenges, we created a framework named EdgeBus, which enables the co-simulation of container resource management in heterogeneous MEC environments based on Kubernetes. It enables the assessment of the impact of container migrations on resource management, energy, and latency. Further, we propose a mobility and migration cost-aware (MANGO) lightweight scheduler for efficient container management by incorporating migration cost, CPU cores, and memory usage into container scheduling. For user mobility, the Cabspotting dataset is employed, which contains real-world traces of taxi mobility in San Francisco. In the EdgeBus framework, we have created a simulated environment aided by a real-world testbed using Google Kubernetes Engine (GKE) to measure the performance of the MANGO scheduler in comparison to baseline schedulers such as IMPALA-based MobileKube, Latency Greedy, and Binpacking. Finally, extensive experiments have been conducted, which demonstrate the effectiveness of MANGO in terms of latency and the number of migrations.
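    A toy sketch of the kind of cost-aware placement scoring the abstract attributes to MANGO: candidate nodes are ranked by a weighted sum of estimated migration cost, CPU load, memory usage, and latency to the user. The node data, weights, and scoring formula are illustrative assumptions, not the paper's scheduler.

      from dataclasses import dataclass

      @dataclass
      class Node:
          name: str
          cpu_load: float           # fraction of cores in use
          mem_load: float           # fraction of memory in use
          latency_to_user_ms: float

      def placement_score(node, migrating, w_cpu=0.3, w_mem=0.2, w_lat=0.4, w_mig=0.1):
          migration_cost = 1.0 if migrating else 0.0        # penalize moving the container
          return (w_cpu * node.cpu_load + w_mem * node.mem_load
                  + w_lat * node.latency_to_user_ms / 100.0 + w_mig * migration_cost)

      nodes = [Node("edge-a", 0.7, 0.5, 12), Node("edge-b", 0.4, 0.6, 35), Node("cloud", 0.2, 0.3, 90)]
      current = "edge-a"
      best = min(nodes, key=lambda n: placement_score(n, migrating=(n.name != current)))
      print("place container on:", best.name)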