Inal, Yasin; Bakal, Gokhan; Esit, Muhammed
Date: 2025-09-25
Year: 2025
ISBN: 9798331514822
DOI: https://doi.org/10.1109/ISAS66241.2025.11101791
Handle: https://hdl.handle.net/20.500.12573/4242
Title: Multi-Method Text Summarization: Evaluating Extractive and BART-Based Approaches on CNN/Daily Mail
Type: Conference Object
Language: en
Access: info:eu-repo/semantics/closedAccess
Scopus EID: 2-s2.0-105014935690

Abstract: With the exponential growth of digital content, efficient text summarization has become increasingly crucial for managing information overload. This paper presents a comprehensive approach to text summarization using both extractive and abstractive methods, implemented on the CNN/Daily Mail dataset. We leverage pre-trained BART (Bidirectional and Auto-Regressive Transformers) models and fine-tuning techniques to generate high-quality summaries. Our approach demonstrates significant improvements: our best model, trained on 287k samples, achieves a ROUGE-1 F1 score of 0.4174, a ROUGE-2 F1 score of 0.1932, and a ROUGE-L F1 score of 0.2910. We provide detailed comparisons between extractive methods and various BART model configurations, analyzing the impact of training dataset size and model architecture on summarization quality. Additionally, we share our implementation through an open-source NLP toolkit to facilitate further research and practical applications in the field.

Keywords: Abstractive Summarization; BART; Bidirectional and Autoregressive Transformer; Deep Learning; Extractive Summarization; Natural Language Processing; Text Summarization; Abstracting; Data Mining; Natural Language Processing Systems; Text Processing; F1 Scores

© 2025 Elsevier B.V. All rights reserved.
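For readers unfamiliar with the reported metrics, the ROUGE-1, ROUGE-2, and ROUGE-L F1 scores cited in the abstract can be illustrated with a small pure-Python sketch. This is an illustrative reimplementation of the metric definitions only, not the paper's evaluation code (which presumably used a standard ROUGE package):

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_f1(candidate, reference, n=1):
    """ROUGE-N F1: harmonic mean of clipped n-gram precision and recall."""
    cand = ngrams(candidate.lower().split(), n)
    ref = ngrams(reference.lower().split(), n)
    overlap = sum((cand & ref).values())  # clipped overlap counts
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def lcs_len(a, b):
    """Length of the longest common subsequence (dynamic programming)."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

def rouge_l_f1(candidate, reference):
    """ROUGE-L F1 based on the longest common subsequence of the token lists."""
    cand, ref = candidate.lower().split(), reference.lower().split()
    lcs = lcs_len(cand, ref)
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(cand), lcs / len(ref)
    return 2 * precision * recall / (precision + recall)
```

For example, a candidate summary "the cat" scored against the reference "the cat sat on the mat" has unigram precision 1.0 and recall 1/3, giving a ROUGE-1 F1 of 0.5.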