Multi-Method Text Summarization: Evaluating Extractive and BART-Based Approaches on CNN/Daily Mail
Date
2025
Publisher
Institute of Electrical and Electronics Engineers Inc.
Green Open Access
No
Publicly Funded
No
Abstract
With the exponential growth of digital content, efficient text summarization has become increasingly crucial for managing information overload. This paper presents a comprehensive approach to text summarization using both extractive and abstractive methods, implemented on the CNN/Daily Mail dataset. We leverage pre-trained BART (Bidirectional and Auto-Regressive Transformers) models and fine-tuning techniques to generate high-quality summaries. Our approach demonstrates significant improvements: our best model, trained on 287k samples, achieves a ROUGE-1 F1 score of 0.4174, a ROUGE-2 F1 score of 0.1932, and a ROUGE-L F1 score of 0.2910. We provide detailed comparisons between extractive methods and various BART model configurations, analyzing the impact of training dataset size and model architecture on summarization quality. Additionally, we share our implementation through an open-source NLP toolkit to facilitate further research and practical applications in the field.
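For context on the pipeline the abstract describes, the sketch below pairs a pre-trained BART summarizer with ROUGE-1/2/L F1 scoring. The record does not name the exact checkpoint, generation settings, or toolkit used in the paper; the Hugging Face facebook/bart-large-cnn checkpoint (BART fine-tuned on CNN/Daily Mail) and the rouge_score package are assumptions for illustration only.

# Minimal sketch (assumptions noted above): summarize with a pre-trained
# BART checkpoint and score the output with ROUGE-1/2/L F1.
from transformers import pipeline
from rouge_score import rouge_scorer

# facebook/bart-large-cnn is BART fine-tuned on CNN/Daily Mail; the paper's
# own checkpoint and fine-tuning setup are not given in this record.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = "..."    # full CNN/Daily Mail article text goes here
reference = "..."  # the article's reference highlights (gold summary)

# Generate an abstractive summary; the length bounds are illustrative.
result = summarizer(article, max_length=142, min_length=56, do_sample=False)
summary = result[0]["summary_text"]

# ROUGE-1/2/L F1 are the metrics reported in the abstract.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
for name, score in scorer.score(reference, summary).items():
    print(f"{name}: F1 = {score.fmeasure:.4f}")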
Keywords
Abstractive Summarization, BART, Deep Learning, Extractive Summarization, Natural Language Processing, Text Summarization, Abstracting, Data Mining, Natural Language Processing Systems, Text Processing, Auto-Regressive, Bidirectional and Auto-Regressive Transformer, F1 Scores, Language Processing, Natural Languages
WoS Q
N/A
Scopus Q
N/A

OpenCitations Citation Count
N/A
Source
9th International Symposium on Innovative Approaches in Smart Technologies (ISAS 2025), Gaziantep
Start Page
1
End Page
7