Extraction of Text Summarization
Keywords:
Text Summarization, LSTM, BART, Llama, NLP, Abstractive Summarization, Extractive Summarization, Deep Learning, CNN/DailyMail, Information Retrieval
Abstract
Text summarization is an essential task in natural language processing that condenses large volumes of text into concise summaries, helping users grasp critical information efficiently. This project aims to leverage deep learning models—LSTM, Llama, and BART—on the CNN/Daily Mail dataset to generate high-quality summaries that capture key elements from news articles. By combining these models, we explore both extractive and abstractive summarization methods, optimizing them to produce coherent, human-like summaries. The LSTM model enables sequential understanding of text, while Llama and BART bring transformer-based approaches for handling complex language structures. This ensemble approach seeks to balance summarization accuracy with semantic preservation, ensuring readability and information retention. The project outcomes are expected to improve information accessibility in various applications, from news aggregation to academic and industry research.
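As a hedged illustration of the abstractive side of this approach (not the authors' exact implementation), the sketch below uses the Hugging Face transformers and datasets libraries to summarize a CNN/DailyMail article with a pre-trained BART checkpoint; the checkpoint name facebook/bart-large-cnn and the generation length settings are assumptions chosen for demonstration.

```python
# Minimal sketch: abstractive summarization of a CNN/DailyMail article with BART.
# Assumes the Hugging Face `transformers` and `datasets` packages are installed;
# the checkpoint and length settings are illustrative, not the paper's configuration.
from datasets import load_dataset
from transformers import pipeline

# Load a single article from the CNN/DailyMail test split (version 3.0.0).
dataset = load_dataset("cnn_dailymail", "3.0.0", split="test[:1]")
article = dataset[0]["article"]

# BART fine-tuned on CNN/DailyMail; truncation keeps long inputs within the model's limit.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
summary = summarizer(article, max_length=130, min_length=30, do_sample=False, truncation=True)

print(summary[0]["summary_text"])
```

In a setup like this, the dataset's reference summaries (the "highlights" field) could serve as ground truth for evaluating generated summaries, for example with ROUGE scores.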
License
Copyright (c) 2025 International Journal of Scientific Research in Science, Engineering and Technology

This work is licensed under a Creative Commons Attribution 4.0 International License.