Towards efficient knowledge extraction: Natural language processing-based summarization of research paper introductions
Abstract
Academic and research papers serve as a primary vehicle for disseminating expertise and discoveries to diverse audiences. The growing volume of academic publishing, with nearly 7 million new papers appearing annually, presents a formidable challenge for students and researchers alike. Summarization tools for research papers have therefore become essential for distilling key insights efficiently. This study examines the effectiveness of pre-trained models, namely text-to-text transfer transformer (T5), bidirectional encoder representations from transformers (BERT), bidirectional and auto-regressive transformer (BART), and pre-training with extracted gap-sentences for abstractive summarization (PEGASUS), on research papers, and introduces a novel hybrid model that merges extractive and abstractive techniques. The quality and accuracy of the generated summaries are assessed through comparative analysis, recall-oriented understudy for gisting evaluation (ROUGE) and bilingual evaluation understudy (BLEU) scores, and author evaluation. This work enhances the accessibility and efficiency of assimilating complex academic content, underscoring the value of advanced summarization tools in making academic knowledge more accessible.
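As an illustrative sketch only, and not the authors' implementation, the ROUGE-1 variant of the ROUGE metric named above scores a generated summary by unigram overlap with a reference summary, combining precision and recall into an F1 value. The example texts below are hypothetical.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: unigram overlap between a candidate and a reference summary."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Hypothetical reference and model-generated summaries
reference = "the model summarizes research paper introductions"
candidate = "the model summarizes introductions of research papers"
score = rouge1_f1(candidate, reference)
```

In practice, published evaluations typically use standard implementations (e.g., the `rouge-score` package) that add stemming and report ROUGE-1, ROUGE-2, and ROUGE-L together; this minimal version shows only the core overlap computation.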
Keywords
BART; BERT; Natural language processing; PEGASUS; Text summarization; Text-to-text transfer transformer
Full Text: PDF
DOI: http://doi.org/10.11591/ijai.v14.i1.pp680-691
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).