Nepali News Headline Generation using mBART Model

Authors

  • Bibek Prasad Paneru
  • Mohit Budhathoki
  • Sumit Panta
  • Arudhi Bohora
  • Divya Bhattarai
  • Sharmila Bista

Keywords:

Digital Journalism, Fine-tuning, Flutter, LoRA, mBART, ROUGE, Self-attention Mechanism, Social Media, Web Scraping

Abstract

With the rapid growth of digital news consumption, generating concise and informative headlines has become essential, as social media users increasingly get their news from these platforms. This study uses mBART, a multilingual transformer-based model, to automatically generate headlines for Nepali news articles. The model was fine-tuned on 86,628 news articles scraped from various Nepali news portals, using a sequence-to-sequence architecture and LoRA. Through its self-attention mechanism, the model captures broader context and outperforms conventional methods. To ensure data quality, the corpus was first filtered, pre-processed, and tokenized. The model captured the structure and relevance of the content well, achieving a ROUGE score of 0.4545. Flutter was used to build an intuitive user interface for entering articles and viewing the generated headlines with ease. This research can be incorporated into social networks, content-aggregation platforms, and news portals, promoting automation in digital journalism and advancing Nepali natural language processing.
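To make the evaluation metric concrete: ROUGE measures word overlap between a generated headline and the reference headline. The sketch below is an illustrative pure-Python ROUGE-1 F1 implementation with simple whitespace tokenization; the study's reported 0.4545 score would have been computed with a full ROUGE toolkit and proper tokenization for Nepali text, so the function name and tokenization here are assumptions for demonstration only.

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Illustrative ROUGE-1 F1: unigram overlap between a reference
    headline and a generated one (whitespace tokenization, an
    oversimplification for Nepali text)."""
    ref_counts = Counter(reference.split())
    cand_counts = Counter(candidate.split())
    # Clipped overlap: each candidate token counts at most as many
    # times as it appears in the reference.
    overlap = sum(min(n, ref_counts[tok]) for tok, n in cand_counts.items())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)
```

For example, a candidate that recovers two of three reference words with no extra words scores precision 1.0 and recall 2/3, giving an F1 of 0.8.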
Published

2026-04-02