Textual Content Augmentation
Summary Generation

This module transforms lengthy paragraphs into concise, informative summaries. It can run in two modes: on demand at query time, or automatically at ingestion time.


On-Demand Summarization

  • Usage: You trigger summary generation manually, in real time, via API requests by specifying the summary filter in the _additional field.
  • Benefit: Summarization is applied only when required, which conserves computational resources.
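As a minimal sketch of the on-demand flow, the helper below builds a GraphQL-style read query that requests the summary filter inside the _additional field. The query shape, endpoint path, and the Article/content class and property names are illustrative assumptions, not the module's exact API:

```python
import json


def build_summary_query(class_name: str, source_property: str) -> str:
    """Build a GraphQL-style query that asks the summarization module
    (via the `summary` filter in `_additional`) to summarize a property
    at read time. Class and property names are illustrative."""
    return (
        "{ Get { %s { %s _additional { "
        'summary(properties: ["%s"]) { property result } } } } }'
        % (class_name, source_property, source_property)
    )


# A request body as it might be POSTed to a /graphql endpoint (assumed path).
request_body = json.dumps({"query": build_summary_query("Article", "content")})
```

Because the summary is computed inside the query, no extra field needs to exist on the stored object; the model runs only for the objects this query returns.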


Automatic Summarization

  • Usage: In automatic mode, summaries are generated and stored as content is ingested and are available instantly in the summary field.
  • Benefit: Pre-generated summaries speed up data retrieval and guarantee immediate access.
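The automatic flow can be sketched as an ingestion hook that computes the summary once and persists it alongside the object. The naive first-sentences summarizer and the in-memory list standing in for storage below are assumptions for illustration, not the module's actual model or persistence layer:

```python
import re


def naive_summarize(text: str, max_sentences: int = 2) -> str:
    """Stand-in summarizer: keep the first few sentences.
    A real deployment would call an abstractive model instead."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return " ".join(sentences[:max_sentences])


def ingest(store: list, content: str) -> dict:
    """Automatic mode: the summary is generated at ingestion time and
    stored in the object's `summary` field, so reads need no model call."""
    obj = {"content": content, "summary": naive_summarize(content)}
    store.append(obj)
    return obj


store = []
doc = ingest(store, "First point. Second point. Third point. Fourth point.")
```

The trade-off versus on-demand mode is visible here: ingestion pays the summarization cost once up front, and every subsequent read is a plain field lookup.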
| Model Name | Training Dataset | Primary Application | Language | Description |
| --- | --- | --- | --- | --- |
| bart-large-cnn | CNN/Daily Mail | Text Summarization | English | A BART model pre-trained on a large corpus and fine-tuned on the CNN/Daily Mail dataset for abstractive text summarization. It is capable of generating coherent, concise summaries. |
| pegasus-xsum | XSum | Text Summarization | English | PEGASUS (Pre-training with Extracted Gap-sentences for Abstractive SUmmarization Sequence-to-sequence models) is designed specifically for abstractive text summarization; this variant is fine-tuned on the XSum dataset to generate informative, concise summaries. |
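As a sketch of how one of these models might be selected in a deployment, the container-compose fragment below is purely illustrative; the service name, image tag, and environment variable are assumptions, not the module's documented configuration keys:

```yaml
# Hypothetical compose fragment: runs a summarization inference service
# backed by the bart-large-cnn model (service and image names assumed).
services:
  summarization-inference:
    image: example/sum-inference:bart-large-cnn  # assumed image name
    environment:
      ENABLE_CUDA: "0"  # assumed flag: CPU-only inference; "1" with a GPU
```

Swapping the model would then be a matter of pointing the image tag (or an equivalent model setting) at pegasus-xsum instead.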