
Transformer-based Text Summarization

Overview:

This project implements a text summarization model using the Transformer architecture. The model is trained on a dataset of news articles and their corresponding headlines. The goal is to generate concise and relevant summaries for given input articles.

Prerequisites:

  • Python 3.x
  • TensorFlow 2.x
  • Openpyxl

Installation:

  1. Clone the repository:
git clone https://github.com/GVHemanth/Transformers-based-Text-Summarization.git
cd Transformers-based-Text-Summarization
  2. Install dependencies:
pip install openpyxl --quiet
  3. Download the dataset:

    • Download the news-article/headline dataset and place it where the notebook expects it. Ensure it matches the expected filename and format (e.g., Inshorts Cleaned Data.xlsx).
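
Once the file is in place, a quick load check helps catch path or format issues early. Below is a minimal sketch, assuming pandas is available alongside openpyxl; the column names "Short" and "Headline" are assumptions, so adjust them to match the actual spreadsheet.

import pandas as pd

# openpyxl is the engine pandas uses to read .xlsx files.
df = pd.read_excel("Inshorts Cleaned Data.xlsx", engine="openpyxl")

# Assumed column names -- check the sheet and adjust.
articles = df["Short"].astype(str).tolist()
headlines = df["Headline"].astype(str).tolist()

print(f"Loaded {len(articles)} article/headline pairs")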

Usage:

  1. Run the main notebook:
jupyter notebook transformer_summarization.ipynb
  2. Monitor training progress:

    • The notebook trains the Transformer model for a specified number of epochs. You can adjust hyperparameters in the notebook before starting a run.
  3. Evaluate the model:

    • After training, the notebook includes an example of how to use the trained model to generate summaries for new input articles, along the lines of the sketch below.
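
The notebook's own helper may differ in detail; the following is a minimal greedy-decoding sketch. The transformer call signature, the tokenizer's encode/decode methods, and the start_id/end_id attributes are all assumptions standing in for the notebook's objects.

import tensorflow as tf

def summarize(article, transformer, tokenizer, max_length=40):
    """Greedy decoding sketch; adapt the names to the notebook's API."""
    # Encode the article and add a batch dimension.
    encoder_input = tf.expand_dims(tokenizer.encode(article), 0)

    # Seed the decoder with the assumed start-of-sequence id.
    output = tf.expand_dims([tokenizer.start_id], 0)

    for _ in range(max_length):
        # Assumed call signature; predictions has shape
        # (batch, target_len, vocab_size).
        predictions = transformer([encoder_input, output], training=False)
        # Greedily pick the most likely next token.
        next_id = tf.argmax(predictions[:, -1, :], axis=-1, output_type=tf.int32)
        if int(next_id[0]) == tokenizer.end_id:
            break
        output = tf.concat([output, tf.expand_dims(next_id, -1)], axis=-1)

    # Drop the start token and decode back to text.
    return tokenizer.decode(output[0, 1:].numpy().tolist())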

Customization:

  • Adjust hyperparameters such as num_layers, d_model, num_heads, dff, and dropout_rate in the notebook to experiment with different configurations (see the sketch after this list).

  • Modify the training loop to suit your specific use case or integrate the model into your applications.
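
As a reference point, here is one plausible configuration using the hyperparameter names above; the values are illustrative assumptions, not the repo's defaults.

num_layers = 4      # depth of the encoder and decoder stacks
d_model = 128       # embedding / hidden dimension
num_heads = 8       # attention heads (must evenly divide d_model)
dff = 512           # inner dimension of the feed-forward blocks
dropout_rate = 0.1  # dropout inside attention and feed-forward layers

Smaller values train faster on modest hardware; the base configuration from the original Transformer paper (num_layers=6, d_model=512, num_heads=8, dff=2048) needs considerably more memory and data.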

Results:

  • The model's performance can be evaluated by comparing the generated summaries with the actual headlines from the dataset.
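
For a quantitative signal beyond spot-checking, ROUGE is the usual choice. Here is a hedged sketch using the third-party rouge-score package (pip install rouge-score; it is not bundled with this repo), where generated and references are assumed lists of model outputs and ground-truth headlines.

from rouge_score import rouge_scorer

def average_rouge(generated, references):
    # Score each (hypothesis, reference) pair and average the F1 values.
    scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
    totals = {"rouge1": 0.0, "rougeL": 0.0}
    for hyp, ref in zip(generated, references):
        scores = scorer.score(ref, hyp)  # argument order: (target, prediction)
        for key in totals:
            totals[key] += scores[key].fmeasure
    return {key: total / len(generated) for key, total in totals.items()}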

Model Checkpoints:

  • Model checkpoints will be saved during training in the "checkpoints" directory. You can use these checkpoints to restore the trained model.
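
Restoring follows the standard TF 2.x pattern; the sketch below assumes the notebook tracks the model and optimizer with tf.train.Checkpoint, and the transformer/optimizer arguments stand in for the notebook's own objects.

import tensorflow as tf

def restore_latest(transformer, optimizer, ckpt_dir="./checkpoints"):
    # Bundle the model and optimizer state into one checkpoint object.
    checkpoint = tf.train.Checkpoint(transformer=transformer, optimizer=optimizer)
    manager = tf.train.CheckpointManager(checkpoint, ckpt_dir, max_to_keep=5)

    # Restore the most recent checkpoint, if one exists.
    if manager.latest_checkpoint:
        checkpoint.restore(manager.latest_checkpoint)
        print(f"Restored from {manager.latest_checkpoint}")
    else:
        print("No checkpoint found; training from scratch.")
    return manager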

Contributing:

  • Contributions are welcome! Feel free to submit issues or pull requests.

License:

  • This project is licensed under the MIT License - see the LICENSE file for details.
