TransformerSum


TransformerSum is a library that aims to make it easy to train, evaluate, and use transformer-based machine learning models for automatic text summarization.

It features tight integration with huggingface/transformers, which makes it easy to use a wide variety of architectures and pre-trained models.
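As a minimal illustration of the underlying transformers API (not TransformerSum's own interface), swapping the base encoder is typically just a matter of changing a model name:

```python
from transformers import AutoModel, AutoTokenizer

# Any encoder checkpoint from the Hugging Face Hub can serve as the base
# model; "distilroberta-base" is just an illustrative choice.
tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")
model = AutoModel.from_pretrained("distilroberta-base")
```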

There is a heavy emphasis on code readability and interpretability so that both beginners and experts can build new components. Both the extractive and abstractive model classes are written using pytorch_lightning, which handles the PyTorch training loop logic, enabling advanced features such as 16-bit precision, multi-GPU training, and much more.
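To sketch the pattern (with a toy module, not TransformerSum's actual classes): a pytorch_lightning `LightningModule` defines only the model and its training step, while the `Trainer` supplies the loop and enables the advanced features via flags.

```python
import pytorch_lightning as pl
import torch
from torch import nn


class ToySentenceScorer(pl.LightningModule):
    """Illustrative skeleton only; the real model classes are far more complete."""

    def __init__(self):
        super().__init__()
        self.scorer = nn.Linear(768, 1)  # score each sentence embedding

    def training_step(self, batch, batch_idx):
        embeddings, labels = batch  # (batch, sentences, 768) and (batch, sentences)
        logits = self.scorer(embeddings).squeeze(-1)
        loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-4)


# The Trainer owns the loop; 16-bit precision and multi-GPU training are
# flags rather than custom code. Exact flag names vary across
# pytorch_lightning versions (e.g. gpus=2 vs. devices=2).
trainer = pl.Trainer(precision=16, accelerator="gpu", devices=2, max_epochs=3)
```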

TransformerSum supports both extractive and abstractive summarization of long sequences (4,096 to 16,384 tokens) using the Longformer (extractive) and LongformerEncoderDecoder (abstractive), which combines BART with the Longformer's attention mechanism. TransformerSum also contains models that can run on resource-limited devices while still maintaining high levels of accuracy.
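LongformerEncoderDecoder was later upstreamed into huggingface/transformers as LED, so a rough standalone sketch of long-document abstractive summarization might look like the following (the checkpoint name and generation settings are illustrative, not the library's defaults):

```python
from transformers import LEDForConditionalGeneration, LEDTokenizer

# "allenai/led-base-16384" accepts inputs of up to 16,384 tokens.
tokenizer = LEDTokenizer.from_pretrained("allenai/led-base-16384")
model = LEDForConditionalGeneration.from_pretrained("allenai/led-base-16384")

document = "..."  # a long document, up to roughly 16k tokens
inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=16384)
summary_ids = model.generate(inputs["input_ids"], max_length=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```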

Models are automatically evaluated with the ROUGE metric, but human evaluations can also be conducted by the user.
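The reported ROUGE numbers can be reproduced standalone, for example with the rouge-score package (a sketch, not the library's internal evaluation code):

```python
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
reference = "The quick brown fox jumped over the lazy dog."
candidate = "A quick brown fox jumps over a lazy dog."

# Each entry is a Score tuple with precision, recall, and F1 (fmeasure).
for name, score in scorer.score(reference, candidate).items():
    print(f"{name}: P={score.precision:.2f} R={score.recall:.2f} F1={score.fmeasure:.2f}")
```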

Check out the documentation for usage details.

Project link: https://github.com/HHousen/TransformerSum