# 🤗 Transformers Notebooks

You can find here a list of the official notebooks provided by Hugging Face.

We would also like to list interesting content created by the community here. If you wrote a notebook leveraging 🤗 Transformers and would like it to be listed, please open a Pull Request so it can be included under the Community notebooks.

## Hugging Face's notebooks 🤗

| Notebook | Description | Colab |
|:---|:---|:---|
| Getting Started Tokenizers | How to train and use your very own tokenizer | Open in Colab |
| Getting Started Transformers | How to easily start using transformers | Open in Colab |
| How to use Pipelines | Simple and efficient way to use state-of-the-art models on downstream tasks through transformers | Open in Colab |
| How to train a language model | Highlight all the steps to effectively train a Transformer model on custom data | Open in Colab |
| How to generate text | How to use different decoding methods for language generation with transformers | Open in Colab |
| How to export model to ONNX | Highlight how to export and run inference workloads through ONNX | |
| How to use Benchmarks | How to benchmark models with transformers | Open in Colab |
| Reformer | How Reformer pushes the limits of language modeling | Open in Colab |
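The tokenizer notebook above trains a subword vocabulary with the 🤗 Tokenizers library. To give a feel for the byte-pair-encoding idea it builds on, here is a rough, library-free sketch of a single BPE merge step; the function names and toy corpus are invented for illustration and are not the library's API:

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a corpus and return the most frequent.

    `words` maps a tuple of symbols to its corpus frequency, e.g. {("l", "o", "w"): 5}.
    """
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    """Apply one BPE merge: fuse every occurrence of `pair` into a single symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: word (pre-split into characters) -> frequency
corpus = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2,
          ("l", "o", "g"): 1, ("n", "e", "w"): 3}
pair = most_frequent_pair(corpus)   # ("l", "o"), seen 8 times
corpus = merge_pair(corpus, pair)   # "l" + "o" becomes the single symbol "lo"
```

A real tokenizer repeats this loop until the vocabulary reaches a target size; the notebook shows how to do that (plus normalization and pre-tokenization) with the actual library.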

## Community notebooks

| Notebook | Description | Author | Colab |
|:---|:---|:---|:---|
| Train T5 in TensorFlow 2 | How to train T5 for any task using TensorFlow 2. This notebook demonstrates a Question & Answer task implemented in TensorFlow 2 using SQuAD | Muhammad Harris | Open in Colab |
| Train T5 on TPU | How to train T5 on SQuAD with Transformers and nlp | Suraj Patil | Open in Colab |
| Fine-tune T5 for Classification and Multiple Choice | How to fine-tune T5 for classification and multiple choice tasks using a text-to-text format with PyTorch Lightning | Suraj Patil | Open in Colab |
| Fine-tune DialoGPT on New Datasets and Languages | How to fine-tune the DialoGPT model on a new dataset for open-dialog conversational chatbots | Nathan Cooper | Open in Colab |
| Long Sequence Modeling with Reformer | How to train on sequences as long as 500,000 tokens with Reformer | Patrick von Platen | Open in Colab |
| Fine-tune BART for Summarization | How to fine-tune BART for summarization with fastai using blurr | Wayde Gilliam | Open in Colab |
| Fine-tune a pre-trained Transformer on anyone's tweets | How to generate tweets in the style of your favorite Twitter account by fine-tuning a GPT-2 model | Boris Dayma | Open in Colab |
| A Step by Step Guide to Tracking Hugging Face Model Performance | A quick tutorial for training NLP models with Hugging Face and visualizing their performance with Weights & Biases | Jack Morris | Open in Colab |
| Pretrain Longformer | How to build a "long" version of existing pretrained models | Iz Beltagy | Open in Colab |
| Fine-tune Longformer for QA | How to fine-tune a Longformer model for the QA task | Suraj Patil | Open in Colab |
| Evaluate Model with 🤗nlp | How to evaluate Longformer on TriviaQA with nlp | Patrick von Platen | Open in Colab |
| Fine-tune T5 for Sentiment Span Extraction | How to fine-tune T5 for sentiment span extraction using a text-to-text format with PyTorch Lightning | Lorenzo Ampil | Open in Colab |
| Fine-tune DistilBert for Multiclass Classification | How to fine-tune DistilBERT for multiclass classification with PyTorch | Abhishek Kumar Mishra | Open in Colab |
| Fine-tune BERT for Multi-label Classification | How to fine-tune BERT for multi-label classification using PyTorch | Abhishek Kumar Mishra | Open in Colab |
| Fine-tune T5 for Summarization | How to fine-tune T5 for summarization in PyTorch and track experiments with WandB | Abhishek Kumar Mishra | Open in Colab |
| Speed up Fine-Tuning in Transformers with Dynamic Padding / Bucketing | How to speed up fine-tuning by a factor of 2 using dynamic padding / bucketing | Michael Benesty | Open in Colab |
| Pretrain Reformer for Masked Language Modeling | How to train a Reformer model with bi-directional self-attention layers | Patrick von Platen | Open in Colab |
| Expand and Fine Tune Sci-BERT | How to increase the vocabulary of a pretrained SciBERT model from AllenAI on the CORD dataset and pipeline it | Tanmay Thakur | Open in Colab |
| Fine-tune Electra and interpret with Integrated Gradients | How to fine-tune Electra for sentiment analysis and interpret predictions with Captum Integrated Gradients | Eliza Szczechla | Open in Colab |
| Fine-tune a non-English GPT-2 Model with Trainer class | How to fine-tune a non-English GPT-2 model with the Trainer class | Philipp Schmid | Open in Colab |
| Fine-tune a DistilBERT Model for Multi Label Classification task | How to fine-tune a DistilBERT model for the multi-label classification task | Dhaval Taunk | Open in Colab |
| Fine-tune ALBERT for sentence-pair classification | How to fine-tune an ALBERT model or another BERT-based model for the sentence-pair classification task | Nadir El Manouzi | Open in Colab |
| Fine-tune RoBERTa for sentiment analysis | How to fine-tune a RoBERTa model for sentiment analysis | Dhaval Taunk | Open in Colab |
| Evaluating Question Generation Models | How accurate are the answers to questions generated by your seq2seq transformer model? | Pascal Zoleko | Open in Colab |
| Classify text with DistilBERT and TensorFlow | How to fine-tune DistilBERT for text classification in TensorFlow | Peter Bayerle | Open in Colab |
| Leverage BERT for Encoder-Decoder Summarization on CNN/Dailymail | How to warm-start an EncoderDecoderModel with a bert-base-uncased checkpoint for summarization on CNN/Dailymail | Patrick von Platen | Open in Colab |
| Leverage RoBERTa for Encoder-Decoder Summarization on BBC XSum | How to warm-start a shared EncoderDecoderModel with a roberta-base checkpoint for summarization on BBC/XSum | Patrick von Platen | Open in Colab |
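Michael Benesty's dynamic padding / bucketing notebook gets its speedup by padding each batch only to that batch's longest sequence, rather than to a global maximum. A minimal, library-free sketch of the idea follows; the function name and toy data are invented for illustration, and the notebook itself shows how to wire this into a real PyTorch data loader:

```python
def bucketed_batches(token_ids, batch_size, pad_id=0):
    """Yield batches of sequences padded only to each batch's own max length.

    Sorting by length first ("bucketing") groups similarly sized sequences,
    so most batches need far less padding than a fixed-length scheme.
    """
    order = sorted(range(len(token_ids)), key=lambda i: len(token_ids[i]))
    for start in range(0, len(order), batch_size):
        idx = order[start:start + batch_size]
        width = max(len(token_ids[i]) for i in idx)  # dynamic pad width
        yield [token_ids[i] + [pad_id] * (width - len(token_ids[i])) for i in idx]

# Toy token-id sequences of very different lengths
seqs = [[1, 2, 3, 4, 5, 6], [1, 2], [1, 2, 3], [1]]
batches = list(bucketed_batches(seqs, batch_size=2))
# The two shortest sequences land in the first batch, padded to length 2
# instead of the global maximum of 6 — that saved padding is the speedup.
```

The trade-off is that length-sorted batches are less random, so implementations typically shuffle at the bucket level to keep training stochastic.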