---
license: mit
---

Base model: [gpt2-large](https://huggingface.co/gpt2-large)

Fine-tuned on a dataset of [Vaccine public health tweets](https://github.com/TheRensselaerIDEA/generative-response-modeling) to generate responses. For more information about the dataset, task, and training, see [our paper](https://arxiv.org/abs/2204.04353). This checkpoint corresponds to the lowest validation perplexity (2.82 at 2 epochs) observed during training. See the Training metrics tab for TensorBoard logs.

For input format and usage examples, see our [COVID-19 public health tweet response model](https://huggingface.co/TheRensselaerIDEA/gpt2-large-covid-tweet-response).
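
A minimal loading/generation sketch with the standard `transformers` API is shown below. The repository ID is assumed to be this model card's Hub ID, and the placeholder prompt does not follow the required input format; consult the linked COVID-19 response model card and the paper for the exact prompt construction.

```python
# Sketch: load the checkpoint and sample a response with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID for this repository; adjust if the card is hosted elsewhere.
model_id = "TheRensselaerIDEA/gpt2-large-vaccine-tweet-response"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Placeholder prompt: replace with the input format documented in the
# COVID-19 public health tweet response model card.
prompt = "Vaccines are now available at my local pharmacy."
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```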