---
license: mpl-2.0
language:
- en
- code
---
A GPT-2-style neural network with 50 million parameters, trained from scratch on 4 GB of Python scripts.
Made as a toy.
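The parameter count of a GPT-2-style decoder follows directly from its configuration (vocabulary size, context length, embedding width, number of layers). The exact configuration of this model is not documented here, so the sketch below is only illustrative: it shows one hypothetical configuration that lands near 50 million parameters, assuming tied input/output embeddings.

```python
def gpt2_param_count(vocab_size: int, n_positions: int, n_embd: int, n_layer: int) -> int:
    """Approximate parameter count of a GPT-2-style decoder
    with tied input/output embeddings."""
    # Token and position embedding tables.
    embeddings = vocab_size * n_embd + n_positions * n_embd
    # Per transformer block: fused QKV projection + output projection
    # (weights and biases) ...
    attn = (n_embd * 3 * n_embd + 3 * n_embd) + (n_embd * n_embd + n_embd)
    # ... a 4x-expansion MLP ...
    mlp = (n_embd * 4 * n_embd + 4 * n_embd) + (4 * n_embd * n_embd + n_embd)
    # ... and two LayerNorms (scale + bias each).
    lns = 2 * 2 * n_embd
    per_block = attn + mlp + lns
    # Final LayerNorm; the LM head shares weights with the token embedding.
    final_ln = 2 * n_embd
    return embeddings + n_layer * per_block + final_ln

# A hypothetical ~50M configuration (GPT-2's BPE vocabulary size,
# 8 layers of width 512); the real model may differ.
n = gpt2_param_count(vocab_size=50257, n_positions=1024, n_embd=512, n_layer=8)
print(f"{n:,}")  # roughly 51.5 million
```

This is a back-of-the-envelope formula, not the model's actual architecture; the dominant terms are the embedding table (`vocab_size * n_embd`) and the roughly `12 * n_embd**2` weights per block.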