---
license: mpl-2.0
language:
  - en
  - code
---

PythonGPT

A GPT-2-style neural network with 50 million parameters, trained from scratch on 4 GB of Python scripts.

Made as a toy.
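
If the checkpoint is a standard GPT-2 architecture compatible with Hugging Face `transformers`, it could be used roughly like the sketch below. The repo id is a placeholder (this card does not state the exact Hub id), and the sampling settings are arbitrary examples.

```python
# Minimal usage sketch -- assumes a GPT-2-compatible checkpoint on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<user>/pyGPT-50M"  # placeholder: replace with the actual repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt the model with the start of a Python function and sample a completion.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```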