---
language: ja
license: mit
---

# Japanese DistilBERT Pretrained Model
A Japanese DistilBERT pretrained model, trained on [Wikipedia](https://ja.wikipedia.org/).

See [here](https://github.com/BandaiNamcoResearchInc/DistilBERT-base-jp/blob/master/docs/GUIDE.md) for a quickstart guide in Japanese.
## Table of Contents

1. [Introduction](#introduction)
1. [Requirements](#requirements)
1. [Usage](#usage)
1. [License](#license)
## Introduction

DistilBERT is a small, fast, cheap, and light Transformer model based on the BERT architecture. It has 40% fewer parameters than BERT-base and runs 60% faster, while preserving 97% of BERT's performance as measured on the GLUE language understanding benchmark.
This model was trained with the official Hugging Face implementation from [here](https://github.com/huggingface/transformers/tree/master/examples/distillation) for 2 weeks on an AWS p3dn.24xlarge instance.
More details about distillation can be found in the following paper:

["DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter"](https://arxiv.org/abs/1910.01108) by Sanh et al. (2019).
The teacher model is [the pretrained Japanese BERT model from TOHOKU NLP LAB](https://www.nlp.ecei.tohoku.ac.jp/news-release/3284/).
Currently, only PyTorch-compatible weights are available. TensorFlow checkpoints can be generated by following the [official guide](https://github.com/huggingface/transformers), as sketched below.
## Requirements

```
torch>=1.3.1
torchvision>=0.4.2
transformers>=2.5.0
tensorboard>=1.14.0
tensorboardX==1.8
scikit-learn>=0.21.0
mecab-python3
```
## Usage

### Download model

Please download and unzip [DistilBERT-base-jp.zip](https://github.com/BandaiNamcoResearchInc/DistilBERT-base-jp/releases).
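For completeness, unpacking can also be scripted. A hypothetical sketch using only the standard library, assuming the archive has already been downloaded from the releases page above:

```python
# Hypothetical sketch: unpack the downloaded archive with the standard library.
import zipfile

with zipfile.ZipFile("DistilBERT-base-jp.zip") as archive:
    archive.extractall("DistilBERT-base-jp")  # this directory is LOCAL_PATH below
```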
### Use model

```python
# Read from a local path
from transformers import AutoModel, AutoTokenizer

# The tokenizer is inherited from the teacher model (cl-tohoku BERT)
tokenizer = AutoTokenizer.from_pretrained("cl-tohoku/bert-base-japanese-whole-word-masking")
model = AutoModel.from_pretrained("LOCAL_PATH")
```
LOCAL_PATH is the path to the directory where the above file is unzipped. It should contain 3 files:

- pytorch_model.bin
- config.json
- vocab.txt
or

```python
# Download from the Hugging Face model hub
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cl-tohoku/bert-base-japanese-whole-word-masking")
model = AutoModel.from_pretrained("bandainamco-mirai/distilbert-base-japanese")
```
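With the tokenizer and model loaded either way, hidden states can be extracted as usual. A minimal inference sketch, assuming the MeCab dependency from the requirements above is installed for Japanese tokenization:

```python
# Sketch: encode a sentence and take the last-layer hidden states.
import torch

text = "こんにちは、世界。"  # arbitrary example sentence
input_ids = tokenizer.encode(text, return_tensors="pt")  # adds special tokens
with torch.no_grad():
    last_hidden_state = model(input_ids)[0]
print(last_hidden_state.shape)  # (1, sequence_length, 768)
```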
## License

Copyright (c) 2020 BANDAI NAMCO Research Inc.

Released under the MIT license.

https://opensource.org/licenses/mit-license.php