---
language: ja
license: mit
---
# Japanese DistilBERT Pretrained Model
A Japanese DistilBERT pretrained model, which was trained on [Wikipedia](https://ja.wikipedia.org/).
See [here](https://github.com/BandaiNamcoResearchInc/DistilBERT-base-jp/blob/master/docs/GUIDE.md) for a quickstart guide in Japanese.
## Table of Contents
1. [Introduction](#introduction)
1. [Requirements](#requirements)
1. [Usage](#usage)
1. [License](#license)
## Introduction
DistilBERT is a small, fast, cheap, and light Transformer model based on the BERT architecture. It has 40% fewer parameters than BERT-base and runs 60% faster, while preserving 97% of BERT's performance as measured on the GLUE language understanding benchmark.
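As a rough sanity check on those numbers, the parameter count can be read directly off the loaded model. A minimal sketch, assuming the weights described below are available and using this repository's hub identifier:

```python
from transformers import AutoModel

# Load the distilled model and count its parameters
model = AutoModel.from_pretrained("bandainamco-mirai/distilbert-base-japanese")
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params:,} parameters")
```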
This model was trained with the official Hugging Face implementation from [here](https://github.com/huggingface/transformers/tree/master/examples/distillation) for two weeks on an AWS p3dn.24xlarge instance.
More details about distillation can be found in the following paper:
["DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter"](https://arxiv.org/abs/1910.01108) by Sanh et al. (2019).
The teacher model is [the pretrained Japanese BERT model from TOHOKU NLP LAB](https://www.nlp.ecei.tohoku.ac.jp/news-release/3284/).
Currently, only PyTorch-compatible weights are available. TensorFlow checkpoints can be generated by following the [official guide](https://github.com/huggingface/transformers).
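One possible sketch of that conversion (not taken from the guide itself; it assumes TensorFlow is installed and relies on the Transformers `from_pt` flag, with an illustrative output directory):

```python
from transformers import TFAutoModel

# Load the PyTorch weights into the TensorFlow model class via from_pt,
# then write a TensorFlow checkpoint alongside the config.
tf_model = TFAutoModel.from_pretrained(
    "bandainamco-mirai/distilbert-base-japanese", from_pt=True
)
tf_model.save_pretrained("distilbert-base-japanese-tf")  # illustrative path
```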
## Requirements
```
torch>=1.3.1
torchvision>=0.4.2
transformers>=2.5.0
tensorboard>=1.14.0
tensorboardX==1.8
scikit-learn>=0.21.0
mecab-python3
```
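Assuming the list above is saved as a `requirements.txt` file, the dependencies can be installed with pip:

```
pip install -r requirements.txt
```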
## Usage
### Download model
Please download and unzip [DistilBERT-base-jp.zip](https://github.com/BandaiNamcoResearchInc/DistilBERT-base-jp/releases).
### Use model
```python
# Read from local path
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-japanese-whole-word-masking")
model = AutoModel.from_pretrained("LOCAL_PATH")
```
LOCAL_PATH is the path to the directory where the above file was unzipped. It should contain three files:
- pytorch_model.bin
- config.json
- vocab.txt
or load it directly from the Hugging Face model hub:
```python
# Download from the Hugging Face model hub
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-japanese-whole-word-masking")
model = AutoModel.from_pretrained("bandainamco-mirai/distilbert-base-japanese")
```
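Either way, the loaded model can be used to extract contextual embeddings. A minimal sketch continuing from either snippet above (the example sentence and variable names are illustrative):

```python
import torch

# Tokenize a Japanese sentence and run it through the model
input_ids = tokenizer.encode("こんにちは、世界。", return_tensors="pt")
with torch.no_grad():
    outputs = model(input_ids)

# The first element of the output is the final hidden states:
# a tensor of shape (batch_size, sequence_length, hidden_size)
last_hidden_state = outputs[0]
print(last_hidden_state.shape)
```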
## License
Copyright (c) 2020 BANDAI NAMCO Research Inc.
Released under the [MIT license](https://opensource.org/licenses/mit-license.php).