amarahiqbal committed
Commit d3192a8
1 Parent(s): 55f6a76

Use the correct casing for MptConfig and MptForCausalLM.


Updated the auto_map entries for AutoConfig and AutoModelForCausalLM to use the correct casing. In the current transformers release the classes are transformers.MptConfig and transformers.MptForCausalLM, so the mapped class names now match the library.
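A quick way to confirm the expected casing is to import the classes directly from transformers. This is a minimal sketch, assuming a recent transformers release with native MPT support (roughly v4.32 or later):

    from transformers import MptConfig, MptForCausalLM

    # The library exposes the classes as "MptConfig" and "MptForCausalLM",
    # so the auto_map entries in config.json should use the same casing.
    print(MptConfig.__name__)        # -> MptConfig
    print(MptForCausalLM.__name__)   # -> MptForCausalLM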

Files changed (1)
config.json +2 -2
config.json CHANGED
@@ -16,8 +16,8 @@
     "softmax_scale": null
   },
   "auto_map": {
-    "AutoConfig": "mosaicml/mpt-7b--configuration_mpt.MPTConfig",
-    "AutoModelForCausalLM": "mosaicml/mpt-7b--modeling_mpt.MPTForCausalLM"
+    "AutoConfig": "mosaicml/mpt-7b--configuration_mpt.MptConfig",
+    "AutoModelForCausalLM": "mosaicml/mpt-7b--modeling_mpt.MptForCausalLM"
   },
   "d_model": 128,
   "emb_pdrop": 0,