---
license: apache-2.0
---

# Mixtral-6x7B-Instruct-v0.1 (bfloat16)

The Mixtral-6x7B-Instruct-v0.1 model is a derivative of the mistralai/Mixtral-8x7B-Instruct-v0.1 model. It was created by selectively trimming the original model, retaining only experts 0, 2, 4, 5, 6, and 7 in each layer (reducing the expert count from eight to six).

The trimming process was facilitated by the Mixtral-Expert-Trimmer tool, developed specifically for this purpose.
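The snippet below is only a rough sketch of the idea, not the actual Mixtral-Expert-Trimmer code. It assumes the standard `transformers` Mixtral layout, where each decoder layer keeps its experts in `block_sparse_moe.experts` and routes tokens through a `block_sparse_moe.gate` linear layer.

```python
# Rough sketch only -- NOT the actual Mixtral-Expert-Trimmer implementation.
# Assumes the standard `transformers` Mixtral module layout.
import torch
from torch import nn
from transformers import MixtralForCausalLM

KEEP = [0, 2, 4, 5, 6, 7]  # experts retained in each layer

model = MixtralForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    torch_dtype=torch.bfloat16,
)

for layer in model.model.layers:
    moe = layer.block_sparse_moe
    # Keep only the selected expert MLPs.
    moe.experts = nn.ModuleList(moe.experts[i] for i in KEEP)
    # Shrink the router so it only scores the kept experts.
    old_gate = moe.gate
    new_gate = nn.Linear(
        old_gate.in_features, len(KEEP), bias=False, dtype=old_gate.weight.dtype
    )
    with torch.no_grad():
        new_gate.weight.copy_(old_gate.weight[KEEP])
    moe.gate = new_gate
    moe.num_experts = len(KEEP)

model.config.num_local_experts = len(KEEP)
model.save_pretrained("Mixtral-6x7B-Instruct-v0.1")
```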

The model is still in a testing phase; it has not yet been verified to work correctly.
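If you want to try it, the checkpoint should load like any other Mixtral model through `transformers`. The following is an untested sketch; the repository id is an assumption based on this card's name, so adjust it if it differs.

```python
# Untested sketch: loading the trimmed checkpoint like any other Mixtral model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DrNicefellow/Mixtral-6x7B-Instruct-v0.1"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```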

## License

The Mixtral-6x7B-Instruct-v0.1 model is open-source and licensed under the Apache 2.0 License. For more information, please refer to the LICENSE file.

## Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note about which one you want me to drink!