32k Context?

by swiftwater

Hey! I'm a noob to all this, but I love the model. Was wondering if it will run at 32k context length? I get some weird responses when the context goes above 12k. I'm running a 3090 Ti with 24 GB, and memory utilization doesn't max out. Sorry, I don't know my ass from a hole in the ground with this, but your stuff is awesome.

I'm not sure how well this model works with large contexts. I believe it's a Solar-based model, and according to the config.json it supports a 4k context. You may be able to extend that by changing the RoPE scaling settings (rough sketch below). I don't have extensive experience using this model with longer contexts, so you might want to hit up TheBloke's Discord and ask the folks there for advice.
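
Something like this is what I mean, as a minimal sketch using the transformers library. The model id is just a placeholder (swap in the actual repo), and note that output quality past the native 4k window isn't guaranteed even with scaling:

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/some-solar-model"  # placeholder, not the real repo

# Check the native context window advertised in config.json.
config = AutoConfig.from_pretrained(model_id)
print(config.max_position_embeddings)  # e.g. 4096 for Solar-based models

# Dynamic NTK RoPE scaling; factor=2.0 roughly doubles the usable context.
config.rope_scaling = {"type": "dynamic", "factor": 2.0}

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    device_map="auto",  # should fit a 24 GB card, with quantization if needed
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```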

Right on! Thank you!
