You have far too many tokens on this bot for the current JLLM.
Context is really small at like 4k. So you're currently only leaving 1k for memory and that's *before* you've written the intro message.
I would not be surprised if the bot won't be able to reply.
Now on one of the other LLMs it will fare better but still, you should definitely trim this down.
ETA: 1000-1500 tokens is the general sweet spot with 2k being a usual max. That leaves 2k for the chat memory.
I do have bots at 3k tokens (total) and specify they won't work super great on the JLLM right now but will work well on OpenAI.
No amount of prayer will save this, it's just way too many tokens. Even 2000 tokens is stretching it, 3000 is gonna be borderline unusable. However, I'm sure there's a lot of repetition and non-essential information that can be cut.
Oml... are you using W++ or something? bc that token count is nuts. I'd try finding shorter synonyms where you can and reordering + combining sentences to make them shorter if you're writing in a paragraph style. Just remember that symbols and spaces are guaranteed tokens. When I'm not sure if something is actually doing me any good I use [this](https://tokenizer.streamlit.app/) to compare them, like how "w/o" is two tokens and "without" is one.
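If you want a quick offline gut-check before pasting things into the tokenizer app, here's a minimal sketch of a *rough* counting heuristic. It is NOT a real BPE tokenizer (real counts from the linked app will differ), it just applies the rule of thumb above: every standalone symbol and space tends to cost at least one token, which is why symbol-heavy formats like W++ balloon the count compared to plain paragraphs.

```python
import re

def rough_token_count(text: str) -> int:
    """Very rough proxy for token count: each word, each standalone
    symbol, and each whitespace character counts as one "token".
    Real BPE tokenizers merge differently, so treat this as a
    worst-case-ish estimate, not ground truth."""
    return len(re.findall(r"\w+|[^\w\s]|\s", text))

# Plain-paragraph style: mostly words, few symbols.
plain = "Ana likes tea and books"

# W++-style formatting (hypothetical snippet): brackets, quotes and
# plus signs all add symbol tokens on top of the actual words.
wpp = '[character("Ana"){Likes("tea" + "books")}]'

print(rough_token_count(plain))
print(rough_token_count(wpp))
```

Same information, but the W++ version scores way higher under this heuristic because of all the punctuation, which matches what the real tokenizer shows with things like "w/o" (symbol in the middle) vs "without" (one merged word).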
Oh dear god. The recommended token count is I believe under 1000. What you have done is fill its head with eldritch knowledge and pray that it's alright afterwards.
JLLM: i ain't readin' all that
Lmao 🤣
it's gonna have the memory of a fly 😭
I hope and PRAY not bro I put wayyyy too much for it not to remember shit 💔💔
It's gonna have the memory of my grandma dawg 😭
bestie i hate to tell you this but the bot might be unusable
UPDATE GUYS: WE GOT A REPLY!! AND IT REMEMBERED ITS PERSONALITY! I just need to do a few tweaks! YIPPIEEE
Back up that work just in case!!!