Character and prompt shortening

Hey. I was experimenting with shortening my JSON-format character definitions with an LLM. I basically asked it to keep the formatting and keep all the information intact, but condense it to save tokens.
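
For reference, the request was along these lines (a rough sketch using the OpenAI Python client against a chat-completions endpoint; the model name, prompt wording, and `condense_card` helper are just placeholders, not the exact ones I used):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set; any OpenAI-compatible endpoint works

SYSTEM_PROMPT = (
    "You will receive a character definition in JSON. "
    "Rewrite it to use as few tokens as possible while keeping the JSON structure "
    "and preserving every piece of information. Do not drop or summarize away details."
)

def condense_card(card_json: str, model: str = "gpt-4o-mini") -> str:
    """Ask the LLM to condense a character card while keeping its content intact."""
    response = client.chat.completions.create(
        model=model,
        temperature=0,  # deterministic rewrites are easier to compare run to run
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": card_json},
        ],
    )
    return response.choices[0].message.content
```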

The results were a 20-30% token reduction on the final card: usually around 800 tokens from a 1,000-token card, and 900-1,000 tokens from a 1,300-1,500-token card. It may be possible to shorten it even further by stripping the quotation marks out of the JSON format, but different LLMs react to that differently.
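
If you want to check the numbers on your own cards, counting tokens before and after is enough (a quick sketch with tiktoken; the file names are placeholders, and cl100k_base is just one common tokenizer, so counts will differ per backend):

```python
import json

import tiktoken

# cl100k_base is the tokenizer behind many recent OpenAI models;
# other backends tokenize slightly differently, so treat the counts as approximate.
enc = tiktoken.get_encoding("cl100k_base")

def token_count(text: str) -> int:
    return len(enc.encode(text))

original = open("card_original.json", encoding="utf-8").read()
condensed = open("card_condensed.json", encoding="utf-8").read()

before, after = token_count(original), token_count(condensed)
print(f"{before} -> {after} tokens ({100 * (before - after) / before:.0f}% saved)")

# Rough idea of the "strip the quotation marks" variant: the result is no longer
# valid JSON, so whether the model still reads it reliably depends on the backend.
stripped = json.dumps(json.loads(original), separators=(",", ":")).replace('"', "")
print(f"quote-stripped variant: {token_count(stripped)} tokens")
```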

The conclusion remains that an LLM is able to shorten your prompts by 20-30% while preserving the information.

I am not personally limited by context, as I usually run 32-64k, so saving 200-500 tokens only buys one or two messages of roleplay - but with group chats it starts giving higher returns, since LLMs seem to work best with cards below 1,000 tokens. Less often feels better with LLMs: fewer tokens result in better consistency and responses, because the model only has to keep track of concrete information, stripped of the colorful, human clutter we tend to put in our texts.
