If you don't mind, I'm trying to experiment w/ local models more.
Just now getting into messing w/ these but I'm struggling to come up w/ good use cases.
Would you happen to know of any cool OSS model projects that might be good inspiration for a side project?
Wondering what most people use these local models for
I've got a couple I've done and it's been really enjoyable.
I think the real value in using local models is exposing them to personal/unique information that only you have, thus getting novel and unique outcomes that no public model could provide.
1. Project 1 - Self Knowledge
- Download/extract all of my emails and populate into a vector database, like Chroma[0]
- For each prompt do a search of the vector store and return N number of matches
- Provide both prompt and search result to LLM, instructing it to use the search result as context or in the answer itself.
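The retrieve-then-prompt loop above can be sketched in a few lines. This is a toy version: a bag-of-words cosine similarity stands in for Chroma's real embeddings, and the email snippets are made up for illustration.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real setup would use Chroma's embedder.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def top_n(store, query, n=2):
    # Return the n documents most similar to the query.
    q = embed(query)
    return sorted(store, key=lambda doc: cosine(embed(doc), q), reverse=True)[:n]

# Illustrative stand-ins for the extracted emails.
emails = [
    "Meeting with Dana about the Q3 budget on Friday",
    "Your flight to Denver is confirmed for March 3rd",
    "Reminder: dentist appointment next Tuesday",
]
question = "when is my flight to Denver?"
matches = top_n(emails, question)
# Both the matches and the question go to the LLM as one prompt.
prompt = "Context:\n" + "\n".join(matches) + "\n\nQuestion: " + question
```

With a real vector DB, `embed`/`top_n` collapse into a single `collection.query` call, but the shape of the loop is the same: search, then stuff the hits into the prompt as context.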
2. Project 2 - Chat with a Friend
- I exported the chat and text history between me and a good friend that passed away
- I created a vector store of our chat history in chunks, each consisting of 6 back-and-forth interactions
- When I "chat" with the LLM the a search is first conducted for matching chunks from the vector store and then using those as "style" and knowledge context for a response.
Optional: You can use SillyTavern[1] for a more "rich" chat experience
The above lets me chat, at least superficially, with my friend. It's nice for simple interactions and banter; I've found it to be a positive and reflective experience.
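The chunking step in Project 2 might look something like this (the message format and sizes are illustrative; I'm assuming one "interaction" means a message plus its reply):

```python
def chunk_history(messages, exchanges_per_chunk=6):
    """Group a flat list of (speaker, text) messages into chunks of
    roughly 6 back-and-forth exchanges (12 messages) each."""
    size = exchanges_per_chunk * 2  # one exchange = my message + their reply
    chunks = []
    for i in range(0, len(messages), size):
        chunk = messages[i:i + size]
        chunks.append("\n".join(f"{who}: {text}" for who, text in chunk))
    return chunks

# Fake exported history standing in for the real chat log.
history = [("me", f"msg {i}") if i % 2 == 0 else ("friend", f"reply {i}")
           for i in range(30)]
chunks = chunk_history(history)  # 30 messages -> 3 chunks, last one partial
```

Each chunk string then gets embedded and stored; keeping speaker labels inside the chunk is what lets retrieval carry the "style" over.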
No ideas about side projects or anything "productive", but for a concrete example look at SillyTavern: making fictional characters; building narratives, stories, and role-play for tabletop games. You can even have group chats of AI characters interacting. No good use cases for profit, but plenty right now for exploration and fun.
I'm experimenting with using them to help me make a mod for a game (Vic3). It has a lot of config files and I'm using AI to help generate the data structures and parsers/serializers.
It's coming along, but it's very hit-or-miss with the small models I'm using (all my poor 6750XT's 12GB can manage).
I could be doing something wrong though; even with GPT-4 I'm struggling a bit - it's very lazy and doesn't want to fully populate the data structure fields (the game file objects can have dozens/hundreds of fields). I'm probably just not using the right incantation/magic phrase/voodoo?
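For the parser side, here's a minimal sketch of reading Paradox-style `key = value` script into nested dicts. This is a deliberate simplification: real Vic3 files also have duplicate keys, bare lists, and comparison operators that this ignores, and the sample block is made up.

```python
import re

def parse_pdx(text):
    """Parse a simplified Paradox-script fragment into nested dicts."""
    # Tokens are braces, '=', or any run of non-space/non-delimiter chars.
    tokens = re.findall(r"[{}]|=|[^\s{}=]+", text)
    pos = 0

    def block():
        nonlocal pos
        out = {}
        while pos < len(tokens) and tokens[pos] != "}":
            key = tokens[pos]; pos += 1
            assert tokens[pos] == "=", f"expected '=' after {key}"
            pos += 1
            if tokens[pos] == "{":
                pos += 1
                out[key] = block()
                pos += 1  # consume closing brace
            else:
                out[key] = tokens[pos]; pos += 1
        return out

    return block()

sample = """
building_iron_mine = {
    required_construction = 40
    city_type = mine
}
"""
data = parse_pdx(sample)
```

This is roughly the shape I've been asking the models to generate; the recursive-descent part is where the small models tend to fall over.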
One idea that I've been mulling over: given how controllable Linux is from the command line, I think it would be somewhat easy to wire voice-to-text into a local LLM that could control pretty much everything on command.
It would flat out embarrass Alexa. Imagine 'Hal, play a movie' or 'Hal, play some music' and it's all running locally, with your content.
There are a few projects doing this. This one piqued my interest as having a potentially nice UX after some maturity. https://github.com/OpenInterpreter/01
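A toy version of the command-routing piece of that idea, with the speech-to-text and LLM stubbed out (in a real setup the model would translate free-form speech into one of these actions; the command table, media paths, and keyword matching here are all illustrative):

```python
import shlex

# Illustrative phrase -> shell command table. The LLM's job would be
# mapping messy transcripts onto entries like these.
COMMANDS = {
    "play some music": "mpv --shuffle ~/Music",
    "play a movie": "mpv ~/Videos/movie.mkv",
    "dim the lights": "brightnessctl set 30%",
}

def route(transcript):
    """Pick the command whose phrase appears in the transcript, or None.

    Returns an argv list ready for subprocess.run(); keyword matching
    stands in for what the local LLM would actually decide.
    """
    text = transcript.lower()
    for phrase, cmd in COMMANDS.items():
        if phrase in text:
            return shlex.split(cmd)
    return None
```

The hard parts this skips are the wake word, streaming transcription, and keeping the LLM from running commands it shouldn't, which is where projects like 01 come in.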