@prologic@twtxt.net to interact with a locally running LLM, which software are you using? LM Studio, or Ollama, or…?
@bender@twtxt.net Ollama currently. It’s been rock solid.
@prologic@twtxt.net what do you use for a front end?
@prologic@twtxt.net that one is archived. It seems its successor is https://github.com/open-webui/open-webui
@bender@twtxt.net Ahh yeah that’s the one!