Replies: 1 comment
- Yes, we can: connect it to the LLM panel, use the port, type chatgpt3 under Translation, select low VRAM for lighter processing, and start. It works well on my side.

-
DeepSeek is obviously taking the world by storm, and a free, self-hostable OpenAI alternative would be ideal to use with Ballons.
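
  A self-hosted DeepSeek model is typically exposed through an OpenAI-compatible chat endpoint (e.g. by Ollama or vLLM), so a translation request can reuse the standard chat-completions request shape. A minimal sketch, assuming a local server; the base URL and model name below are placeholders, not the project's actual settings:

  ```python
  import json

  def build_translation_request(text, model="deepseek-chat",
                                base_url="http://localhost:8000/v1"):
      """Build the URL and JSON body for an OpenAI-style chat completion.

      base_url and model are hypothetical defaults; point them at
      whatever your local DeepSeek server actually exposes.
      """
      url = f"{base_url}/chat/completions"
      body = {
          "model": model,
          "messages": [
              {"role": "system", "content": "Translate the text to English."},
              {"role": "user", "content": text},
          ],
      }
      return url, json.dumps(body)

  url, payload = build_translation_request("こんにちは")
  ```

  POSTing `payload` to `url` with any HTTP client should return a standard chat-completion response, which is why an OpenAI-compatible backend can be swapped in without changing the calling code.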