Quoted:
I guess it isn’t possible to connect to a local Ollama instance with AppFlowy, right? I host one at home with an NVIDIA GPU and then VPN into the network.
Message:
If by local you mean connecting to a Llama/Mistral model running on the same machine as AppFlowy, it’s possible with the local AI plugin, though it is a paid plugin.
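For the original question (a remote Ollama server reached over a VPN), a quick first step is confirming the instance is reachable from the client machine. Below is a minimal sketch using Ollama's standard HTTP generate endpoint on its default port 11434; the LAN address and model name are placeholders, not anything from the thread:

```python
import json
import urllib.request

# Hypothetical LAN address of the home Ollama server, reached over the VPN.
# 11434 is Ollama's default HTTP API port.
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

payload = {
    "model": "mistral",        # any model already pulled on the server
    "prompt": "Say hello in one sentence.",
    "stream": False,           # return one complete JSON object instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# If this prints a reply, the Ollama instance is reachable over the VPN.
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["response"])
```

Whether AppFlowy itself can point at such a remote endpoint is a separate question from the plugin discussed above; this only verifies network-level access to Ollama.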
Timestamp:
2024-12-01T02:56:09.841000+00:00
Discord Message ID:
1312613182329852076