Quoted:
If by local you mean connecting to a Llama/Mistral model that is running on the same machine as the AppFlowy client app, it's possible with the local AI plugin, though it is a paid plugin.
Message:
Yeah, my local machine doesn't have that much power and I want to run larger models, so I would gladly pay for the AppFlowy Pro version if they allowed a local endpoint (via an IP or domain address). By local I mean on the same network, but not the same computer. (A sketch of that kind of setup follows after this record.)
Timestamp:
2024-12-01T04:32:38.466000+00:00
Attachment:
Discord Message ID:
1312637461582844016
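
For context, the setup being requested looks roughly like the sketch below: a client on one machine sending requests to a model server hosted on a different machine on the same network. Ollama is used here only as one common example of such a server; the LAN IP address and model name are illustrative assumptions, and nothing below reflects an actual AppFlowy API.

```python
# Minimal sketch: query a model served on another machine on the LAN,
# assuming that machine runs Ollama on its default port (11434).
# The IP address and model name below are hypothetical placeholders.
import requests

# Hypothetical LAN host running the model server (not this machine).
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

response = requests.post(
    OLLAMA_URL,
    json={
        "model": "llama3",  # any model the remote server has pulled
        "prompt": "Hello from another machine on the network",
        "stream": False,    # ask for one complete JSON response
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["response"])
```

The point of the sketch is that the endpoint is just an HTTP URL, so letting users swap "localhost" for a LAN IP or domain name would be enough to offload inference to a more powerful machine on the same network.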