Quoted:
yeah, my local machine doesn't have that much power and I want to run larger models, so I would gladly pay for the AppFlowy Pro version if they could allow a local endpoint (via IP or domain address). By local I mean on the same network, but not the same computer.

Message:
Which model do you plan to run on your Ollama?

Timestamp:
2024-12-01T06:21:36.092000+00:00

Attachment:

Discord Message ID:
1312664882373591040
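
For context, a minimal sketch of what pointing at a "local endpoint via IP" could look like in practice, assuming the machine running Ollama was started with OLLAMA_HOST=0.0.0.0 so it listens on the LAN interface; the IP address 192.168.1.50 and the model name llama3 below are placeholders, not values from the conversation:

```python
# Query a remote Ollama instance on the same network by IP.
# Assumes Ollama listens on its default port 11434 on that machine.
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.50:11434/api/generate"  # placeholder LAN IP

payload = {
    "model": "llama3",                       # placeholder: any model pulled on that host
    "prompt": "Say hello in one sentence.",
    "stream": False,                         # single JSON response instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))
    print(body["response"])                  # generated text from the remote Ollama
```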