Hence, at present, if you want to use arbitrary Ollama models, the only option is to write your own web service that offers an API similar to AppFlowy AI's and adapt its responses to the actual models you use in Ollama (for both the embedding and chat models).
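A minimal sketch of what such an adapter layer could look like. The Ollama endpoint paths referenced in the comments (`/api/chat`, `/api/embeddings` on `localhost:11434`) are Ollama's real local HTTP API; the AppFlowy-AI-side field names (`messages`, `content`) are illustrative assumptions, since the exact AppFlowy AI schema is not given here.

```python
# Sketch: reshape requests/responses between a hypothetical AppFlowy-AI-style
# schema and Ollama's local HTTP API (POST http://localhost:11434/api/chat
# and /api/embeddings). AppFlowy-side field names are assumptions.

def to_ollama_chat(request: dict, model: str = "llama3") -> dict:
    """Translate an AppFlowy-style chat request into an Ollama /api/chat payload."""
    return {
        "model": model,
        "messages": [
            {"role": m["role"], "content": m["content"]}
            for m in request.get("messages", [])
        ],
        "stream": False,  # disable streaming for a simple request/response shape
    }

def from_ollama_chat(response: dict) -> dict:
    """Reshape an Ollama /api/chat response into an assumed AppFlowy-style answer."""
    return {"content": response.get("message", {}).get("content", "")}
```

Your web service would wrap these two functions around an HTTP call to Ollama, and do the analogous reshaping for the embedding endpoint.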


Timestamp:
2024-12-01T03:02:01.812000+00:00

Attachment:

Discord Message ID:
1312614658603094036