Local LLM models
AIChatOne allows you to connect the app to any local model you want.
These instructions apply to the AIChatOne Web, Extension, and Windows versions. On the macOS version, requests over the http protocol are blocked due to Apple's security policy. You can still follow the instructions here to connect the macOS app, but with one additional step: you need to set up HTTPS for Ollama. This can be done using various techniques (e.g., a local HTTPS proxy, as sketched below). For more details on running Ollama over HTTPS, please reach out to the Ollama project for support.
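As one example of the local-proxy approach, the sketch below uses Caddy (an assumption on our part, not a tool shipped with AIChatOne or Ollama; any equivalent HTTPS proxy works) to terminate HTTPS locally and forward traffic to Ollama's default HTTP port. The port 8443 is an arbitrary choice.

```bash
# Hypothetical setup: Caddy issues a locally-trusted certificate for
# localhost and forwards HTTPS requests to Ollama's default port (11434).
caddy reverse-proxy --from localhost:8443 --to localhost:11434
```

With the proxy running, point the macOS app at https://localhost:8443 instead of http://localhost:11434. The first time, your system may prompt you to trust Caddy's locally generated certificate authority.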
For the Web, Windows, and macOS versions, you have to set environment variables so that Ollama accepts CORS requests from the app.
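A minimal sketch for Ollama follows, using its documented OLLAMA_ORIGINS variable. Here "*" allows requests from any origin; substitute a specific origin if you want to be stricter. Restart Ollama after setting the variable.

```bash
# macOS: set the variable for subsequently launched apps, then restart Ollama.
launchctl setenv OLLAMA_ORIGINS "*"

# Windows: persist the variable for future sessions, then restart Ollama.
setx OLLAMA_ORIGINS "*"
```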