Local LLMs

AIChatOne allows you to connect the app to any local model you want.

Note: This instruction applies to the AIChatOne Web, Extension, and Windows versions. For the macOS version, Apple's security policy blocks requests over plain HTTP. If you want to connect the macOS app, you can still follow the instructions here, with one additional step: you need to set up HTTPS for Ollama. This can be done using various techniques (e.g., using a local HTTPS proxy). For more details on how to run Ollama over HTTPS, please reach out to the Ollama project for support.
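One way to set up such a local HTTPS proxy (our suggestion, not an official AIChatOne recommendation) is Caddy, a reverse proxy that automatically issues a locally trusted certificate for localhost. In the sketch below, 11434 is Ollama's default port and 8443 is an arbitrary local HTTPS port we chose for illustration:

```sh
# Install Caddy first (e.g. `brew install caddy` on macOS).
# Terminate TLS locally and forward traffic to Ollama's HTTP endpoint.
# 8443 is an arbitrary local port; 11434 is Ollama's default port.
caddy reverse-proxy --from https://localhost:8443 --to http://localhost:11434
```

With the proxy running, point AIChatOne at https://localhost:8443 instead of http://localhost:11434.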

Note: For the Web, Windows, and macOS versions, you have to set environment variables for CORS.
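For Ollama, the environment variable that controls CORS is OLLAMA_ORIGINS (documented in the Ollama FAQ). The sketch below uses the permissive value "*" for simplicity; in practice, restrict it to the origins AIChatOne actually uses. Restart Ollama after setting the variable:

```sh
# macOS: set the variable for launchd, then restart the Ollama app.
launchctl setenv OLLAMA_ORIGINS "*"

# Linux (systemd): add the variable via a service override, then restart:
#   sudo systemctl edit ollama.service
#   -> [Service]
#      Environment="OLLAMA_ORIGINS=*"
sudo systemctl restart ollama

# Windows (Command Prompt): persist the variable, then restart Ollama.
setx OLLAMA_ORIGINS "*"
```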
