# Local LLM models

AIChatOne lets you connect the app to any local model you want.

{% hint style="info" %}
These instructions apply to the AIChatOne Web, Extension, and Windows versions. For the macOS version, requests over the `http` protocol are blocked due to Apple’s security policy. If you want to connect the macOS app, you can still follow the instructions here, with one additional step: you need to set up HTTPS for Ollama. This can be done using various techniques (e.g., a [local HTTPS proxy](https://www.npmjs.com/package/local-ssl-proxy)). For more details on running Ollama over HTTPS, please reach out to the [Ollama project](https://github.com/ollama/ollama) for support.
{% endhint %}
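One way to add the HTTPS step on macOS is to run the `local-ssl-proxy` package mentioned above in front of Ollama. The sketch below uses assumed port numbers (Ollama listens on `11434` by default; `8443` is an arbitrary choice for the HTTPS side):

```shell
# Run a local HTTPS proxy in front of Ollama (sketch; the HTTPS port 8443
# is an assumption -- any free port works; 11434 is Ollama's default port).
npx local-ssl-proxy --source 8443 --target 11434
```

With the proxy running, point AIChatOne at `https://localhost:8443` instead of `http://localhost:11434`.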

{% hint style="info" %}
For the Web, Windows, and macOS versions, you have to set environment variables to allow CORS.
{% endhint %}
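If your local server is Ollama, its `OLLAMA_ORIGINS` environment variable controls which browser origins may call it. A minimal sketch (the wildcard is permissive; you can restrict it to specific origins instead):

```shell
# Allow cross-origin requests to Ollama before starting the server.
# "*" accepts any origin -- replace it with a specific origin to lock it down.
export OLLAMA_ORIGINS="*"
ollama serve
```

On Windows, set the variable via `setx OLLAMA_ORIGINS "*"` (or the System Properties dialog) and restart Ollama so it picks up the change.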
