AIChatOne Docs

Local LLM models


Last updated 1 year ago


AIChatOne lets you connect the app to any local model you want.

This instruction is for the AIChatOne Web, Extension, and Windows versions. For the macOS version, Apple's security policy blocks requests over the http protocol. If you want to connect from the macOS app, you can still follow the instructions here, with one additional step: you need to set up HTTPS for Ollama. This can be done using various techniques (e.g., using a local HTTPS proxy). For more details on how to run Ollama over HTTPS, please reach out to the Ollama project for support.
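One illustrative way to put HTTPS in front of a local Ollama server (this is an assumption, not a setup the original page prescribes) is Caddy's built-in reverse-proxy command, which terminates TLS on a local port and forwards plain-http traffic to Ollama's default port 11434:

```shell
# Sketch: assumes Caddy is installed (https://caddyserver.com).
# Terminates HTTPS on localhost:8443 and forwards requests to the
# Ollama server listening on its default port 11434.
caddy reverse-proxy --from localhost:8443 --to localhost:11434
```

You would then point AIChatOne at `https://localhost:8443` instead of `http://localhost:11434`.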

For the Web, Windows, and macOS versions, you must set environment variables to enable CORS.
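For Ollama specifically, the server reads the allowed browser origins from the `OLLAMA_ORIGINS` environment variable at startup. A minimal sketch (the wildcard below allows every origin, which is convenient for local testing but broader than strictly necessary):

```shell
# Linux/macOS: allow cross-origin requests from any origin,
# then start the Ollama server, which reads OLLAMA_ORIGINS at startup.
export OLLAMA_ORIGINS="*"
ollama serve

# Windows (PowerShell equivalent):
#   $env:OLLAMA_ORIGINS = "*"; ollama serve
```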
