# Self-Hosted LLM on Mac Mini M4

## Overview

1. LM Studio
2. Open WebUI
3. Cloudflare Tunnel

> [!IMPORTANT]
> The whole setup is based on a Mac Mini M4.

## LM Studio

- Download it from the official website.
- Enable local server mode; it serves on `localhost:1234` by default:

```bash
lms server start
```
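
A quick way to confirm the server is reachable is to hit the OpenAI-compatible endpoints LM Studio exposes; `<model_id>` below is a placeholder for whatever model you have loaded:

```bash
# List the models LM Studio is currently serving
curl http://localhost:1234/v1/models

# Send a test chat completion; replace <model_id> with an id from the list above
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "<model_id>", "messages": [{"role": "user", "content": "Hello"}]}'
```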

## Open WebUI

> [!NOTE]
> Open WebUI did not support Python 3.13 at the time of writing.

```bash
python3 -m venv ~/.venv/openwebui   # use Python <= 3.12 (see note above)
source ~/.venv/openwebui/bin/activate
pip install open-webui
nohup open-webui serve > open-webui.log 2>&1 &
```
- Go to `localhost:8080` and register an admin account.
- In Settings > Admin Settings > Connections > Manage OpenAI API Connections, set the URL to `http://127.0.0.1:1234/v1` and the key to `none`.
- Verify the connection to check that the LM Studio models are available.
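
If the page doesn't load, a quick check from the terminal tells you whether the server came up at all (plain curl, nothing Open WebUI-specific):

```bash
# Expect 200 once the server has finished starting
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080

# Otherwise, the startup log is the first place to look
tail -f open-webui.log
```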

## Cloudflare Tunnel

- Prerequisites: a Cloudflare account and a domain name.

- I used Namesilo before and switched to Porkbun this time.

- If not installed, install it with `brew install cloudflared`, then log in:

```bash
brew install cloudflared   # skip if already installed
cloudflared login
```

- Go to the prompted URL to finish the login.
- Create the tunnel:

```bash
cloudflared tunnel create <tunnel_name>
```

- This generates a JSON credentials file in the `~/.cloudflared/` directory.
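
The hostname you will reference in the ingress config below also needs a DNS record pointing at the tunnel; `cloudflared` can create that CNAME for you:

```bash
# Create a CNAME record for <hostname> that routes to this tunnel
cloudflared tunnel route dns <tunnel_name> <hostname>
```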

### Config YAML

- Put this in `~/.cloudflared/config.yml`:

```yaml
tunnel: <tunnel_name>
credentials-file: /Users/<user>/.cloudflared/<LONG-UUID>.json

ingress:
  - hostname: <hostname>
    service: http://localhost:<open_webui_port>
  # Default catch-all rule
  - service: http_status:404
```

- `<open_webui_port>` defaults to 8080.
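
Before starting the tunnel, you can sanity-check the config with the helpers that ship with `cloudflared`:

```bash
# Validate the ingress block of ~/.cloudflared/config.yml
cloudflared tunnel ingress validate

# Show which ingress rule a given request would match
cloudflared tunnel ingress rule https://<hostname>
```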

- Run it in the background:

```bash
nohup cloudflared tunnel run <tunnel_name> > cloudflare_tunnel.log 2>&1 &
```
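
To confirm the tunnel actually connected:

```bash
cloudflared tunnel list                # the tunnel should show active connections
cloudflared tunnel info <tunnel_name>  # connector details
tail -f cloudflare_tunnel.log          # watch for errors on startup
```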