Brief View
- LM Studio
- Open WebUI
- Cloudflare Tunnel
> [!IMPORTANT]
> This whole setup is based on a Mac Mini M4.
LM Studio
- Download LM Studio from the official website.
- Enable local server mode; it serves on `localhost:1234` by default.
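Once the local server is up, you can confirm it responds via LM Studio's OpenAI-compatible API (this assumes the server is running with at least one model loaded):

```shell
# List the models LM Studio currently exposes on its local server
curl http://localhost:1234/v1/models
```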
Open WebUI
> [!NOTE]
> Open WebUI does not support Python 3.13 at the time of writing.
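The install step elided here is presumably the standard pip route; a sketch, assuming a Python environment below 3.13 per the note above:

```shell
# Install and launch Open WebUI (use a Python version below 3.13)
pip install open-webui
open-webui serve   # serves on localhost:8080 by default
```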
- Go to `localhost:8080` and register an admin account.
- In Settings > Admin Settings > Connections > Manage OpenAI API Connections, set the URL to `http://127.0.0.1:1234/v1` and the key to `none`.
- Verify the connection to check that the LM Studio models are available.
Cloudflare Tunnel
Prerequisites: a Cloudflare account and a domain name.
I used Namesilo before and switched to Porkbun this time.
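The commands elided here are presumably the standard Homebrew install plus the browser-based login; a sketch:

```shell
# Install cloudflared and authenticate with your Cloudflare account
brew install cloudflared
cloudflared tunnel login   # prints a login URL to open in the browser
```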
- Go to the prompted URL to finish logging in.
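The missing command is likely the tunnel creation step; the tunnel name below is a placeholder:

```shell
# Create a named tunnel; this writes a credentials JSON under ~/.cloudflared/
cloudflared tunnel create my-tunnel
```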
- A JSON credentials file will be generated in the `~/.cloudflared/` directory.
Config YAML
- Place the config in `~/.cloudflared/config.yml`.
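A minimal sketch of the config, assuming a hypothetical hostname `chat.example.com`; substitute your own tunnel ID, macOS username, and domain:

```yaml
tunnel: <tunnel-id>
credentials-file: /Users/<username>/.cloudflared/<tunnel-id>.json
ingress:
  - hostname: chat.example.com
    service: http://localhost:8080   # Open WebUI's default port
  - service: http_status:404         # catch-all rule required by cloudflared
```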
The Open WebUI port defaults to `8080`.
Run in background
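The elided commands are presumably the DNS routing step plus running the tunnel; `my-tunnel` and the hostname are placeholders matching the config above:

```shell
# Point a DNS record at the tunnel, then run it
cloudflared tunnel route dns my-tunnel chat.example.com
cloudflared tunnel run my-tunnel

# Or install it as a launchd service so it keeps running in the background
sudo cloudflared service install
```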