Bridge fidth.com to your machine
Two ways to give Fidth real access to your files, shell, and desktop. Option 1 (recommended): the bridge daemon. Keep using fidth.com, run a tiny Node script on your PC, and the agent gets local powers via long-polling. Option 2: run Fidth fully locally with `npm run dev`.
Connect a bridge
Generate a one-time pairing for this browser. The session ID is stored in localStorage; the token is shown once and is not retrievable afterwards. Give both to the daemon you run on your machine.
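A minimal sketch of how a daemon might present those pairing credentials on each poll. The `/bridge/poll` endpoint and header names here are assumptions for illustration, not Fidth's documented API:

```javascript
// Hypothetical sketch: the daemon attaches the session ID and token to
// every outbound poll. The token lives only in memory — never on disk.
function buildPollRequest(baseUrl, sessionId, token) {
  return {
    url: `${baseUrl}/bridge/poll?session=${encodeURIComponent(sessionId)}`,
    headers: {
      Authorization: `Bearer ${token}`, // shown once in the UI, then gone
    },
  };
}

const req = buildPollRequest("https://fidth.com", "sess_123", "tok_abc");
console.log(req.url); // → https://fidth.com/bridge/poll?session=sess_123
```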
Run Fidth fully locally (alternative)
If you'd rather not use the cloud at all — clone the project and run it yourself. Same UI, same agents, same tools, no bridge needed.
Capabilities unlocked locally
| Capability | Cloud (fidth.com) | Local mode |
|---|---|---|
| Chat with 70+ models (no key) | ✓ | ✓ |
| Web search, fetch, image+video gen | ✓ | ✓ |
| File attach (drag-drop into chat) | ✓ | ✓ |
| Persistent memory (Upstash KV) | with KV key | file or KV |
| Read / write your local files | — | ✓ read_file / write_file |
| Run shell commands | — | ✓ run_shell |
| Control desktop (mouse, keyboard, screenshot) | — | ✓ computer_use (OpenClaw) |
| Local Ollama models (truly offline) | — | ✓ Llama 3.3, Qwen, Phi-4 etc. |
Setup (5 min)
- Clone the project

  ```
  git clone https://github.com/your-org/fidth-ai
  cd fidth-ai
  ```

  If you don't have it on GitHub yet, the source is on your Desktop at `C:\Users\oninn\OneDrive\Desktop\fidth-ai`.

- Install

  ```
  npm install
  ```
- Create `.env.local` with the powers you want

  ```
  # Optional — better quality models
  GROQ_API_KEY=
  GOOGLE_API_KEY=

  # UNLOCK FILE / SHELL / DESKTOP — local only, NEVER on Vercel
  ENABLE_LOCAL_FS=1
  ENABLE_LOCAL_SHELL=1
  ENABLE_COMPUTER_USE=1

  # Optional: connect to Ollama for offline models
  OLLAMA_BASE_URL=http://localhost:11434
  ```
⚠ Never set these on Vercel. The shell tool, fs tools, and computer_use tool are powerful — they are gated locally on purpose.
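A sketch of how flag-gating like this typically works. The flag names and tool names come from the env file and capability table above; the `enabledTools` helper itself is illustrative, not Fidth's actual code:

```javascript
// Tools are only registered when the corresponding env flag is set,
// so a deployment without the flags (e.g. Vercel) never exposes them.
// enabledTools is an illustrative helper, not Fidth's real implementation.
function enabledTools(env) {
  const tools = ["web_search", "fetch"]; // always available
  if (env.ENABLE_LOCAL_FS === "1") tools.push("read_file", "write_file");
  if (env.ENABLE_LOCAL_SHELL === "1") tools.push("run_shell");
  if (env.ENABLE_COMPUTER_USE === "1") tools.push("computer_use");
  return tools;
}
```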
- Run

  ```
  npm run dev
  ```
Open http://localhost:3000 — same UI as fidth.com, but now with full local access.
Optional: Ollama for fully offline AI
If you want zero cloud dependency: install Ollama, pull a model, and Fidth will use it.
```
ollama pull llama3.3
ollama pull qwen2.5-coder
ollama pull deepseek-r1
ollama pull gemma3
```
The model picker auto-detects whichever models you've pulled.
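Auto-detection like this usually works by querying Ollama's local REST API: `GET /api/tags` returns a JSON object listing every installed model. The parsing helper below is our own illustration, not Fidth's actual code:

```javascript
// Extract model names from Ollama's /api/tags response.
// The response shape ({ models: [{ name: ... }] }) is Ollama's API;
// parseTags / listLocalModels are illustrative helpers.
function parseTags(tagsJson) {
  return (tagsJson.models ?? []).map((m) => m.name);
}

async function listLocalModels(baseUrl = "http://localhost:11434") {
  const res = await fetch(`${baseUrl}/api/tags`);
  return parseTags(await res.json());
}

// Example response in the shape Ollama returns:
const sample = { models: [{ name: "llama3.3:latest" }, { name: "qwen2.5-coder:latest" }] };
console.log(parseTags(sample)); // → ["llama3.3:latest", "qwen2.5-coder:latest"]
```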
Try OpenClaw (desktop control)
With ENABLE_COMPUTER_USE=1 set, go to Agent Builder, pick the OpenClaw ✥ preset, save, and switch to it in the sidebar. Then ask things like:
- "Take a screenshot and tell me what's open."
- "Open Notepad and write 'sovereign by default'."
- "Find my Downloads folder and list the 5 newest files."
OpenClaw refuses financial actions (transfers, trades, orders) by default — you handle those yourself.
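A refusal policy in this spirit can be as simple as a keyword screen before any desktop action runs. The keyword list and `isFinancialAction` helper below are entirely our own assumption, sketched to show the shape of such a guard:

```javascript
// Illustrative guard in the spirit of OpenClaw's financial-action refusal.
// Keyword matching is crude (prone to false positives) — a sketch, not
// the actual policy implementation.
const FINANCIAL_KEYWORDS = ["transfer", "trade", "order", "buy", "sell", "payment"];

function isFinancialAction(request) {
  const text = request.toLowerCase();
  return FINANCIAL_KEYWORDS.some((kw) => text.includes(kw));
}
```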
Architecture: why cloud can't touch your machine
fidth.com runs in Vercel's serverless functions on AWS. When you send a message, the request executes on a stateless container in some datacenter — there's no network path to your PC. This is good (security) and bad (no live local access).
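The bridge works around this by inverting the connection: the daemon on your PC initiates every request outward, so no inbound route to your machine ever exists. A minimal sketch of such a loop, where the endpoint, response shape, and backoff policy are all assumptions for illustration:

```javascript
// Sketch of an outbound long-poll loop. The daemon dials out; the cloud
// only ever answers. Endpoint and payload shape are hypothetical.
function nextDelayMs(attempt) {
  // Exponential backoff after failures, capped at 30 s.
  return Math.min(1000 * 2 ** attempt, 30_000);
}

async function pollLoop(baseUrl, sessionId, token, handleTask) {
  for (let failures = 0; ; ) {
    try {
      const res = await fetch(`${baseUrl}/bridge/poll?session=${sessionId}`, {
        headers: { Authorization: `Bearer ${token}` },
      });
      const { task } = await res.json();
      if (task) await handleTask(task); // run locally, post the result back
      failures = 0;
    } catch {
      await new Promise((r) => setTimeout(r, nextDelayMs(failures++)));
    }
  }
}
```

Stopping the daemon process is the kill switch: with no outbound poll, the cloud has nothing to answer.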
The cloud version still does plenty: 70+ free models, web search, image and video generation, persistent memory, file attach, sub-agent delegation. But for the full sovereign-operator experience — read your code, edit your files, drive your desktop — run it locally.
This is the ConTech bankruptcy-remote self-custody pattern in practice: the engine never touches your machine without an explicit local daemon that you run, with your token, against your workspace. Stop the daemon — the link dies. No institutional override.
The bridge daemon is bound to this same chain of custody: tokens are sovereign-controlled, not platform-issued.