OCRBook

Ollama setup tutorial

Connect OCRBook to your own Ollama server.

Install Ollama, run a model, then paste your server address into OCRBook to use local AI tools (summarize / translate / Q&A).

Important security note

Connections over plain HTTP are not encrypted, so anyone on the same network could read your traffic. Use a trusted network, a VPN, or an HTTPS reverse proxy whenever possible.

Setup steps

Follow these steps on the machine that will run Ollama.
Step 1 Install Ollama

Download and install Ollama for your OS (macOS / Windows / Linux). After install, start the Ollama service/app.

Step 2 Pull a model

Open Terminal (or PowerShell) and run, for example: ollama pull llama3.2. Any model from the Ollama library works.
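For example, to download a model and confirm it is installed (llama3.2 is just an example name; substitute any model you prefer):

```shell
# Download the model weights (first run may take a while)
ollama pull llama3.2

# List locally installed models to confirm the download succeeded
ollama list
```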

Step 3 Run a quick test

Run: ollama run llama3.2 then type a message to confirm it responds.
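You can also confirm the HTTP API is up, since that is what OCRBook talks to. Assuming Ollama is on its default port 11434:

```shell
# Query the local Ollama API; it should respond with a small JSON
# object containing the version, e.g. {"version":"0.5.1"}
curl http://localhost:11434/api/version
```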

Step 4 Allow access from your iPhone/iPad (LAN)

By default, Ollama binds to localhost (127.0.0.1), so it only accepts connections from the same machine. To reach it from another device, set OLLAMA_HOST to 0.0.0.0 and allow the port (11434 by default) through your firewall.

Example (macOS/Linux)
export OLLAMA_HOST=0.0.0.0:11434
ollama serve
Example (Windows PowerShell)
$env:OLLAMA_HOST="0.0.0.0:11434"
ollama serve
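Note that the export/$env: commands above only apply to that terminal session. On Linux, where the installer typically sets Ollama up as a systemd service, set the variable in a service override instead (a sketch, assuming the service is named ollama):

```shell
# Open an override file for the Ollama service
sudo systemctl edit ollama
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
# Then reload systemd and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama
```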
Step 5 Find your server address

On the Ollama machine, find its local IP address (e.g. 192.168.0.10). Your OCRBook server URL will then look like: http://192.168.0.10:11434
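Typical commands for finding the local IP, depending on the OS (interface names like en0 vary by machine):

```shell
# macOS (Wi-Fi is usually en0; use en1 or another interface if needed)
ipconfig getifaddr en0

# Linux (prints the machine's IP addresses)
hostname -I

# Windows (look for "IPv4 Address" in the output)
ipconfig
```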

Step 6 Connect in OCRBook

In OCRBook settings, enable the Ollama integration and paste the server URL. Then try a small test prompt (summarize a short page).
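Before pasting the URL into OCRBook, you can check that the server is reachable from another machine on the same network (replace the IP with the one you found in Step 5):

```shell
# Should return a JSON list of the models you pulled in Step 2
curl http://192.168.0.10:11434/api/tags
```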

Security (recommended)

If you use this outside your home network, protect it.
Prefer VPN or HTTPS

For remote use, connect through VPN (recommended) or put Ollama behind an HTTPS reverse proxy (Caddy / Nginx).
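A minimal Caddy sketch, assuming you own a domain like ai.example.com that points at the proxy machine (the domain is an example; Caddy obtains and renews the TLS certificate automatically):

```
ai.example.com {
    reverse_proxy localhost:11434
}
```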

Restrict access

Limit inbound access to your LAN, or allow only your device IPs. Don’t expose 11434 publicly without protection.

Troubleshooting

OCRBook can’t connect to my server.
Check that your phone and the server are on the same network, the IP address is correct, and port 11434 is allowed through the firewall. Opening the server URL in a browser should display "Ollama is running".
It works on the server but not from another device.
Ollama is probably still bound to localhost. Ensure OLLAMA_HOST is set to 0.0.0.0:11434 and restart the service or app afterwards.
The connection is slow.
Use a smaller model, ensure the server has enough RAM/VRAM, and prefer wired Ethernet or strong Wi‑Fi. Large PDFs should be summarized in smaller chunks.