Run AI models locally on your Mac for free, private generation
AppShots supports Ollama for on-device AI features like text suggestions, translations, and more. With Ollama, your data stays on your local network — no API keys or cloud services needed.
Download and install Ollama from the official website: https://ollama.com
After installation, Ollama will appear in your menu bar.
Open Terminal and pull a model. We recommend qwen3.5 for the best balance of speed and quality:
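In Terminal, the pull looks like this (`ollama list` then confirms the download):

```shell
# Download the recommended model (~2.5 GB)
ollama pull qwen3.5

# Verify which models are installed
ollama list
```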
Other models you can try:
| Model | Size | Best For | Notes |
|---|---|---|---|
| qwen3.5 | ~2.5 GB | General use, fast responses | Recommended |
| llama3.2 | ~2 GB | Lightweight, quick tasks | |
| gemma3 | ~3.3 GB | Creative text generation | |
| mistral | ~4.1 GB | High quality output | Advanced |
By default, Ollama only listens on localhost. To allow your iPhone/iPad to connect, start it with network access enabled:
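Setting the `OLLAMA_HOST` environment variable makes Ollama listen on all interfaces instead of just localhost:

```shell
# Start Ollama listening on all network interfaces (port 11434)
OLLAMA_HOST=0.0.0.0 ollama serve
```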
Tip: If Ollama is already running from the menu bar app, quit it first (click the Ollama icon in the menu bar → Quit Ollama), then run the command above in Terminal.
To make this permanent, you can set the environment variable:
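On macOS, `launchctl setenv` sets the variable for GUI apps such as the Ollama menu bar app; restart Ollama afterwards for it to take effect:

```shell
# Persist OLLAMA_HOST so the menu bar app also listens on the network
launchctl setenv OLLAMA_HOST "0.0.0.0"
# Then quit and relaunch the Ollama menu bar app
```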
Open AppShots on your iPhone or iPad and enter your Mac's address, including the port (for example, http://192.168.1.100:11434).
Find your Mac's IP: go to System Settings → Wi-Fi → click your network → look for "IP Address". Or run ipconfig getifaddr en0 in Terminal.
If AppShots can't find your Mac: make sure both devices are on the same Wi-Fi network, check that Ollama is running with OLLAMA_HOST=0.0.0.0, and try entering your Mac's IP address directly instead of relying on auto-discovery.
If the connection is refused: check your Mac's firewall settings. Go to System Settings → Network → Firewall and ensure Ollama is allowed to accept incoming connections. You may also need to allow port 11434.
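To confirm the server is reachable over the network, you can query Ollama's API from another machine; this sketch assumes your Mac's IP is 192.168.1.100 (substitute your own):

```shell
# Lists installed models as JSON if Ollama is reachable on the network
curl http://192.168.1.100:11434/api/tags
```

If this times out or is refused, the firewall or OLLAMA_HOST setting is the likely culprit.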
If responses are slow: smaller models like qwen3.5 or llama3.2 run faster, especially on machines with less RAM. Close other memory-hungry applications to free up memory. Apple Silicon Macs with unified memory will perform best.
If no models are available: make sure you've pulled at least one model. Run ollama list in Terminal to verify; if the list is empty, run ollama pull qwen3.5 to download one.
Security Note: Running Ollama with OLLAMA_HOST=0.0.0.0 makes it accessible to any device on your local network. Only use this on trusted networks (e.g., your home Wi-Fi). Do not expose Ollama on public networks.
Need help?
Contact us at support@appshots.app