
Ollama Setup Guide

Run AI locally on your Mac for free, private AI generation

AppShots supports Ollama for on-device AI features like text suggestions, translations, and more. With Ollama, your data stays on your local network — no API keys or cloud services needed.

Requirements

  • A Mac with Apple Silicon (M1 or later recommended) or an Intel Mac with a compatible GPU
  • At least 8 GB of RAM (16 GB recommended for larger models)
  • Your iPhone/iPad and Mac must be on the same Wi-Fi network
  • macOS 11 (Big Sur) or later

Step 1: Install Ollama

Download and install Ollama from the official website:

# Visit https://ollama.com and download the macOS app
# Or install via Homebrew:
brew install ollama

After installation, Ollama will appear in your menu bar.

Step 2: Pull a Model

Open Terminal and pull a model. We recommend qwen3.5 for the best balance of speed and quality:

# Recommended model (~2.5 GB)
ollama pull qwen3.5

Other models you can try:

Model      Size      Best For
qwen3.5    ~2.5 GB   General use, fast responses (Recommended)
llama3.2   ~2 GB     Lightweight, quick tasks
gemma3     ~3.3 GB   Creative text generation
mistral    ~4.1 GB   High-quality output (Advanced)

Step 3: Start Ollama with Network Access

By default, Ollama only listens on localhost. To allow your iPhone/iPad to connect, start it with network access enabled:

# Start Ollama with network access
OLLAMA_HOST=0.0.0.0 ollama serve

💡 Tip: If Ollama is already running from the menu bar app, quit it first (click the Ollama icon in the menu bar → Quit Ollama), then run the command above in Terminal.

To make this permanent, you can set the environment variable:

# Add to your shell profile (~/.zshrc or ~/.bash_profile)
export OLLAMA_HOST=0.0.0.0

# Then reload your shell
source ~/.zshrc
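Once the server is running, you can confirm it is reachable from another machine on the network before setting up the app. A minimal sketch in Python, querying Ollama's `/api/tags` endpoint (its model-list API); the IP address below is a placeholder — substitute your Mac's actual address:

```python
import json
import urllib.request

def ollama_base_url(host: str, port: int = 11434) -> str:
    """Build the base URL for an Ollama server (11434 is Ollama's default port)."""
    return f"http://{host}:{port}"

def is_reachable(host: str, port: int = 11434, timeout: float = 3.0) -> bool:
    """Return True if the server answers /api/tags with valid JSON."""
    url = ollama_base_url(host, port) + "/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            json.load(resp)  # valid JSON means Ollama answered
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    # Placeholder address -- replace with your Mac's IP from System Settings.
    print(is_reachable("192.168.1.100"))
```

If this prints `False` from another machine but `True` with `127.0.0.1` on the Mac itself, Ollama is running but not listening on the network — recheck the `OLLAMA_HOST=0.0.0.0` setting above.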

Step 4: Connect from AppShots

Open AppShots on your iPhone or iPad:

  1. Go to Settings → AI Provider
  2. Select Ollama (Local)
  3. Tap Setup Ollama Connection to use the guided wizard, or manually enter your Mac's IP address (e.g., http://192.168.1.100:11434)

💡 Tip: To find your Mac's IP, go to System Settings → Wi-Fi → click your network → look for "IP Address", or run ipconfig getifaddr en0 in Terminal.

Troubleshooting

AppShots can't find my Ollama server

Make sure both devices are on the same Wi-Fi network. Check that Ollama is running with OLLAMA_HOST=0.0.0.0. Try using your Mac's IP address directly instead of auto-discovery.
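A quick way to rule out network problems is a raw TCP check against port 11434 from any machine on the same Wi-Fi. This hypothetical helper (not part of AppShots) just attempts a socket connection:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Try a plain TCP connection; True means something is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address): is_port_open("192.168.1.100", 11434)
```

If the port is closed here, the problem is the Ollama server or firewall, not AppShots.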

Connection times out

Check your Mac's firewall settings. Go to System Settings → Network → Firewall and ensure Ollama is allowed to accept incoming connections. You may also need to allow port 11434.

Model responses are slow

Smaller models like qwen3.5 or llama3.2 run faster, especially on machines with less RAM. Close other heavy applications to free up memory. Apple Silicon Macs with unified memory will perform better.

No models showing in AppShots

Make sure you've pulled at least one model. Run ollama list in Terminal to verify. If the list is empty, run ollama pull qwen3.5 to download a model.
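To check the same model list that AppShots sees, you can query the server's `/api/tags` endpoint, which returns JSON. A small parsing sketch — the sample payload below is made up here, but mirrors the shape Ollama returns:

```python
import json

def model_names(tags_payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response."""
    return [m["name"] for m in tags_payload.get("models", [])]

# Hypothetical sample of what /api/tags returns after pulling one model:
sample = json.loads('{"models": [{"name": "qwen3.5:latest", "size": 2500000000}]}')
print(model_names(sample))  # ['qwen3.5:latest']
```

An empty list here matches an empty `ollama list` — pull a model and the entry should appear in both.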

⚠️ Security Note: Running Ollama with OLLAMA_HOST=0.0.0.0 makes it accessible to any device on your local network. Only use this on trusted networks (e.g., your home Wi-Fi). Do not expose Ollama on public networks.

Need help?

Contact us at support@appshots.app

© 2026 AppShots. All rights reserved.