BonzAI Support
Welcome to the BonzAI support page. Here you’ll find answers to common questions and resources to help you get started.
About BonzAI
BonzAI (BonzAI-online) is a privacy-first AI assistant that runs directly on your iPhone, iPad, or Mac. Chat with powerful AI models locally on your device or connect to external AI services.
Getting Started
- Download BonzAI from the App Store
- Choose your AI model:
  - Use Apple Intelligence (built-in on compatible devices)
  - Download local models for offline use
  - Connect to external APIs (OpenAI, Gemini, Mistral, etc.)
- Start chatting with your AI assistant
Key Features
- 🤖 Multiple AI Providers: Apple Intelligence, local models (llama.cpp), OpenAI, Google Gemini, Mistral, Anthropic Claude
- 🔒 Privacy First: Local inference keeps your data on your device
- 📱 Cross-Platform: Works on iPhone, iPad, and Mac
- 🎨 Modern Interface: Native SwiftUI design
- 🔧 Customizable: Adjust temperature, top-p, context length, and more
- 🧰 Tool Support: Web search, weather, location, and extensible MCP tools
Frequently Asked Questions
How do I add a local AI model?
- Go to Settings → Available Models
- Tap Add Model
- Select a model from the catalog or enter a HuggingFace GGUF model URL
- Wait for the download to complete
- Select the model
Why does the app request local network access?
BonzAI uses local network access for two purposes:
- Internal processing: The llama.cpp inference engine uses a localhost-only HTTP server (127.0.0.1:8080) for AI model processing. This server is not accessible from outside your device.
- External services (optional): If you configure external AI services like Ollama or LM Studio running on your local network, the app needs permission to connect to them.
Your data never leaves your device unless you explicitly choose to use external cloud services.
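For the technically curious, the loopback-only design can be sketched with the kind of request the embedded server handles. This is an illustrative assumption, not an official BonzAI interface: the `/v1/chat/completions` path and field names follow llama.cpp's standard OpenAI-compatible server API, and the loopback address means nothing outside the device can reach it.

```python
import json

# Assumed endpoint: llama.cpp's bundled server exposes an OpenAI-compatible
# chat API. Because it binds to the loopback address (127.0.0.1), only
# processes on the same device can connect to it.
url = "http://127.0.0.1:8080/v1/chat/completions"

# Shape of a chat request body; sampling fields mirror the kinds of
# settings (temperature, etc.) exposed in BonzAI's options.
payload = {
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
    "max_tokens": 128,
}

body = json.dumps(payload)
print(url)
print(body)
```

Nothing in this exchange traverses the network interface; the "local network" permission simply covers this loopback traffic plus any optional Ollama or LM Studio servers you add yourself.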
How do I connect to external AI services?
- Go to Settings → Available Models
- Tap Add Model
- Select your provider (OpenAI, Gemini, etc.)
- Enter your API key
- Select the model
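Behind those steps, cloud providers authenticate each request with the API key you entered; BonzAI attaches it for you on every call. A hedged sketch of the typical header shape (the bearer-token scheme is what OpenAI-style APIs use; the key value here is a placeholder, not a real credential):

```python
# Placeholder key — in practice this is the key you paste into BonzAI,
# obtained from your provider's dashboard.
api_key = "YOUR_API_KEY"

# Typical headers for an OpenAI-style chat API. BonzAI constructs these
# internally once a key is configured; you never handle them directly.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

print(headers["Authorization"])
```

Keys are only ever sent to the provider they belong to, which is why each provider entry in Settings stores its own key.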
Is my data private?
Yes! BonzAI prioritizes your privacy:
- Local models: All processing happens on your device. No data is sent anywhere.
- External services: When you use cloud APIs, data is sent only to the provider you selected. BonzAI does not store or collect your conversations.
- No tracking: We don’t track your usage or collect analytics about your conversations.
See our Privacy Policy for complete details.
Can I use BonzAI offline?
Yes! When using local models (llama.cpp) or Apple Intelligence on compatible devices, BonzAI works completely offline.
What are the system requirements?
- iOS/iPadOS: 26.0 or later
- macOS: 26.0 or later
- Apple Intelligence: Requires compatible devices (iPhone 15 Pro or later, M1 Mac or later)
- Local models: Recommended 4GB+ free storage and 8GB+ RAM
How do I delete my conversations?
- Go to History
- Swipe left on a conversation (left click on macOS)
- Select Delete
You can also delete all app data through iOS Settings → BonzAI → Clear Data.
The app is not responding or crashes. What should I do?
- Force quit the app and restart it
- Check storage: Large models require significant disk space
- Update: Make sure you’re running the latest version from the App Store
- Report the issue: Contact support (see below)
Email Support
For technical issues, feature requests, or general questions:
📧 thomas.leconte.developer@gmail.com
Community
Join our community discussions:
💬 GitHub Discussions
Bug Reports
Report bugs or view known issues:
🐛 GitHub Issues
Additional Resources
Need more help? Don’t hesitate to reach out via email or GitHub Discussions.
© 2025-2026 BonzAI - Thomas LECONTE