How can we help?

Need help with LocalPort? We're here to ensure your local AI experience is smooth and secure.

Common Questions

Is my data really safe?

Yes. LocalPort runs all AI processing on-device using local ML models. Your documents never leave your device or touch our servers.

Which AI models do you use?

We use Llama 3.2 (1B or 3B, in GGUF or ExecuTorch format), selected to match your device's capabilities.

Can I use it offline?

Yes! Core features such as OCR, chat, and search work entirely without an internet connection.

Contact Support

Can't find what you're looking for? Reach out to our team directly.

Email us at support@localport.ai

Developer Resources

Building an integration for LocalPort? Check out our technical documentation.