Why run LFMs on mobile?
Running language models directly on mobile devices offers several key advantages:

- Privacy: Keep user data on-device without sending sensitive information to the cloud
- Offline capability: Enable AI features that work without an internet connection
- Low latency: Eliminate network round-trips for faster response times
- Cost efficiency: Reduce or eliminate API costs for inference
- Always available: Provide consistent AI experiences regardless of connectivity
Supported platforms
The LEAP Edge SDK supports native development on both major mobile platforms:

- Android: Native Kotlin integration with examples for chat, audio processing, vision models, and structured output
- iOS: Native Swift/SwiftUI integration with examples for chat, text generation, audio, and structured data
Android deployment
Deploy LFM models on Android devices with native Kotlin integration
iOS deployment
Deploy LFM models on iOS devices with native Swift integration
Common use cases
The LEAP Edge SDK examples demonstrate a variety of real-world applications:

- Chat interfaces: Build conversational AI apps with real-time streaming and persistent conversation history
- Audio processing: Process speech input and generate audio output for voice-enabled features
- Vision models: Integrate vision-language models for image understanding
- Structured output: Generate formatted data like recipes, summaries, or structured responses
- Agent frameworks: Build AI agents with tool calling and multi-step reasoning capabilities
- Content generation: Create marketing copy, summaries, and other text content on-device
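To make the chat-interface use case concrete, here is a minimal Kotlin sketch of what on-device streaming generation could look like. It is illustrative only: the names `LeapClient`, `loadModel`, `createConversation`, and `generateResponse` are assumptions for the sake of the example, not the SDK's confirmed API; consult the Android deployment guide for the actual calls.

```kotlin
// Hypothetical sketch -- all SDK names below are assumed, not confirmed.
import kotlinx.coroutines.flow.collect

suspend fun runChat() {
    // Load a model bundled with the app; inference happens entirely
    // on-device, so no network connection or API key is required.
    val model = LeapClient.loadModel("models/lfm-mobile.bundle") // assumed API

    // A conversation object keeps message history across turns.
    val conversation = model.createConversation()

    // Stream tokens as they are generated so the UI can render
    // the response incrementally instead of waiting for completion.
    conversation.generateResponse("Summarize today's notes")
        .collect { chunk ->
            appendToChatView(chunk.text) // hypothetical UI helper
        }
}
```

Streaming the response as a flow of chunks is what gives the low-latency, always-available feel described above: the first tokens appear almost immediately, with no network round-trip in the loop.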