
Hands-On: Mobile AI with Gemma - iOS, Android

In our previous post, Mobile On-device AI: Smarter Faster Private Apps, we explored the fundamentals of running AI locally on mobile devices. Now, it’s time to get hands-on and see this technology in action! This practical guide walks you through implementing mobile on-device AI using Google’s powerful Gemma model family, including the cutting-edge Gemma 3n. You’ll learn to deploy these models across iOS, Android, and web platforms using industry-standard frameworks.
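As a small teaser of what such a deployment looks like, here is a minimal Android sketch using MediaPipe's LLM Inference API, one of the industry-standard frameworks for running Gemma on-device. The model path is an assumption for illustration - you first need to download a Gemma task bundle to the device yourself:

```kotlin
// Minimal sketch: running a Gemma model on Android with MediaPipe's LLM Inference API.
// Assumes the com.google.mediapipe:tasks-genai dependency is on the classpath and
// a Gemma .task bundle has already been pushed to the device (path is hypothetical).
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

fun runGemma(context: Context): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma.task") // hypothetical on-device path
        .setMaxTokens(512)
        .build()

    // Load the model and run a single synchronous generation entirely on-device.
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse("Summarize the benefits of on-device AI in one sentence.")
}
```

The guide below walks through the full setup for each platform; this snippet is only meant to show how little glue code the inference call itself requires.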

Mobile On-device AI: Smarter Faster Private Apps

While cloud computing drives many AI breakthroughs, a parallel revolution is happening right in our hands - running LLMs locally on mobile devices. This emerging field, known as Mobile On-device AI, enables us to build more private, faster, and smarter app experiences - especially as mobile devices become increasingly powerful. As a developer passionate about AI and mobile, I am fascinated by the convergence of these two worlds and the possibilities it brings.

Edge AI Essentials

Every day we’re seeing fantastic advancements in AI, thanks to more data and more powerful computers. This may make it seem like the future of AI is all about getting even more data and bigger computers. But I believe a critical and rapidly evolving piece of the puzzle is bringing the intelligence of Artificial Intelligence onto the devices where the data originates (e.g., our phones, cameras, and IoT devices) and doing the “smarts” using their own computing capabilities.