Three months after its beta launch, the Vertex AI in Firebase SDK is now ready for production, says Google engineer Thomas Ezan, who explores three dimensions essential to a successful production deployment: abuse prevention, remote configuration, and responsible AI use.
The Vertex AI in Firebase SDK aims to facilitate the integration of Gemini AI into Android and iOS apps by providing idiomatic APIs, security against unauthorized use, and integration with other Firebase services. By integrating Gemini AI, developers can build AI features into their apps, including AI chat experiences, AI-powered optimization and automation, and more.
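To give a sense of how lightweight the integration is, here is a minimal Kotlin sketch of a single-shot Gemini call through the SDK; the model name and the `summarize` helper are illustrative assumptions, not part of the article:

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.vertexai.vertexAI

// Hypothetical helper: one-shot text generation via the Vertex AI in Firebase SDK.
suspend fun summarize(text: String): String? {
    // The model name is an example; pick the Gemini model your app targets.
    val model = Firebase.vertexAI.generativeModel("gemini-1.5-flash")
    // generateContent is a suspend function returning the model's response.
    val response = model.generateContent("Summarize in one sentence: $text")
    return response.text
}
```

Because the SDK exposes idiomatic suspend functions, this slots directly into existing coroutine-based Android code with no separate backend.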
A few apps are already using the SDK, explains Ezan, including Meal Planner, which creates original meal plans using AI; the journal app Life, which aims to be an AI diary assistant able to convert conversations into journal entries; and hiking app HiiKER.
Although using an AI service may seem easy, it comes with a few critical responsibilities: implementing robust security measures to prevent unauthorized access and misuse, preparing for the rapid evolution of Gemini models through remote configuration, and using AI responsibly.
To ensure your app is protected against unauthorized access and misuse, Google provides Firebase App Check:
Firebase App Check helps protect backend resources (like Vertex AI in Firebase, Cloud Functions for Firebase, or even your own custom backend) from abuse. It does this by attesting that incoming traffic is coming from your authentic app running on an authentic and untampered Android device.
The App Check server verifies the attestation using parameters registered with the app and then returns a token with an expiration time. The client caches the token and attaches it to subsequent requests; requests that arrive without a valid attestation token are rejected.
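On Android, enabling this protection amounts to installing an App Check provider before any Vertex AI calls are made. A minimal sketch using the Play Integrity provider (the standard provider for production Android apps):

```kotlin
import android.content.Context
import com.google.firebase.Firebase
import com.google.firebase.FirebaseApp
import com.google.firebase.appcheck.appCheck
import com.google.firebase.appcheck.playintegrity.PlayIntegrityAppCheckProviderFactory

// Typically called from Application.onCreate(), before the first Vertex AI request.
fun initAppCheck(context: Context) {
    FirebaseApp.initializeApp(context)
    Firebase.appCheck.installAppCheckProviderFactory(
        // Play Integrity attests that traffic comes from your authentic app on an
        // untampered device; App Check then issues the short-lived token described above.
        PlayIntegrityAppCheckProviderFactory.getInstance()
    )
}
```

Once the provider is installed, the SDK attaches and refreshes the attestation token automatically; no per-request code is needed.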
Remote configuration is useful for handling model evolution as well as other parameters that may need to be updated at any time, such as maximum tokens, temperature, safety settings, system instructions, and prompt data. Other important cases where you will want to parameterize your app's behavior include setting the model location closer to your users, A/B testing system prompts and other model parameters, and enabling or disabling AI-related features.
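A sketch of this pattern using Firebase Remote Config to drive the model name and temperature; the parameter keys and default values here are assumptions for illustration:

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.remoteconfig.remoteConfig
import com.google.firebase.vertexai.vertexAI
import com.google.firebase.vertexai.type.generationConfig
import kotlinx.coroutines.tasks.await

// Hypothetical Remote Config keys; define whatever parameters your app needs.
suspend fun buildModelFromRemoteConfig() = Firebase.remoteConfig.let { rc ->
    // In-app defaults guard against the first fetch failing.
    rc.setDefaultsAsync(mapOf("model_name" to "gemini-1.5-flash", "temperature" to 0.7))
    // Pull the latest values published from the Firebase console.
    rc.fetchAndActivate().await()
    Firebase.vertexai.generativeModel(
        modelName = rc.getString("model_name"),
        generationConfig = generationConfig {
            temperature = rc.getDouble("temperature").toFloat()
        }
    )
}
```

With this in place, switching to a newer Gemini model or tuning generation parameters becomes a console change rather than an app release.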
Another key practice highlighted by Ezan is user feedback collection to evaluate user impact:
As you roll out your AI-enabled feature to production, it’s critical to build feedback mechanisms into your product and allow users to easily signal whether the AI output was helpful, accurate, or relevant.
Examples of this include thumbs-up and thumbs-down buttons and detailed feedback forms in your app UI.
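The backing logic for such signals can be as simple as a tally per AI feature. A hypothetical in-memory sketch (in practice you would forward these events to your analytics backend):

```kotlin
// Hypothetical tally of thumbs-up/thumbs-down feedback per AI feature.
class FeedbackTally {
    // feature name -> (thumbs-up count, thumbs-down count)
    private val counts = mutableMapOf<String, Pair<Int, Int>>()

    fun record(feature: String, helpful: Boolean) {
        val (up, down) = counts.getOrDefault(feature, 0 to 0)
        counts[feature] = if (helpful) (up + 1) to down else up to (down + 1)
    }

    // Fraction of positive signals, or null if no feedback has been collected yet.
    fun helpfulnessRate(feature: String): Double? {
        val (up, down) = counts[feature] ?: return null
        val total = up + down
        return if (total == 0) null else up.toDouble() / total
    }
}
```

Tracking the rate per feature makes it easy to spot which AI outputs users find unhelpful and to compare variants during A/B tests.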
Last but not least, says Ezan, there is responsibility: you should be transparent about AI-based features, ensure your users' data is not used by Google to train its models, and highlight the possibility of unexpected behavior.
All in all, the Vertex AI in Firebase SDK provides an easy path to creating AI-powered mobile apps without requiring developers to deal with the complexity of Google Cloud or switch to a different programming language to implement an AI backend. However, the SDK does not support more advanced use cases, such as streaming, and has a simplified API that is close to direct LLM calls, which makes it less flexible out of the box for building agents, chatbots, or automation. If you need streaming or more complex interactions, you can consider Google Genkit, which additionally offers a free tier for testing purposes.