Overview
The integration of Google Gemini with OmnifitAI brings cutting-edge generative AI capabilities to your chatbot experience. Gemini, Google’s state-of-the-art multimodal large language model (LLM), enables natural, context-aware, multi-turn conversations that go beyond standard automation. This guide outlines the step-by-step process to integrate Google Gemini with OmnifitAI.
Prerequisites
Before you begin, ensure the following:
- You have admin-level access to your OmnifitAI account.
- You have a valid Google Cloud project with the Gemini API enabled.
- You have access to Gemini API credentials (an API key or OAuth token).
- You have access to OmnifitAI’s LLM Integration settings.
Integration Steps
Step 1: Enable Gemini API on Google Cloud
- Visit the Google Cloud Console.
- Select an existing project or create a new one.
- Go to APIs & Services > Library.
- Search for Gemini API (or Vertex AI API) and click Enable.
Step 2: Generate API Credentials
- Go to APIs & Services > Credentials.
- Click “Create Credentials” → choose API Key.
- Copy the API key and keep it secure.
🔐 Tip: For production-level integrations, use service accounts and OAuth for enhanced security.
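If you do use a plain API key, keep it out of source code and configuration files. A minimal sketch of reading the key from an environment variable at the point of use; the variable name `GEMINI_API_KEY` and the helper function are our naming choices, not an OmnifitAI or Google requirement:

```python
# Read the Gemini API key from an environment variable instead of
# hardcoding it. GEMINI_API_KEY is an illustrative name; use whatever
# your deployment's secret-management convention dictates.
import os

def load_gemini_key(env_var: str = "GEMINI_API_KEY") -> str:
    """Return the API key from the environment, or raise if unset."""
    key = os.environ.get(env_var, "").strip()
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; create a key under "
            "APIs & Services > Credentials in the Google Cloud Console"
        )
    return key
```

For production deployments, a dedicated secret manager (or the service-account/OAuth route mentioned in the tip above) is preferable to environment variables alone.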
Step 3: Add Gemini in OmnifitAI
- Log in to your OmnifitAI dashboard.
- Select “Third-Party Apps Integrations”.
- Search for “Google Gemini”.
- Click Connect and enter the API key.
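Before pasting the key into OmnifitAI, you can sanity-check that it is valid. A minimal sketch, assuming Google’s public Gemini REST endpoint (`generativelanguage.googleapis.com`) and its models-list route; the `models_url` helper is ours, not part of any SDK, and the path may differ if your project uses Vertex AI instead:

```python
# Sanity-check a Gemini API key by listing the models it can access.
# Assumes the public Gemini REST endpoint; adjust the base URL if your
# project goes through Vertex AI.
import urllib.parse
import urllib.request

GEMINI_BASE = "https://generativelanguage.googleapis.com/v1beta"

def models_url(api_key: str) -> str:
    """Build the URL that lists models visible to this key."""
    return f"{GEMINI_BASE}/models?key={urllib.parse.quote(api_key)}"

def check_key(api_key: str) -> bool:
    """Return True if the key can list models (HTTP 200)."""
    try:
        with urllib.request.urlopen(models_url(api_key), timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False
```

A `False` result typically means the key is wrong, restricted to other APIs, or the Gemini/Vertex AI API is not enabled in the project.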
FAQs
Where can I get the API key for Gemini integration?
You can generate your Gemini API key from the Google Cloud Console:
- Go to APIs & Services > Credentials
- Click “Create Credentials” > API Key
- Ensure that the Gemini (Vertex AI) API is enabled in the selected project.
Can I use both OpenAI and Gemini models in one OmnifitAI account?
No. A bot can be integrated with only one LLM model at a time.
What happens if my API key becomes invalid or my quota is exceeded?
If the key becomes invalid or quota limits are hit:
- The bot will fall back to a default message.
- An error notification will appear in the integration logs under Settings > LLM Integration.
- You will receive a platform alert to take corrective action.
