Gemini Pro: Free Python API — Embeddings Generation & Text Chat

Abhishek Maheshwarappa
3 min read · Feb 15, 2024

In this article, we will explore embedding extraction from Gemini Pro for words, phrases, and sentences. We will also use Gemini Pro to have text chats.

My previous article discusses how to configure and set up the API; please refer to it here for the initial setup.

For those who prefer a hands-on approach, a Google Colab notebook is available for experimentation.

Pic credit — Google

Embeddings

Embeddings are dense vector representations of real-world objects, such as words, images, and text, designed for interpretation by machine learning models. In the context of natural language processing, embeddings can encapsulate the semantic meanings of words or phrases, especially with advanced language models that are capable of understanding nuanced text. Traditional embeddings, like word vectors, often focus on capturing syntactic relationships and may not always represent deeper semantic meanings.

These embeddings unlock many applications, such as:

  1. Semantic Search
  2. Clustering
  3. Recommendations
  4. Classification

And many more…!!
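To make the semantic-search use case concrete, here is a minimal, self-contained sketch. It uses tiny hand-made vectors in place of real Gemini embeddings (actual embedding vectors have hundreds of dimensions); the words and numbers are purely illustrative.

```python
import math

# Toy 3-dimensional vectors standing in for real Gemini embeddings.
corpus = {
    "cat": [0.9, 0.1, 0.0],
    "dog": [0.8, 0.2, 0.1],
    "car": [0.1, 0.9, 0.3],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, corpus):
    """Rank corpus entries by similarity to the query embedding."""
    scores = {word: cosine_similarity(query_vec, vec) for word, vec in corpus.items()}
    return sorted(scores, key=scores.get, reverse=True)

# A query embedding close to "cat"/"dog" ranks the pets above "car".
print(semantic_search([0.85, 0.15, 0.05], corpus))  # → ['cat', 'dog', 'car']
```

With real embeddings, the flow is the same: embed the query, embed the corpus once up front, and rank by cosine similarity.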

Let’s see how to access embeddings from Gemini Pro. For the initial setup, please refer to my previous article.

Once the setup is done we can start with the embedding extraction.

Gemini provides the embedding-001 model for generating embeddings. Use the embed_content method, passing a task_type that describes how the embedding will be used:

import google.generativeai as genai

result = genai.embed_content(
    model="models/embedding-001",
    content="What is medium?",
    task_type="retrieval_document",
    title="Embedding of single string")

This method can be used to extract embeddings from a string, a list of strings, or a document.
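For the list-of-strings case, the same embed_content call accepts a list and returns one vector per entry. A sketch, assuming genai.configure(api_key=...) has already been run as described in the setup article (the document strings and the helper name embed_documents are my own, for illustration):

```python
import google.generativeai as genai

def embed_documents(documents, title="Batch embedding example"):
    """Embed a list of strings in one call; returns one vector per string."""
    result = genai.embed_content(
        model="models/embedding-001",
        content=documents,
        task_type="retrieval_document",
        title=title,
    )
    # For a list input, result["embedding"] is a list of vectors,
    # aligned with the input order.
    return result["embedding"]

# Requires a configured API key to actually run:
# vectors = embed_documents(["What is Medium?", "How do embeddings work?"])
# for doc_vec in vectors:
#     print(len(doc_vec))  # dimensionality of each embedding
```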

Please use the Google Colab notebook to try this method.

Chat Conversation

Gemini can also chat with users as a conversational assistant. The following code initializes a chat session using the Gemini Pro model:

model = genai.GenerativeModel('gemini-pro')
chat = model.start_chat(history=[])
chat
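Once the session exists, send_message sends a user turn and returns the model's reply; earlier turns are kept in chat.history, so follow-up messages have context. A sketch assuming the API key is already configured (the prompts are illustrative):

```python
import google.generativeai as genai

# Assumes genai.configure(api_key=...) from the setup article.
model = genai.GenerativeModel('gemini-pro')
chat = model.start_chat(history=[])

# Each call appends the user turn and the model reply to chat.history.
response = chat.send_message("In one sentence, what is an embedding?")
print(response.text)

# The follow-up sees the earlier exchange as context.
response = chat.send_message("Give an example use case.")
print(response.text)

# Inspect the accumulated conversation.
for message in chat.history:
    print(message.role, ":", message.parts[0].text)
```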

To try it yourself, use the Google Colab notebook linked here.

References

  1. https://blog.google/technology/ai/google-gemini-ai/
  2. https://ai.google.dev/models/gemini

Stay on the cutting-edge of AI! 🌟 Follow me on Medium, connect on LinkedIn, and explore my GitHub for insightful AI projects about the latest trends in AI technologies and models. Dive into the world of AI with me and discover new horizons! 📚💻
