Gemini Support in PhaseLLM v0.0.21
PhaseLLM now supports Gemini and other Vertex AI models from Google. This post explains how to set up access and how to call the API.
Code Sample
Gemini calls go through the "VertexAIWrapper" class. This wrapper works with PhaseLLM's "ChatBot" class and all the other tools we've built.
from phasellm.llms import VertexAIWrapper, ChatBot
# Set project you're using via Vertex. Can be done outside of Python, too.
from google.cloud import aiplatform
aiplatform.init(project="your-project-name")
# Standard PhaseLLM calls.
v = VertexAIWrapper('gemini-1.0-pro')
cb = ChatBot(v)
cb.chat("Hi, how are you?")
Setting Up GCP
Unfortunately, Google's authentication requirements are not as simple as providing an API key. You'll need to complete the following steps:
1. Install the Google Cloud SDK. This assumes you already have a Google Cloud account.
2. Authenticate with gcloud. This enables you to make calls from your terminal/CLI and from Python.
3. If you haven't already done so, set up GCP billing and a project that has permissions for Vertex AI calls.
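The steps above can be sketched as a short terminal session. This is a minimal example, not an exhaustive setup guide; "your-project-name" is a placeholder for your own GCP project ID, and you may need additional IAM permissions depending on your organization's settings.

```shell
# Authenticate your user account so both the CLI and Python client
# libraries (via Application Default Credentials) can make calls.
gcloud auth application-default login

# Point gcloud at the project you'll use for Vertex AI.
# "your-project-name" is a placeholder -- substitute your project ID.
gcloud config set project your-project-name

# Enable the Vertex AI API for that project (requires billing to be set up).
gcloud services enable aiplatform.googleapis.com
```

Once this is done, the `aiplatform.init(...)` call in the Python sample above will pick up your credentials automatically.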
Questions? Cool Use Cases?
As always, reach out to hello --at-- phaseai --dot-- com if you have questions. If you're exploring interesting use cases with GCP, let us know as well!