Groq provides low-latency, lightning-fast inference for AI models. Arize supports instrumenting Groq API calls, including role types such as system, user, and assistant messages, as well as tool use. To get started, create a free GroqCloud account and generate a Groq API key.
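For auto-instrumentation to pick up Groq calls, the OpenInference Groq instrumentor must be installed alongside the Phoenix OTel and Groq SDKs. A sketch of the install, assuming the standard package names on PyPI:

```shell
pip install arize-phoenix-otel openinference-instrumentation-groq groq
```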
Connect to your Phoenix instance using the register function.
```python
from phoenix.otel import register

# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
```
Here is a simple Groq application that is now instrumented:
```python
import os

from groq import Groq

client = Groq(
    # This is the default and can be omitted
    api_key=os.environ.get("GROQ_API_KEY"),
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="mixtral-8x7b-32768",
)
print(chat_completion.choices[0].message.content)
```
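The instrumentor also captures system messages and tool definitions. A minimal sketch of what such a request payload looks like — the `get_weather` tool here is a hypothetical example, not part of the Groq API:

```python
# System + user messages, plus a tool definition, in the OpenAI-compatible
# chat format that Groq's chat completions endpoint accepts.
# `get_weather` is a hypothetical example tool.
messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "What's the weather in Paris right now?"},
]

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# These would be passed to the same call as above, e.g.:
# client.chat.completions.create(messages=messages, tools=tools, model=...)
```

Spans for the resulting completion, including any tool calls the model makes, appear in your Phoenix project automatically.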