aquan9 / gist:58f4a77414a74703157bf79ea1bf009f
Created April 1, 2025 11:08
Using litellm with chutes.ai, an OpenAI-compatible endpoint from the Bittensor Chutes subnet.
import litellm
import os

response = litellm.completion(
    model="openai/deepseek-ai/DeepSeek-V3-0324",  # the `openai/` prefix tells litellm to route via its OpenAI-compatible provider
    api_key=os.environ["CHUTES_API_KEY"],  # API key for your OpenAI-compatible endpoint; env-var name is illustrative
    api_base="https://llm.chutes.ai/v1",  # base URL of the custom OpenAI-compatible endpoint
    messages=[
        {
            "role": "user",
            "content": "Hello, which model are you?",  # example prompt; the original gist is truncated here
        }
    ],
)
print(response.choices[0].message.content)