Prerequisites

  • A Nebius API key. Sign up for free at Nebius AI Studio (studio.nebius.com)

Setup

If running on Google Colab

Add NEBIUS_API_KEY to Colab Secrets (the key icon in the left sidebar).

If running locally

Create a .env file with NEBIUS_API_KEY:
NEBIUS_API_KEY=your_api_key_goes_here

Install Dependencies

First, detect whether the notebook is running on Google Colab:
import os

# Colab sets COLAB_RELEASE_TAG in the environment, so use it to detect the runtime
if os.getenv("COLAB_RELEASE_TAG"):
    print("Running on Colab")
    RUNNING_ON_COLAB = True
else:
    print("NOT running on Colab")
    RUNNING_ON_COLAB = False

If running on Google Colab, install the dependencies in the notebook:
!pip install -q openai python-dotenv pydantic
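
If running locally, install the same packages into your own environment. A minimal sketch, assuming a standard pip setup (run this in your terminal or virtual environment, not in the notebook):
pip install openai python-dotenv pydantic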

Load Configuration

import os

## Recommended way of getting the configuration
if RUNNING_ON_COLAB:
    from google.colab import userdata
    NEBIUS_API_KEY = userdata.get('NEBIUS_API_KEY')
else:
    from dotenv import load_dotenv
    load_dotenv()
    NEBIUS_API_KEY = os.getenv('NEBIUS_API_KEY')

## Quick hack (not recommended) - you can hardcode the key here
# NEBIUS_API_KEY = "your_key_here"

if NEBIUS_API_KEY:
    print('✅ NEBIUS_API_KEY found')
    os.environ['NEBIUS_API_KEY'] = NEBIUS_API_KEY
else:
    raise RuntimeError('❌ NEBIUS_API_KEY NOT found')

Pick a Model

We will pick a model that supports function calling.
  1. Go to the Models tab at studio.nebius.com
  2. Select text-to-text models
  3. Apply the function calling filter
  4. Copy the model name, for example Qwen/Qwen3-30B-A3B (you can verify it programmatically, as shown after the list below)
Recommended models:
  • Qwen3 family:
    • Qwen/Qwen3-30B-A3B
    • Qwen/Qwen3-235B-A22B
  • DeepSeek family:
    • deepseek-ai/DeepSeek-R1-0528
  • Llama family:
    • meta-llama/Llama-3.3-70B-Instruct
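
As a quick sanity check, you can list the models available to your key and confirm the one you picked is there. A minimal sketch, assuming NEBIUS_API_KEY has already been loaded as above and that the endpoint exposes the standard OpenAI-compatible models list:
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=NEBIUS_API_KEY,
)

# List the model ids exposed by the API and check for the one we picked
model_ids = [m.id for m in client.models.list()]
print("Qwen/Qwen3-30B-A3B available:", "Qwen/Qwen3-30B-A3B" in model_ids)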

Define Function Call

Here we will use Pydantic to define the parameter schema:
from pydantic import BaseModel, Field
from typing import Literal

class GetCurrentWeatherParams(BaseModel):
    city: str = Field(..., description="The city to find the weather for, e.g. 'San Francisco'")
    unit: Literal['celsius', 'fahrenheit'] = Field(..., description="The unit to fetch the temperature in")

# Simulate the tool call.
# In a real application you would call a weather API here; this stub ignores its
# arguments and returns a canned answer.
def get_current_weather(city: str, unit: str):
    return ("The weather in San Francisco is 72 degrees fahrenheit. "
            "It is sunny, with highs in the 80's.")

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": GetCurrentWeatherParams.model_json_schema()
    }
}]

available_tools = {"get_current_weather": get_current_weather}
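
To see exactly what the model receives, you can print the JSON Schema that Pydantic generates; the same model class can also validate arguments before you call the tool. A minimal sketch using only the names defined above:
import json

# Inspect the JSON Schema that is sent to the model as the tool's parameters
print(json.dumps(GetCurrentWeatherParams.model_json_schema(), indent=2))

# Validate a sample set of arguments, the way you might check a model's tool call
params = GetCurrentWeatherParams(city="San Francisco", unit="fahrenheit")
print(params.city, params.unit)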

Tool Calling

%%time

import json

from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=NEBIUS_API_KEY,
)

messages = [
    {
        "role": "user",
        "content": "Can you tell me what the temperature will be in San Francisco?",
    },
]

# Ask the model, forcing it to call get_current_weather
chat_completion = client.chat.completions.create(
    model="Qwen/Qwen3-30B-A3B",
    messages=messages,
    tools=tools,
    tool_choice={
        "type": "function",
        "function": {
            "name": "get_current_weather"
        }
    },
)

# Record the assistant turn that contains the tool calls
messages.append({
    "role": "assistant",
    "tool_calls": chat_completion.choices[0].message.tool_calls,
})

# Execute each requested tool call and append its result as a "tool" message
completion_tool_calls = chat_completion.choices[0].message.tool_calls
for call in completion_tool_calls:
    tool_to_call = available_tools[call.function.name]
    args = json.loads(call.function.arguments)
    result = tool_to_call(**args)
    print(result)
    messages.append({
        "role": "tool",
        "content": result,
        "tool_call_id": call.id,
        "name": call.function.name,
    })

# Inspect the full conversation so far
messages
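
Finally, you can send the updated conversation back to the model so it turns the tool output into a natural-language answer. A minimal sketch, assuming the cells above have run:
# Send the tool result back so the model can compose the final answer
final_completion = client.chat.completions.create(
    model="Qwen/Qwen3-30B-A3B",
    messages=messages,
)
print(final_completion.choices[0].message.content)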