NEBIUS_API_KEY environment variable.
Here is a sample request, made with the OpenAI Python SDK, that includes all supported fields:
import os
from openai import OpenAI

# The client reads the API key from the NEBIUS_API_KEY environment variable
client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

completion = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",
    messages=[
        {
            "role": "system",
            "content": "You are a chemistry expert. Add jokes about cats to your responses from time to time."
        },
        {
            "role": "user",
            "content": "Hello!"
        },
        {
            "role": "assistant",
            "content": "Hello! How can I assist you with chemistry today? And did you hear about the cat who became a chemist? She had nine lives, but she only needed one formula!"
        }
    ],
    # Sampling and output controls
    max_tokens=100,
    temperature=1,
    top_p=1,
    n=1,
    stream=False,
    stream_options=None,
    stop=None,
    presence_penalty=0,
    frequency_penalty=0,
    logit_bias=None,
    logprobs=False,
    top_logprobs=None,
    user=None,
    # Fields that are not part of the OpenAI SDK signature, such as top_k and
    # guided_json, are passed through extra_body
    extra_body={
        "top_k": 50,
        "guided_json": {"type": "object", "properties": {...}}  # replace ... with your JSON Schema properties
    },
    response_format={
        "type": "json_object"
    }
)

print(completion.to_json())
Response
{
  "id": "cmpl-*****",
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null,
      "message": {
        "content": "Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat?",
        "role": "assistant",
        "function_call": null,
        "tool_calls": []
      },
      "stop_reason": null
    }
  ],
  "created": 1721397089,
  "model": "meta-llama/Meta-Llama-3.1-70B-Instruct",
  "object": "chat.completion",
  "system_fingerprint": null,
  "usage": {
    "completion_tokens": 26,
    "prompt_tokens": 12,
    "total_tokens": 38
  }
}
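
In Python, the same fields are available as attributes on the ChatCompletion object returned by the SDK, so you rarely need to work with the raw JSON. A minimal sketch, using the completion object from the request above:

# Read the generated message and metadata from the first (and only, since n=1) choice
choice = completion.choices[0]
print(choice.message.role)      # "assistant"
print(choice.message.content)   # the generated reply text
print(choice.finish_reason)     # "stop" when generation ended naturally

# Token usage, useful for cost tracking and context-window budgeting
print(completion.usage.prompt_tokens)
print(completion.usage.completion_tokens)
print(completion.usage.total_tokens)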
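
The guided_json field in extra_body constrains the model's output to a JSON Schema; the schema itself is elided ("...") in the example above. Purely as an illustration, with a hypothetical schema that is not part of the API reference, a filled-in request might look like this:

# Hypothetical schema for illustration only; use whatever JSON Schema your application needs
compound_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "formula": {"type": "string"},
        "molar_mass_g_per_mol": {"type": "number"},
    },
    "required": ["name", "formula"],
}

structured = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",
    messages=[
        {"role": "user", "content": "Describe caffeine as a JSON object."}
    ],
    max_tokens=200,
    extra_body={"guided_json": compound_schema},
    response_format={"type": "json_object"},
)

print(structured.choices[0].message.content)  # JSON text conforming to compound_schema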