NEBIUS_API_KEY environment variable.
Create a batch: POST
Request:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

client.batches.create(
    input_file_id=batch_requests.id,  # file object of the previously uploaded .jsonl input file
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={
        "description": "Asynchronous job"
    }
)
Response:
{
  "id": "batch_123",
  "object": "batch",
  "endpoint": "/v1/chat/completions",
  "errors": null,
  "input_file_id": "file-123",
  "completion_window": "24h",
  "status": "validating",
  "output_file_id": null,
  "error_file_id": null,
  "created_at": 1730723835,
  "in_progress_at": null,
  "expires_at": 1730810235,
  "completed_at": null,
  "failed_at": null,
  "expired_at": null,
  "request_counts": {
    "total": 0,
    "completed": 0,
    "failed": 0
  },
  "metadata": {
    "customer_id": "user_123",
    "batch_description": "Asynchronous job"
  }
}
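The value passed as input_file_id refers to a previously uploaded .jsonl file with one request per line. As a minimal sketch of the full flow, the file can be uploaded through the OpenAI-compatible Files API and the returned batch id saved for later status checks; the purpose="batch" value and the local file name batch_requests.jsonl are assumptions for illustration:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

# Upload the .jsonl file with batch requests (the purpose value follows the
# OpenAI-compatible Files API convention and is assumed here).
batch_requests = client.files.create(
    file=open("batch_requests.jsonl", "rb"),
    purpose="batch",
)

# Create the batch and keep its id for later status checks.
batch = client.batches.create(
    input_file_id=batch_requests.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)
print(batch.id)  # e.g. "batch_123"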
Get batch info: GET
Request:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

client.batches.retrieve("batch_123")
Response:
{
  "id": "batch_123",
  "object": "batch",
  "endpoint": "/v1/chat/completions",
  "errors": null,
  "input_file_id": "file-123",
  "completion_window": "24h",
  "status": "validating",
  "output_file_id": null,
  "error_file_id": null,
  "created_at": 1730723835,
  "in_progress_at": null,
  "expires_at": 1730810235,
  "completed_at": null,
  "failed_at": null,
  "expired_at": null,
  "request_counts": {
    "total": 0,
    "completed": 0,
    "failed": 0
  },
  "metadata": {
    "customer_id": "user_123",
    "batch_description": "Asynchronous job"
  }
}
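Because batch processing is asynchronous, the status has to be re-checked until it reaches a terminal value. A minimal polling sketch, assuming the terminal statuses implied by the timestamp fields above (completed, failed, expired, cancelled) and an arbitrary 60-second poll interval:
import os
import time
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

# Terminal statuses inferred from the completed_at/failed_at/expired_at/cancelled_at fields.
terminal_statuses = {"completed", "failed", "expired", "cancelled"}

batch = client.batches.retrieve("batch_123")
while batch.status not in terminal_statuses:
    time.sleep(60)  # arbitrary poll interval
    batch = client.batches.retrieve("batch_123")

print(batch.status, batch.request_counts, batch.output_file_id, batch.error_file_id)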
List batches: GET
Request:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

client.batches.list()
Response:
{
  "object": "list",
  "data": [
    {
      "id": "batch_123",
      "object": "batch",
      "endpoint": "/v1/chat/completions",
      "errors": null,
      "input_file_id": "file-123",
      "completion_window": "24h",
      "status": "completed",
      "output_file_id": "file-re45SV",
      "error_file_id": "file-TrOWL1",
      "created_at": 1730723835,
      "in_progress_at": 1730723839,
      "expires_at": 1730810235,
      "finalizing_at": 1730797684,
      "completed_at": 1730797702,
      "failed_at": null,
      "expired_at": null,
      "cancelling_at": null,
      "cancelled_at": null,
      "request_counts": {
        "total": 120,
        "completed": 117,
        "failed": 3
      },
      "metadata": {
        "customer_id": "user_123456789",
        "batch_description": "Asynchronous job"
      }
    },
    { ... }
  ],
  "first_id": "batch_123",
  "last_id": "batch_987",
  "has_more": true
}
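The has_more, first_id and last_id fields indicate that the listing is cursor-paginated. As a sketch, the OpenAI Python SDK can follow that cursor automatically when you iterate over the list result, and the output file of a completed batch can then be fetched through the Files API; the limit parameter and the files.content call follow the OpenAI-compatible convention and are assumptions here:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

# Iterating over the list result lets the SDK follow the pagination cursor.
for batch in client.batches.list(limit=20):
    print(batch.id, batch.status,
          f"{batch.request_counts.completed}/{batch.request_counts.total} completed")
    if batch.status == "completed" and batch.output_file_id:
        # Download the per-request results (.jsonl) of a finished batch.
        output = client.files.content(batch.output_file_id)
        with open(f"{batch.id}_output.jsonl", "w") as f:
            f.write(output.text)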
Cancel a batch: POST
Request:
client.batches.cancel("batch_123")
Response:
{
  "id": "batch_123",
  "object": "batch",
  "endpoint": "/v1/chat/completions",
  "errors": null,
  "input_file_id": "file-123",
  "completion_window": "24h",
  "status": "cancelling",
  "output_file_id": null,
  "error_file_id": null,
  "created_at": 1730723835,
  "in_progress_at": null,
  "expires_at": 1730810235,
  "finalizing_at": null,
  "completed_at": null,
  "failed_at": null,
  "expired_at": null,
  "cancelling_at": 1730735840,
  "cancelled_at": null,
  "request_counts": {
    "total": 100,
    "completed": 20,
    "failed": 0
  },
  "metadata": {
    "customer_id": "user_123",
    "batch_description": "Asynchronous job"
  }
}
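Cancellation is also asynchronous: the response above reports the batch as cancelling, with cancelled_at still unset. A short sketch that re-checks the batch until it leaves that state (the final cancelled status is inferred from the cancelled_at field and is an assumption here):
import os
import time
from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.environ.get("NEBIUS_API_KEY"),
)

batch = client.batches.retrieve("batch_123")
while batch.status == "cancelling":
    time.sleep(10)  # arbitrary poll interval
    batch = client.batches.retrieve("batch_123")

print(batch.status, batch.cancelled_at)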