Hey, thanks for the access to write about this problem. I want to use meta/llama-4-scout-17b-16e-instruct through the API, but it returns this:
Status: 404
Response: {"status":404,"title":"Not Found","detail":"Function 'b6bb6e01-780e-4ba0-a5b8-379f00ed9b1c': Not found for account 'cL_otl9NJS8fDxZdJoXnjmq7srGCuM6t2LXdrdeBC_w'"}
This is the code:
import requests

API_KEY = "your-api"

# Correct endpoint for multimodal models
invoke_url = "https://integrate.api.nvidia.com/v1/chat/completions"

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Accept": "application/json",
    "Content-Type": "application/json"
}

payload = {
    "model": "meta/llama-4-scout-17b-16e-instruct",
    "messages": [
        {"role": "user", "content": "Hello, what can you do?"}
    ],
    "max_tokens": 512,
    "stream": False
}

response = requests.post(invoke_url, headers=headers, json=payload)
print("Status:", response.status_code)
print("Response:", response.text)
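For reference, my understanding is that this endpoint is OpenAI-compatible, so the same request should also be expressible through the openai Python client. Here is a sketch of that variant (untested on my side; the model id and base_url are the same as above):

from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",
    api_key="your-api"
)

# Same model and prompt as the requests-based call above
completion = client.chat.completions.create(
    model="meta/llama-4-scout-17b-16e-instruct",
    messages=[{"role": "user", "content": "Hello, what can you do?"}],
    max_tokens=512,
    stream=False
)
print(completion.choices[0].message.content)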
I tested with this whether the model is available or not, and I can see it's there:
import requests

response = requests.get(
    "https://integrate.api.nvidia.com/v1/models",
    headers={"Authorization": "Bearer your-api"}
)
print(response.json())
You can see the response of the script above in the image below:

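In case it helps, here is a quick sketch that checks programmatically whether the exact model id appears in that /v1/models listing (assuming the response follows the usual OpenAI-style {"data": [...]} shape):

import requests

headers = {"Authorization": "Bearer your-api"}
resp = requests.get("https://integrate.api.nvidia.com/v1/models", headers=headers)

# Collect the ids returned by the listing and look for the one used in the chat request
model_ids = [m["id"] for m in resp.json().get("data", [])]
print("meta/llama-4-scout-17b-16e-instruct" in model_ids)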