Hi everyone,
I have been testing your API with some frameworks, such as Genkit, and I noticed that it is not fully compatible with the OpenAI API. For example, if I try this request (which works with OpenAI):
{
  "model": "meta/llama-3.1-405b-instruct",
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "tell me a joke about hello"
        }
      ]
    }
  ],
  "temperature": 0.7,
  "max_tokens": 100,
  "top_p": 1
}
it fails with this error: {"type":"urn:inference-service:problem-details:bad-request","title":"Bad Request","status":400,"detail":"Inference error"}
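In case it helps anyone hitting the same error, here is a minimal client-side workaround sketch. This is an assumption about the cause, not a confirmed diagnosis: the endpoint may simply not accept the array-of-parts `content` form yet, so flattening each message's content parts into a single string before sending could avoid the 400. The `flatten_content` helper below is hypothetical, not part of any SDK.

```python
def flatten_content(messages):
    """Convert array-of-parts `content` fields into plain strings.

    Assumption: the endpoint accepts `content` as a string but rejects
    the OpenAI content-parts array. Non-text parts are dropped here.
    """
    flattened = []
    for msg in messages:
        content = msg["content"]
        if isinstance(content, list):
            # Keep only the text parts and join them into one string.
            content = "".join(
                part["text"] for part in content if part.get("type") == "text"
            )
        flattened.append({**msg, "content": content})
    return flattened


request = {
    "model": "meta/llama-3.1-405b-instruct",
    "messages": [
        {
            "role": "user",
            "content": [{"type": "text", "text": "tell me a joke about hello"}],
        }
    ],
    "temperature": 0.7,
    "max_tokens": 100,
    "top_p": 1,
}

# Rewrite the messages in place before POSTing the request body.
request["messages"] = flatten_content(request["messages"])
print(request["messages"][0]["content"])
```

With this applied, the user message is sent as `"content": "tell me a joke about hello"` instead of the array form, which is the older-style payload that more endpoints tend to accept.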
Is there an estimate of when your API will be fully compatible with the OpenAI API?
Best