Hi all,
while trying to run one of the examples using OCI Generative AI as the provider, I get stuck at this error:
nemoguardrails.actions.llm.utils.LLMCallException: LLM Call Exception: model_id is required to derive the provider, please provide the provider explicitly or specify the model_id to derive the provider.
I can’t figure out what’s wrong. Here is my config file:
models:
  - type: main
    engine: oci_generative_ai
    model: cohere.command-r-plus-08-2024
    model_id: cohere.command-r-plus-08-2024
    endpoint: https://inference.generativeai.us-chicago-1.oci.oraclecloud.com
    compartment_id: ocid1.compartment.YYYYYYYYY
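For what it’s worth, one thing I’m unsure about: whether NeMo Guardrails passes top-level keys like model_id straight through to the OCI client, or whether provider-specific arguments need to be nested under a parameters: key. This is only a sketch of that second layout, assuming the underlying OCI integration expects model_id, service_endpoint, compartment_id, and an explicit provider (the field names here are my guess, not confirmed from the docs):

```yaml
models:
  - type: main
    engine: oci_generative_ai
    model: cohere.command-r-plus-08-2024
    parameters:
      # Assumed argument names for the OCI client; may differ in your version
      model_id: cohere.command-r-plus-08-2024
      service_endpoint: https://inference.generativeai.us-chicago-1.oci.oraclecloud.com
      compartment_id: ocid1.compartment.YYYYYYYYY
      provider: cohere
```

If someone can confirm which layout the oci_generative_ai engine actually expects, that would help.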
Here’s the program I am trying to run:
from nemoguardrails import RailsConfig
from nemoguardrails import LLMRails
config = RailsConfig.from_path("./config")
rails = LLMRails(config)
response = rails.generate(messages=[{
    "role": "user",
    "content": "Hello!"
}])
print(response)
Does anyone know what I’m doing wrong?