Hi there,
I’m after some help with section 8 of the Building RAG Agents with LLMs assignment. I’ve updated 35_langserve.ipynb with the following:
%%writefile server_app.py
# 🦜️🏓 LangServe | 🦜️🔗 LangChain
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain_nvidia_ai_endpoints import ChatNVIDIA
from langchain_community.chat_models import ChatAnthropic, ChatOpenAI
from langserve import add_routes
import uvicorn
llm = ChatNVIDIA(model="mistralai/mixtral-8x7b-instruct-v0.1")
app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple api server using Langchain's Runnable interfaces",
)
add_routes(
    app,
    llm,
    path="/basic_chat",
)

add_routes(
    app,
    llm,
    path="/retriever",
)

add_routes(
    app,
    llm,
    path="/generator",
)
# Might be encountered if this were for a standalone python file...
if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=9012)
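For context, my understanding is that add_routes exposes REST endpoints under each path (e.g. POST /basic_chat/invoke), so a direct check against the running server from a notebook cell would look roughly like this. This is just a sketch using the same port 9012 as above, not something from the course material:

import requests

# Rough sanity check (my assumption of how to poke a route directly):
# add_routes should expose POST endpoints like /basic_chat/invoke on port 9012.
response = requests.post(
    "http://localhost:9012/basic_chat/invoke",
    json={"input": "Hello!"},
)
print(response.status_code, response.json())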
I then run the Gradio frontend as per the last step in section 8, but I continually get the error below. I’ve tried various things with no joy; any tips?
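For reference, my understanding is that the Gradio frontend talks to these routes through langserve's RemoteRunnable, roughly along these lines (just a sketch of what I assume the frontend is doing, reusing port 9012 from the server above):

from langserve import RemoteRunnable

# My assumption of what the frontend does under the hood:
# wrap a route as a remote runnable and invoke it like a normal chain.
basic_chat = RemoteRunnable("http://localhost:9012/basic_chat/")
print(basic_chat.invoke("Hello from the frontend"))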