DLI Course "Building RAG Agents with LLMs": Need Help With RAG Route for Assessment

Hello @vkudlay, I'm at the final assessment of the Building RAG Agents course and stuck at the point where my RAG route is not working. Can anyone please guide me on what I am missing?


This is my code:
```python
%%writefile server_app.py
## 🦜️🏓 LangServe | 🦜️🔗 LangChain

from fastapi import FastAPI
from langserve import add_routes

from langchain_nvidia_ai_endpoints import ChatNVIDIA, NVIDIAEmbeddings
from langchain.prompts import ChatPromptTemplate

from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnableLambda, RunnableBranch
from langchain_core.runnables.passthrough import RunnableAssign
from langchain.document_transformers import LongContextReorder
from langchain_community.vectorstores import FAISS

from operator import itemgetter

app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple api server using Langchain's Runnable interfaces",
)

llm = ChatNVIDIA(model='mixtral_8x7b')

chat_prompt = ChatPromptTemplate.from_messages([("system",
    "You are a document chatbot. Help the user as they ask questions about documents."
    " User messaged just asked you a question: {input}\n\n"
    " The following information may be useful for your response: "
    " Document Retrieval:\n{context}\n\n"
    " (Answer only from retrieval. Only cite sources that are used. Make your response conversational)"
), ('user', '{input}')])

embedder = NVIDIAEmbeddings(model='nvolveqa_40k')

docstore = FAISS.load_local("docstore_index", embedder)
docs = list(docstore.docstore._dict.values())

def docs2str(docs, title="Document"):
    """Useful utility for making chunks into context string. Optional, but useful"""
    out_str = ""
    for doc in docs:
        doc_name = getattr(doc, 'metadata', {}).get('Title', title)
        if doc_name: out_str += f"[Quote from {doc_name}] "
        out_str += getattr(doc, 'page_content', str(doc)) + "\n"
    return out_str

def output_puller(inputs):
    """Output generator. Useful if your chain returns a dictionary with key 'output'"""
    for token in inputs:
        if token.get('output'):
            yield token.get('output')

long_reorder = RunnableLambda(LongContextReorder().transform_documents)  ## GIVEN
context_getter = itemgetter('input') | docstore.as_retriever() | long_reorder | docs2str
retrieval_chain = {'input': (lambda x: x)} | RunnableAssign({'context': context_getter})

generator_chain = RunnableAssign({"output": chat_prompt | llm})  ## TODO
generator_chain = generator_chain | output_puller  ## GIVEN

rag_chain = retrieval_chain | generator_chain

add_routes(
    app,
    llm,
    path="/basic_chat",
)

add_routes(
    app,
    retrieval_chain,
    path="/retriever",
)

add_routes(
    app,
    generator_chain,
    path="/generator",
)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=9012)
```

Hey @sgupta13_be22
The same recommendation as here still applies: Difficulty with Building RAG Agents with LLMs (Gradio) - #4 by vkudlay

Feel free to use the docker_router service to see what error is being thrown by the server, but I have a feeling your deployed route is doing too much work and is failing to integrate with the frontend/server_app.py code.
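If docker_router is being finicky, another quick way to see what the server is actually complaining about is to hit the routes directly from a notebook cell with RemoteRunnable. This is just a rough sketch, not the assessment solution; the port and route paths are taken from your server_app.py above, and you may need to adjust the host to however your containers reach each other:

```python
## Rough sketch: probe the deployed LangServe routes directly.
## Port 9012 and the route paths come from the server_app.py posted above.
from langserve import RemoteRunnable

retriever = RemoteRunnable("http://0.0.0.0:9012/retriever/")

## Try both a plain string and a dict payload. Whichever shape matches the
## chain's inferred input schema will succeed; the other should reproduce the 500.
for payload in ["Tell me about the documents", {"input": "Tell me about the documents"}]:
    try:
        print(retriever.invoke(payload))
    except Exception as e:
        print(f"{type(e).__name__}: {e}")
```

Comparing which payload shape works against what the frontend actually sends usually narrows the problem down quickly.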

Sir @vkudlay, I tried testing the retriever and generator routes, and it seems the retriever route is not working. I have been checking the frontend Python and reading up on the RemoteRunnable functionality, but I still couldn't find a solution.
Only 1 hour of my course is left and I’m getting nervous (>.<)
Here is the output I’m getting:
```
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:9012 (Press CTRL+C to quit)
INFO:     127.0.0.1:45630 - "POST /basic_chat/stream HTTP/1.1" 200 OK
INFO:     127.0.0.1:45646 - "POST /retriever/invoke HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/pydantic/v1/main.py", line 716, in validate
    value_as_dict = dict(value)
                    ^^^^^^^^^^^
ValueError: dictionary update sequence element #0 has length 1; 2 is required

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 116, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 746, in __call__
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 75, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 70, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langserve/server.py", line 442, in invoke
    return await api_handler.invoke(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langserve/api_handler.py", line 663, in invoke
    config, input_ = await self._get_config_and_input(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langserve/api_handler.py", line 639, in _get_config_and_input
    input_ = schema.validate(body.input)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pydantic/v1/main.py", line 718, in validate
    raise DictError() from e
pydantic.v1.errors.DictError: value is not a valid dict
```
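
If I'm reading the traceback right, the route's inferred input schema is a dict model, so pydantic tries to coerce whatever the client sends with dict(...), and a plain string fails in exactly this way. Here is a small sketch of what I think is happening internally (not my actual frontend code):

```python
## Minimal reproduction of the inner error: pydantic's dict-shaped schema calls
## dict(value) on the request body, and a plain string fails exactly like above.
try:
    dict("hello")
except ValueError as e:
    print(e)  # dictionary update sequence element #0 has length 1; 2 is required
```

So it looks like the payload shape sent to /retriever/invoke doesn't match what the deployed chain expects.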


Can you please suggest what changes I should make?

Please tell me how to resolve this.