Importing Libraries:
from langchain_openai import ChatOpenAI
from langchain.chains import ConversationalRetrievalChain
Define a function make_chain that creates a chain of LangChain components:
def make_chain():
Now write further code inside this function.
Initialize the LLM that will be used to answer queries. We will use the gpt-3.5-turbo model, which is the default model used in ChatGPT:
    model = ChatOpenAI(
        model_name="gpt-3.5-turbo",
        temperature=0.0,
        verbose=True
    )
Initialize the prompt using the prompt template we defined earlier:
    prompt = get_prompt()
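(Aside: get_prompt() is the helper defined in an earlier step. If you are following along without it, a minimal sketch of such a helper, assuming a simple context-and-question template, could look like this; the exact wording of the template is illustrative only.)
from langchain.prompts import PromptTemplate

def get_prompt():
    # Illustrative template only -- use the template defined in the earlier step.
    template = """Answer the question using only the context below.
If the answer is not in the context, say you don't know.

Context:
{context}

Question: {question}
Answer:"""
    return PromptTemplate(
        template=template,
        input_variables=["context", "question"],
    )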
Initialize the retriever. We will use our Chroma vector store as the retriever.
    vector_store = get_chroma_client()
    retriever = vector_store.as_retriever(search_type="mmr", verbose=True)
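(Aside: get_chroma_client() is assumed to have been defined when we indexed our documents. A rough sketch, assuming a locally persisted Chroma collection and OpenAI embeddings, might look like the following; the collection name and persist directory are placeholders.)
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import Chroma

def get_chroma_client():
    # Placeholder collection name and persist directory -- match these to
    # the values used when the documents were indexed.
    return Chroma(
        collection_name="my_collection",
        embedding_function=OpenAIEmbeddings(),
        persist_directory="data/chroma",
    )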
Create the chain from these components. We will use ConversationalRetrievalChain from LangChain, which is commonly used for building chatbots:
    chain = ConversationalRetrievalChain.from_llm(
        model,
        retriever=retriever,
        return_source_documents=True,
        combine_docs_chain_kwargs=dict(prompt=prompt),
        verbose=True,
        rephrase_question=False,
    )
    return chain
In the code above, return_source_documents, when set to True, returns the documents retrieved from the vector store along with the answer. rephrase_question, when set to True, rephrases the user's question based on the whole conversation, i.e. the chat_history. For this demonstration we are not using chat_history, which is why we have set it to False.
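To try the chain, call make_chain() and invoke it with a question and an empty chat_history. The question below is just a placeholder:
chain = make_chain()

response = chain({
    "question": "What does the uploaded document say about refunds?",  # placeholder question
    "chat_history": [],
})

# The answer, plus the retrieved documents because return_source_documents=True.
print(response["answer"])
print(response["source_documents"])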