Chat with OpenAI in LangChain – #5

With the advent of OpenAI’s ChatGPT API endpoint (`ChatCompletion`), LangChain quickly added support for it. Unlike previous LLM endpoints, the chat endpoint consumes multiple message inputs and so comes with its own set of objects and methods.

OpenAI’s `ChatCompletion` endpoint consumes three types of input:

– System message — acts as an initial prompt to “set up” the behavior of the chat model.
– Human messages — human prompts (both current and past) that are fed into the model.
– AI messages — past AI responses to the human prompts.
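These three message types map onto the `role` values in a raw `ChatCompletion` request body. A minimal sketch of such a payload in plain Python (the model name and message contents here are placeholders, not from the video):

```python
# Sketch of a ChatCompletion-style request body. The "system" / "user" /
# "assistant" roles correspond to the three message types described above.
payload = {
    "model": "gpt-3.5-turbo",  # assumption: any chat-capable model name
    "messages": [
        # System message: sets up the assistant's behavior
        {"role": "system", "content": "You are a helpful assistant."},
        # Human (user) messages: current and past human prompts
        {"role": "user", "content": "Hi, how are you?"},
        # AI (assistant) messages: past model responses
        {"role": "assistant", "content": "I'm doing well, thanks!"},
        {"role": "user", "content": "What can you help me with?"},
    ],
}

roles = [m["role"] for m in payload["messages"]]
print(roles)  # ['system', 'user', 'assistant', 'user']
```

Note how past turns of the conversation are simply replayed in the `messages` list on every request — the endpoint itself is stateless.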

The prior `Completion` endpoint used by other models, like OpenAI’s `text-davinci-003`, accepted only a single `prompt` field, into which everything had to be written.

Because of this difference, we now have the LangChain `ChatOpenAI` object alongside several new prompt templates and “message” objects. We’ll explore how these are used in this chapter.
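To see roughly how LangChain’s message objects relate to the API’s roles, here is a toy stand-in written for illustration only — the class names mirror LangChain’s `SystemMessage`, `HumanMessage`, and `AIMessage`, but this is a hypothetical sketch, not LangChain’s actual implementation:

```python
from dataclasses import dataclass

# Toy stand-ins for LangChain-style message objects (illustrative only).
@dataclass
class SystemMessage:
    content: str
    role: str = "system"

@dataclass
class HumanMessage:
    content: str
    role: str = "user"

@dataclass
class AIMessage:
    content: str
    role: str = "assistant"

def to_openai_format(messages):
    """Convert message objects into the dicts a ChatCompletion API expects."""
    return [{"role": m.role, "content": m.content} for m in messages]

history = [
    SystemMessage("You are a helpful assistant."),
    HumanMessage("Hi, how are you?"),
    AIMessage("I'm doing well, thanks!"),
]
print(to_openai_format(history)[1])
# {'role': 'user', 'content': 'Hi, how are you?'}
```

The chapter itself walks through the real `ChatOpenAI` object and prompt templates; the point of this sketch is just the mapping from typed message objects to role-tagged dicts.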

AI Dev Studio: https://aurelio.ai/

00:00 LangChain’s new Chat modules
02:09 New LangChain chat in Python
03:14 Using LangChain’s ChatOpenAI object
04:36 Chat messages in LangChain
06:43 New chat prompt templates
09:05 LangChain human message prompt template
13:18 Using multiple chat prompt templates
17:42 F-strings vs. LangChain prompt templates
19:23 Where to use LangChain chat features?

#artificialintelligence #nlp #openai #deeplearning #langchain