pip install langchain
Prompts (green): templatize inputs, select them dynamically, and manage model inputs
Language models (purple): call language models through a common interface
Output parsers (blue): extract information from model outputs
=> This part covers Prompts.
Prompts define the format of what you describe to the model, an area that belongs to prompt engineering.
Since ChatGPT came out, many techniques for using it effectively have been studied.
Act as a JavaScript Console
I want you to act as a javascript console.
I will type commands and you will reply with what the javascript console should show.
I want you to only reply with the terminal output inside one unique code block, and nothing else.
do not write explanations. do not type commands unless I instruct you to do so. when I need to tell you something in english, I will do so by putting text inside curly brackets {like this}.
My first command is console.log("Hello World");
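As a minimal sketch of how such a role prompt fits into LangChain, the same text can be templatized; the {command} placeholder is illustrative and not part of the original prompt, and the prompt text is shortened here.

from langchain import PromptTemplate

# Hypothetical: turn the "JavaScript console" role prompt into a reusable template.
# {command} is an illustrative variable for the user's first command.
console_template = """I want you to act as a javascript console.
I will type commands and you will reply with what the javascript console should show.
I want you to only reply with the terminal output inside one unique code block, and nothing else.
My first command is {command}"""

console_prompt = PromptTemplate.from_template(console_template)
console_prompt.format(command='console.log("Hello World");')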
Basic example
PromptTemplate.from_template: sets up the template
prompt.format(<variable>="value you want to insert"): fills in the template variables
from langchain import PromptTemplate
template = """\
You are a naming consultant for new companies.
What is a good name for a company that makes {product}?
"""
prompt = PromptTemplate.from_template(template)
prompt.format(product="colorful socks")
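For reference, format() simply fills in the placeholder and returns a plain string:

# -> "You are a naming consultant for new companies.\nWhat is a good name for a company that makes colorful socks?\n"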
Example using input_variables
from langchain import PromptTemplate
# An example prompt with no input variables
no_input_prompt = PromptTemplate(input_variables=[], template="Tell me a joke.")
no_input_prompt.format()
# -> "Tell me a joke."
# An example prompt with one input variable
one_input_prompt = PromptTemplate(input_variables=["adjective"], template="Tell me a {adjective} joke.")
one_input_prompt.format(adjective="funny")
# -> "Tell me a funny joke."
# An example prompt with multiple input variables
multiple_input_prompt = PromptTemplate(
    input_variables=["adjective", "content"],
    template="Tell me a {adjective} joke about {content}."
)
multiple_input_prompt.format(adjective="funny", content="chickens")
# -> "Tell me a funny joke about chickens."
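The same template can also be built with from_template, which infers the input variables from the braces instead of requiring an explicit input_variables list:

# Equivalent construction: from_template reads {adjective} and {content} out of the string itself.
multiple_input_prompt_2 = PromptTemplate.from_template("Tell me a {adjective} joke about {content}.")
multiple_input_prompt_2.format(adjective="funny", content="chickens")
# -> "Tell me a funny joke about chickens."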
Basic imports
from langchain.prompts import (
ChatPromptTemplate,
PromptTemplate,
SystemMessagePromptTemplate,
AIMessagePromptTemplate,
HumanMessagePromptTemplate,
)
from langchain.schema import (
AIMessage,
HumanMessage,
SystemMessage
)
Basic example
template="You are a helpful assistant that translates {input_language} to {output_language}."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template="{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
SystemMessagePromptTemplate: the System role in the OpenAI Playground
HumanMessagePromptTemplate: the user role in the OpenAI Playground
Example using input_variables
prompt = PromptTemplate(
    template="You are a helpful assistant that translates {input_language} to {output_language}.",
    input_variables=["input_language", "output_language"],
)
system_message_prompt_2 = SystemMessagePromptTemplate(prompt=prompt)
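As a quick sanity check (using the values from this example), format() on a message prompt template should return the corresponding message object rather than a plain string:

# Formatting a message prompt template yields a SystemMessage directly.
system_message_prompt_2.format(input_language="English", output_language="French")
# -> SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={})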
Template + chat example
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
# get a chat completion from the formatted messages
chat_prompt.format_prompt(input_language="English", output_language="French", text="I love programming.").to_messages()
# Output
[SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}),
HumanMessage(content='I love programming.', additional_kwargs={})]
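Since format_prompt() returns a prompt value, the same chat prompt can also be rendered as a single string, which is handy if the target is a completion-style model rather than a chat model:

# The same PromptValue can be rendered as one string instead of a message list.
chat_prompt.format_prompt(input_language="English", output_language="French", text="I love programming.").to_string()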
The "Connecting to a Feature Store" part covers using a Feast store: specific data is fetched from the feature store and fed into the prompt. Refer to that part if you need it, or see the rough sketch below.
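The sketch below only illustrates the idea and assumes a Feast repo laid out like the Feast quickstart; the driver_hourly_stats feature view, driver_id entity, and repo path are placeholders from that quickstart, not something this note sets up.

# Rough sketch: fill a prompt from live feature values in a Feast online store.
from feast import FeatureStore
from langchain.prompts import PromptTemplate

store = FeatureStore(repo_path=".")  # directory containing feature_store.yaml

template = """Given the driver's latest stats, write them a short note:

Conversion rate: {conv_rate}
Acceptance rate: {acc_rate}"""
prompt = PromptTemplate.from_template(template)

def prompt_for_driver(driver_id: int) -> str:
    # Pull the latest online feature values for this driver and fill the template with them.
    feature_vector = store.get_online_features(
        features=[
            "driver_hourly_stats:conv_rate",
            "driver_hourly_stats:acc_rate",
        ],
        entity_rows=[{"driver_id": driver_id}],
    ).to_dict()
    return prompt.format(
        conv_rate=feature_vector["conv_rate"][0],
        acc_rate=feature_vector["acc_rate"][0],
    )

print(prompt_for_driver(1001))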