Prompt Engineering Practice 3
- LLaMA-3-8B (Multi-turn PE, Few-shot PE)
- Mistral-7B (CoT PE, Zero-shot PE)
- Phi-3-3.8B (Multi-turn PE, Generated Knowledge PE)
- Gemma 7B (Few-shot PE, Self-Ask PE, Chaining)
Contents
- Model Setup
- Multi-Turn PE
- Zero-Shot PE
- Generated Knowledge PE
- Generating the Knowledge
- Inserting the Knowledge into the Prompt to Generate the Desired Answer
Installing Required Packages and Environment Setup
!pip install bitsandbytes==0.43.1
!pip install accelerate==0.30.1
!pip install transformers==4.39.3
!pip install gradio==4.29.0
1. Model Setup
import torch
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
)
a) Model setup: Phi-3 3.8B
model_id = "microsoft/Phi-3-mini-4k-instruct"
b) Loading the tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id)
c) Loading the model
Quantization settings: none applied in this practice (an optional quantized-load sketch follows the loading code below)
Loading the model
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="cuda:0",
    trust_remote_code=True,
)
2. Multi-Turn PE
a) Prompt (message)
- Constructing a prompt suited to Multi-Turn PE
messages = [
    {"role": "user", "content": "HI What's your name?"},
    {"role": "assistant", "content": "My name is joonhyung kim"},
    {"role": "user", "content": "Would you like to say it again? What's your name?"},
]
# Remove any newline characters from the message contents
for message in messages:
    message["content"] = message["content"].replace("\n", "")
b) Tokenize
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)
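To inspect what the chat template actually produces, apply_chat_template can also return the rendered string instead of token IDs by passing tokenize=False. A minimal sketch; for Phi-3 the rendered prompt uses special markers roughly of the form <|user|> ... <|end|> <|assistant|>, as defined by the tokenizer's own template:
# Render the prompt as plain text to see how the roles are formatted
prompt_text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt_text)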
c) Generate
outputs = model.generate(
    input_ids,
    max_new_tokens=1024,
    do_sample=False,  # greedy decoding for deterministic output
)
d) Result
response = outputs[0][input_ids.shape[-1]:]
print("response : ", tokenizer.decode(response, skip_special_tokens=True))
Certainly! My name is Joonhyung Kim. How may I assist you further?
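To extend the multi-turn exchange, the assistant reply can be appended to messages together with a new user turn before tokenizing and generating again. A sketch; the follow-up question below is our own example, not from the original practice:
# Append the model's reply and a hypothetical follow-up user turn
messages.append({"role": "assistant", "content": tokenizer.decode(response, skip_special_tokens=True)})
messages.append({"role": "user", "content": "Nice to meet you. Can you introduce yourself in one sentence?"})

input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=1024, do_sample=False)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))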
3. Zero-Shot PE
a) Prompt (message)
- Constructing a prompt suited to Zero-Shot PE
messages = [
    {"role": "user", "content": "Who do you like more, mom or dad?"}
]
for message in messages:
    message["content"] = message["content"].replace("\n", "")
b) Tokenize
Same as above
c) Generate
Same as above
d) Result
Same as above
As an AI, I don't have personal feelings or family relationships. However, I can tell you that family relationships are unique and special to each individual. Everyone has their own way of expressing love and appreciation for their parents.
If you have questions about family dynamics or how to navigate relationships with your parents, feel free to ask, and I'll do my best to provide helpful information.
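Since the tokenize → generate → decode steps are repeated unchanged in every section ("same as above"), they can be wrapped in a small helper. This is a sketch; the name generate_response is our own, not part of the original practice:
def generate_response(messages, max_new_tokens=1024):
    # Apply the chat template, generate greedily, and decode only the newly generated tokens
    input_ids = tokenizer.apply_chat_template(
        messages,
        add_generation_prompt=True,
        return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(input_ids, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)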
4. Generated Knowledge PE
(1) Generating the Knowledge
a) Prompt (message)
- Constructing a prompt for generating the knowledge
messages = [
    {"role": "user", "content": "Question : Is Greece larger than mexico?"},
    {"role": "assistant", "content": "Knowledge : Greece is approximately 131,957 sq km, while Mexico is approximately 1,964,375 sq km, making Mexico 1,389% larger than Greece."},
    {"role": "user", "content": "Question : Is the Eiffel Tower taller than the Leaning Tower of Pisa?"},
    {"role": "assistant", "content": "Knowledge : The Eiffel Tower is approximately 330 meters tall, while the Leaning Tower of Pisa is approximately 56 meters tall, making the Eiffel Tower considerably taller."},
    {"role": "user", "content": "Question : Is the population of Canada greater than Australia?"},
    {"role": "assistant", "content": "Knowledge : As of current data, the population of Canada is approximately 37 million, while the population of Australia is approximately 25 million, making Canada's population greater than Australia's."},
    {"role": "user", "content": "Question : Can you explain about Mother?"}
]
for message in messages:
    message["content"] = message["content"].replace("\n", "")
b) Tokenize
Same as above
c) Generate
- Generate the output and store it as Knowledge_1 (decoded in the code below).
- In the same way, use a different prompt to generate another piece of knowledge and store it as Knowledge_2 (see the sketch after the decoding code).
response = outputs[0][input_ids.shape[-1]:]
Knowledge_1 = tokenizer.decode(response, skip_special_tokens=True)
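Only the Knowledge_1 decoding is shown above; Knowledge_2 is produced the same way with a different final question. A sketch, assuming the second prompt asks about "Father" to mirror the "Mother" question:
# Replace the last user turn with a different question (hypothetical) and generate again
messages[-1] = {"role": "user", "content": "Question : Can you explain about Father?"}
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=1024, do_sample=False)
Knowledge_2 = tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)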
d) Result
Same as above
(2) Inserting the Knowledge into the Prompt to Generate the Desired Answer
a) Prompt (message)
- Constructing a prompt using Knowledge_1 and Knowledge_2
messages = [
    {"role": "user", "content": f"""
{Knowledge_1}{Knowledge_2}
Question : Given the Knowledge, Who do you like more, mom or dad?
"""}
]
for message in messages:
    message["content"] = message["content"].replace("\n", "")
b) Tokenize
Same as above
c) Generate
Same as above
d) Result
Same as above
As an AI, I don't have personal feelings. However, I can tell you that both mothers and fathers play crucial and unique roles in a child's life. The love and appreciation for either parent can vary greatly depending on individual experiences and cultural backgrounds. It's essential to value and respect both parents for their contributions to a child's upbringing.
Reference
- [FastCampus] sLM Fine-Tuning with 8 sLM Models (8개의 sLM모델로 끝내는 sLM 파인튜닝)