utils.prompters
antitoxin_prompter(history, prompt, system=None)
The antitoxin_prompter function takes in a history of user-assistant interactions, a prompt from the user, and optionally a system response. It returns an input string that can be fed into the antitoxin model to generate an assistant response.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
history | List[str] | Pass in the history of the conversation | required |
prompt | str | Pass the user's input to the assistant | required |
system | Optional[str] | Pass the system's response to the prompt | None |
Returns:

Type | Description |
---|---|
str | A string that contains the user's prompt, formatted as input for the antitoxin model. |
Source code in src/python/easydel/utils/prompters.py
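The exact prompt template lives in the source file above; as an illustration only, a prompter of this shape can be sketched as follows. The role tags (`<|system|>`, `<|user|>`, `<|assistant|>`) are assumptions for the sketch, not necessarily EasyDeL's actual format:

```python
from typing import List, Optional


def antitoxin_prompter(history: List[str],
                       prompt: str,
                       system: Optional[str] = None) -> str:
    """Assemble one input string from system, history, and prompt.

    The role tags below are illustrative; the real template is in
    src/python/easydel/utils/prompters.py.
    """
    parts = []
    if system is not None:
        parts.append(f"<|system|>\n{system}")
    # history alternates user and assistant turns
    for i, turn in enumerate(history):
        role = "user" if i % 2 == 0 else "assistant"
        parts.append(f"<|{role}|>\n{turn}")
    parts.append(f"<|user|>\n{prompt}")
    parts.append("<|assistant|>\n")  # left open for the model to complete
    return "\n".join(parts)


example = antitoxin_prompter(
    history=["Hi!", "Hello, how can I help?"],
    prompt="Tell me a joke.",
    system="You are a helpful assistant.",
)
```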
antitoxin_prompter_chat_format(history, system=None)
The antitoxin_prompter_chat_format function takes the chat history and an optional system message and returns a formatted string. The history is a list of tuples, where each tuple contains two strings: the user's message and the assistant's response.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
history | List[str] | Pass in the history of user and assistant messages | required |
system | Optional[str] | Pass in the system message | None |
Returns:

Type | Description |
---|---|
str | A string that contains the system message and the formatted conversation history. |
Source code in src/python/easydel/utils/prompters.py
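A minimal sketch of this chat-format variant, assuming the same illustrative role tags as above (the real template is in the source file linked above):

```python
from typing import List, Optional, Tuple


def antitoxin_prompter_chat_format(history: List[Tuple[str, str]],
                                   system: Optional[str] = None) -> str:
    """Format (user, assistant) message pairs into one string.

    The role tags are placeholders for illustration, not
    EasyDeL's actual template.
    """
    parts = []
    if system is not None:
        parts.append(f"<|system|>\n{system}")
    for user_msg, assistant_msg in history:
        parts.append(f"<|user|>\n{user_msg}")
        parts.append(f"<|assistant|>\n{assistant_msg}")
    return "\n".join(parts)


chat = antitoxin_prompter_chat_format(
    [("Hi", "Hello!")],
    system="Be kind.",
)
```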
llama2_prompter(history, prompt, system=None)
The llama2_prompter function takes a history of user-system interactions, a prompt for the next system response, and optionally a system response. It returns a LLAMA2-formatted string that can be used as input to the LLAMA2 model.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
history | List[str] | Store the history of user input and system response | required |
prompt | str | Specify the prompt to be displayed | required |
system | Optional[str] | Specify the system's response; optional | None |
Returns:

Type | Description |
---|---|
str | A string that is a concatenation of the formatted system, history, and prompt segments. |
Source code in src/python/easydel/utils/prompters.py
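As an illustration, a prompter of this shape can be sketched with the published Llama 2 chat template (`[INST]`/`<<SYS>>` markers). This is a sketch based on that public template, not necessarily EasyDeL's exact implementation:

```python
from typing import List, Optional

B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"


def llama2_prompter(history: List[str],
                    prompt: str,
                    system: Optional[str] = None) -> str:
    """Build a Llama 2 style prompt string.

    `history` is assumed to alternate user input and model response;
    the system block is attached to the first instruction.
    """
    text = ""
    # pair up completed (user, response) exchanges from the flat history
    pairs = list(zip(history[0::2], history[1::2]))
    for i, (user, response) in enumerate(pairs):
        sys_part = f"{B_SYS}{system}{E_SYS}" if (i == 0 and system) else ""
        text += f"<s>{B_INST} {sys_part}{user} {E_INST} {response} </s>"
    # the new prompt becomes the open, unanswered instruction
    sys_part = f"{B_SYS}{system}{E_SYS}" if (not pairs and system) else ""
    text += f"<s>{B_INST} {sys_part}{prompt} {E_INST}"
    return text


out = llama2_prompter([], "Hello", system="Keep answers short.")
```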
llama2_prompter_chat_format(system, messages)
The llama2_prompter_chat_format function takes a system message and a list of messages, and returns a string formatted for LLAMA2 chat. The system message is optional; if it is not provided, the function returns only the formatted user messages. The messages are expected to come in pairs, one turn per speaker, alternating between the two sides of the conversation.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
system | str | Store the system message; it is added to the beginning of the chat | required |
messages | List[str] | Pass in a list of strings | required |
Returns:

Type | Description |
---|---|
str | A string that is the LLAMA2-formatted chat. |
Source code in src/python/easydel/utils/prompters.py
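A self-contained sketch of this chat-format variant, again based on the public Llama 2 template rather than EasyDeL's exact code. The list of messages is assumed to alternate user / assistant turns, optionally ending with an unanswered user turn:

```python
from typing import List, Optional


def llama2_prompter_chat_format(system: Optional[str],
                                messages: List[str]) -> str:
    """Format a flat, alternating message list as a Llama 2 chat prompt.

    Sketch using the published [INST]/<<SYS>> template; see
    src/python/easydel/utils/prompters.py for the real implementation.
    """
    b_inst, e_inst = "[INST]", "[/INST]"
    sys_block = f"<<SYS>>\n{system}\n<</SYS>>\n\n" if system else ""
    text = ""
    # pair up completed (user, assistant) exchanges
    pairs = list(zip(messages[0::2], messages[1::2]))
    for i, (user, assistant) in enumerate(pairs):
        prefix = sys_block if i == 0 else ""
        text += f"<s>{b_inst} {prefix}{user} {e_inst} {assistant} </s>"
    if len(messages) % 2 == 1:  # trailing user turn awaiting a reply
        prefix = sys_block if not pairs else ""
        text += f"<s>{b_inst} {prefix}{messages[-1]} {e_inst}"
    return text


formatted = llama2_prompter_chat_format(
    "Be helpful.",
    ["Hi", "Hello!", "Bye"],
)
```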