chat
Chat tasks definition.
Chat tasks are tasks that are meant to be used in an iterative fashion, where the user and the assistant exchange messages.
Unlike regular tasks, chat tasks store the message history in a BaseChatMessageHistory object, which is used to compile the prompt sent to the LLM.
At every iteration, the user message is added to the message history, and the prompt is compiled using the message history. The prompt is then sent to the LLM, and the response is parsed and added to the message history.
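The loop above can be sketched in plain Python. This is a self-contained illustration only; the Message and InMemoryHistory classes here are simplified stand-ins for declarai's own types, not its actual implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Message:
    role: str  # "system", "user", or "assistant"
    content: str


@dataclass
class InMemoryHistory:
    """Stand-in for a BaseChatMessageHistory implementation."""
    history: List[Message] = field(default_factory=list)

    def add_message(self, message: Message) -> None:
        self.history.append(message)


def chat_turn(
    history: InMemoryHistory,
    system: str,
    user_input: str,
    llm: Callable[[List[Message]], str],
) -> str:
    """One iteration: record the user message, compile the prompt,
    call the LLM, then record and return the response."""
    history.add_message(Message("user", user_input))
    # Compile: system message first, then the full exchange so far.
    prompt = [Message("system", system)] + history.history
    response = llm(prompt)
    history.add_message(Message("assistant", response))
    return response


# Toy LLM that just echoes the last user message.
def echo_llm(prompt: List[Message]) -> str:
    return f"You said: {prompt[-1].content}"


history = InMemoryHistory()
chat_turn(history, "You are a helpful bot.", "hello", echo_llm)
chat_turn(history, "You are a helpful bot.", "bye", echo_llm)
# history now holds four messages: a user/assistant pair per turn
```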
Classes:
Name | Description |
---|---|
Chat | Chat class used for creating chat tasks. |
ChatDecorator | A decorator class that receives a chat class, fulfills it with the provided parameters, and returns a Chat object. |
ChatMeta | Metaclass for Chat classes. Enables users to receive the chat instance when using the @chat decorator, while still being able to "instantiate" the class. |
Chat
Chat(
*,
operator: BaseChatOperator,
middlewares: List[Type[TaskMiddleware]] = None,
chat_history: BaseChatMessageHistory = None,
greeting: str = None,
system: str = None,
**kwargs: str
)
Bases: BaseTask
Chat class used for creating chat tasks.
Chat tasks are tasks that are meant to be used in an iterative fashion, where the user and the assistant exchange messages.
Attributes:
Name | Type | Description |
---|---|---|
is_declarai | bool | A class-level attribute indicating whether the chat is of type 'declarai'. Always set to True. |
_call_kwargs | Dict[str, Any] | A dictionary that stores additional keyword arguments, used for passing kwargs between the execution of the chat and the execution of the middlewares. |
middlewares | List[TaskMiddleware] or None | Middlewares applied on every iteration of the chat. |
operator | BaseChatOperator | The operator used for the chat. |
conversation | List[Message] | Property that returns the list of messages exchanged in the chat. Keep in mind this list does not include the first system message, which is stored separately. |
_chat_history | BaseChatMessageHistory | The chat history mechanism for the chat. |
greeting | str | The greeting message for the chat. |
system | str | The system message for the chat. |
Parameters:
Name | Type | Description | Default |
---|---|---|---|
operator | BaseChatOperator | The operator to use for the chat. | required |
middlewares | List[TaskMiddleware] | Middlewares to use on every iteration of the chat. Defaults to None. | None |
chat_history | BaseChatMessageHistory | Chat history mechanism to use. Defaults to None. | None |
greeting | str | Greeting message to use. Defaults to the operator's greeting or None. | None |
system | str | System message to use. Defaults to the operator's system message or None. | None |
stream | bool | Whether to stream the response from the LLM or not. Defaults to False. | required |
**kwargs | str | Additional keyword arguments to pass to the formatting of the system message. | {} |
Methods:
Name | Description |
---|---|
__call__ | Executes the call to the LLM, based on the messages passed as arguments and the llm_params. |
_exec | Executes the call to the LLM. |
add_message | Interface to add a message to the chat history. |
compile | Compiles the list of messages to be sent to the LLM by the operator. |
send | Interface that allows the user to send a message to the LLM. Takes a raw string as input and returns the parsed response. |
stream_cleanup | Adds the combined response to the database and runs any other cleanup logic. |
stream_handler | A generator that yields each chunk from the stream and collects them in a buffer. |
Attributes:
Name | Type | Description |
---|---|---|
conversation | List[Message] | The messages exchanged in the chat. |
llm_params | LLMParamsType | The LLM parameters saved on the operator; these are sent to the LLM when the task is executed. |
llm_response | LLMResponse | The response from the LLM. |
llm_stream_response | Iterator[LLMResponse] | The response from the LLM when streaming. |
Source code in src/declarai/chat.py
conversation
property
conversation: List[Message]
The messages exchanged in the chat, excluding the first system message.
llm_params
property
llm_params: LLMParamsType
Returns the LLM parameters that are saved on the operator. These parameters are sent to the LLM when the task is executed.
Returns: The LLM parameters.
llm_stream_response
class-attribute
instance-attribute
llm_stream_response: Iterator[LLMResponse] = None
The response from the LLM when streaming
__call__
__call__(
*,
messages: List[Message],
llm_params: LLMParamsType = None
) -> Any
Executes the call to the LLM, based on the messages passed as arguments and the llm_params. The llm_params are passed as a dictionary and are used to override the default llm_params of the operator. The llm_params also take priority over the params that were used to initialize the chat within the decorator.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
messages | List[Message] | The messages to pass to the LLM. | required |
llm_params | LLMParamsType | The llm_params to use for the call to the LLM. | None |
Returns:
Type | Description |
---|---|
Any | The parsed response from the LLM. |
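The precedence rule described above (call-time llm_params override both the operator's defaults and the params the chat was initialized with) can be illustrated with a small helper. resolve_llm_params is a hypothetical sketch, not part of declarai's API:

```python
from typing import Any, Dict, Optional


def resolve_llm_params(
    operator_defaults: Dict[str, Any],
    decorator_params: Optional[Dict[str, Any]] = None,
    call_params: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
    """Merge LLM params with later sources winning:
    operator defaults < decorator params < call-time params."""
    merged = dict(operator_defaults)
    merged.update(decorator_params or {})
    merged.update(call_params or {})
    return merged


params = resolve_llm_params(
    {"temperature": 0.0, "max_tokens": 256},
    decorator_params={"temperature": 0.5},
    call_params={"temperature": 0.9},
)
# → {"temperature": 0.9, "max_tokens": 256}
```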
Source code in src/declarai/chat.py
_exec
_exec(kwargs) -> Any
Executes the call to the LLM.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
kwargs | | Keyword arguments to pass to the LLM. | required |
Returns:
Type | Description |
---|---|
Any | The raw response from the LLM, together with the metadata. |
Source code in src/declarai/chat.py
add_message
add_message(message: str, role: MessageRole) -> None
Interface to add a message to the chat history.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
message | str | The message to add to the chat history. | required |
role | MessageRole | The role of the message (assistant, user, system, etc.). | required |
Source code in src/declarai/chat.py
compile
Compiles the list of messages to be sent to the LLM by the operator. This is done by accessing the ._chat_history.history attribute. The kwargs passed to the compile method are only used to populate the system message prompt.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
**kwargs | | System message prompt kwargs. | {} |
Returns: List[Message] - The compiled messages that will be sent to the LLM.
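As a rough sketch of this behavior, with plain (role, content) tuples standing in for declarai's Message type and compile_messages as a hypothetical helper:

```python
from typing import Any, List, Tuple

Message = Tuple[str, str]  # (role, content)


def compile_messages(
    system_template: str,
    history: List[Message],
    **kwargs: Any,
) -> List[Message]:
    """The kwargs populate only the system message template; the rest
    of the prompt is taken from the chat history as-is."""
    system_message = ("system", system_template.format(**kwargs))
    return [system_message] + history


compiled = compile_messages(
    "You are a {tone} assistant.",
    [("user", "hi"), ("assistant", "hello!")],
    tone="friendly",
)
# → [("system", "You are a friendly assistant."),
#    ("user", "hi"), ("assistant", "hello!")]
```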
Source code in src/declarai/chat.py
send
send(
message: str,
llm_params: Union[LLMParamsType, Dict[str, Any]] = None,
**kwargs: Union[LLMParamsType, Dict[str, Any]]
) -> Any
Interface that allows the user to send a message to the LLM. It takes a raw string as input and returns the parsed response from the LLM.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
message | str | The message to send to the LLM. | required |
llm_params | Union[LLMParamsType, Dict[str, Any]] | LLM parameters to use for this call, overriding the defaults. | None |
**kwargs | Union[LLMParamsType, Dict[str, Any]] | Additional keyword arguments. | {} |
Returns:
Type | Description |
---|---|
Any | Final response from the LLM, after parsing. |
Source code in src/declarai/chat.py
stream_cleanup
stream_cleanup(last_chunk: LLMResponse) -> None
Add the combined response to the database and run any other cleanup logic.
Source code in src/declarai/chat.py
stream_handler
stream_handler(
stream: Iterator[LLMResponse],
) -> Iterator[LLMResponse]
A generator that yields each chunk from the stream and collects them in a buffer. After the stream is exhausted, it runs the cleanup logic.
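A minimal sketch of this buffering pattern, using plain strings in place of LLMResponse chunks and a callback in place of stream_cleanup:

```python
from typing import Callable, Iterator, List


def stream_handler(
    stream: Iterator[str],
    cleanup: Callable[[str], None],
) -> Iterator[str]:
    """Yield each chunk while buffering it; once the stream is
    exhausted, hand the combined text to the cleanup callback."""
    buffer: List[str] = []
    for chunk in stream:
        buffer.append(chunk)
        yield chunk
    cleanup("".join(buffer))


collected: List[str] = []
chunks = list(stream_handler(iter(["Hel", "lo", "!"]), collected.append))
# chunks == ["Hel", "lo", "!"]; collected == ["Hello!"]
```

Because cleanup only runs after the generator is exhausted, a caller who abandons the stream early never triggers it; the real implementation has the same property.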
Source code in src/declarai/_base.py
ChatDecorator
ChatDecorator(llm: LLM)
A decorator class for receiving a chat class, fulfilled with the provided parameters, and returning a Chat object.
This class provides the chat method, which acts as a decorator to create a Chat object.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
llm | LLM | Resolved LLM object. | required |
Attributes:
Name | Type | Description |
---|---|---|
llm | LLM | Resolved LLM object. |
Methods:
Name | Description |
---|---|
chat | Decorator method that converts a class into a chat task class. |
Source code in src/declarai/chat.py
chat
chat(
cls: Type = None,
*,
middlewares: List[TaskMiddleware] = None,
llm_params: LLMParamsType = None,
chat_history: BaseChatMessageHistory = None,
greeting: str = None,
system: str = None,
streaming: bool = None
)
Decorator method that converts a class into a chat task class.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
cls | Type | The original class that is being decorated. | None |
middlewares | List[TaskMiddleware] | Middlewares to use for every iteration of the chat. Defaults to None. | None |
llm_params | LLMParamsType | Parameters for the LLM. Defaults to None. | None |
chat_history | BaseChatMessageHistory | Chat history mechanism to use. Defaults to None. | None |
greeting | str | Greeting message to use. Defaults to None. | None |
system | str | System message to use. Defaults to None. | None |
streaming | bool | Whether to use streaming or not. Defaults to None. | None |
Returns:
Type | Description |
---|---|
Type[Chat] | A new Chat class that inherits from the original class and has chat capabilities. |
Example
@ChatDecorator.chat(llm_params={"temperature": 0.5})
class MyChat:
    ...

@ChatDecorator.chat
class MyChat:
    ...
Source code in src/declarai/chat.py
ChatMeta
Bases: type
Metaclass for Chat classes. Enables users to receive the chat instance when using the @chat decorator, while still being able to "instantiate" the class.
Methods:
Name | Description |
---|---|
__call__ | Initializes the Chat instance for the second time, after the decorator has been applied. |
__call__
__call__(*args, **kwargs)
Initializes the Chat instance for the second time, after the decorator has been applied. The parameters are the same as the ones used for the decorator, but the ones used for the class initialization take precedence.
Returns: Chat instance
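This precedence can be sketched with a minimal metaclass. TwoPhaseMeta and _decorator_kwargs are illustrative names only, not declarai's internals:

```python
class TwoPhaseMeta(type):
    """Sketch of the ChatMeta idea: the decorator pre-binds defaults on
    the class, and a later 'instantiation' merges them with the user's
    kwargs, which take precedence."""

    def __call__(cls, *args, **kwargs):
        decorator_kwargs = getattr(cls, "_decorator_kwargs", {})
        merged = {**decorator_kwargs, **kwargs}  # init-time kwargs win
        return super().__call__(*args, **merged)


class MyChat(metaclass=TwoPhaseMeta):
    # Pretend a decorator stored these defaults on the class.
    _decorator_kwargs = {"greeting": "Hi!", "system": "Be terse."}

    def __init__(self, greeting=None, system=None):
        self.greeting = greeting
        self.system = system


chat_instance = MyChat(greeting="Hello!")
# chat_instance.greeting == "Hello!"  (init-time value wins)
# chat_instance.system == "Be terse."  (decorator default kept)
```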
Source code in src/declarai/chat.py