
chat ¤

Chat tasks definition.

Chat tasks are tasks meant to be used iteratively: the user and the assistant exchange messages.

Unlike tasks, chat tasks store the message history in a BaseChatMessageHistory object, which is used to compile the prompt sent to the LLM.

At every iteration, the user message is added to the message history and the prompt is compiled from that history. The prompt is then sent to the LLM, and the response is parsed and added back to the history.
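A minimal usage sketch of this flow (assuming the top-level declarai API; the model name is illustrative, and depending on the declarai version the decorator may be exposed as gpt_35.chat instead of gpt_35.experimental.chat):

import declarai

gpt_35 = declarai.openai(model="gpt-3.5-turbo")

@gpt_35.experimental.chat
class SQLBot:
    """You are a SQL assistant. You help users write SQL queries."""

bot = SQLBot()
bot.send("How do I select distinct values?")  # user message stored, reply returned
bot.send("And how do I count them?")          # the history carries the previous turn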

Classes:

Name Description
Chat

Chat class used for creating chat tasks.

ChatDecorator

A decorator class that receives a user-defined chat class, applies the provided parameters, and returns a Chat object.

ChatMeta

Metaclass for Chat classes. Used to enable users to receive the chat instance when using the @chat decorator, and still be able to "instantiate" the class.

Chat ¤

Chat(
    *,
    operator: BaseChatOperator,
    middlewares: List[Type[TaskMiddleware]] = None,
    chat_history: BaseChatMessageHistory = None,
    greeting: str = None,
    system: str = None,
    **kwargs: str
)

Bases: BaseTask

Chat class used for creating chat tasks.

Chat tasks are tasks meant to be used iteratively: the user and the assistant exchange messages.

Attributes:

Name Type Description
is_declarai bool

A class-level attribute indicating if the chat is of type 'declarai'. Always set to True.

_call_kwargs Dict[str, Any]

A dictionary to store additional keyword arguments, used for passing kwargs between the execution of the chat and the execution of the middlewares.

middlewares List[TaskMiddleware] or None

Middlewares used for every iteration of the chat.

operator BaseChatOperator

The operator used for the chat.

conversation List[Message]

Property that returns a list of messages exchanged in the chat. Keep in mind this list does not include the first system message. The system message is stored in the system attribute.

_chat_history BaseChatMessageHistory

The chat history mechanism for the chat.

greeting str

The greeting message for the chat.

system str

The system message for the chat.

Parameters:

Name Type Description Default
operator BaseChatOperator

The operator to use for the chat.

required
middlewares List[TaskMiddleware]

Middlewares to use for every iteration of the chat. Defaults to None.

None
chat_history BaseChatMessageHistory

Chat history mechanism to use. Defaults to DEFAULT_CHAT_HISTORY().

None
greeting str

Greeting message to use. Defaults to operator's greeting or None.

None
system str

System message to use. Defaults to operator's system message or None.

None
stream bool

Whether to stream the response from the LLM or not. Defaults to False.

False
**kwargs

Additional keyword arguments to pass to the formatting of the system message.

{}

Methods:

Name Description
__call__

Executes the call to the LLM, based on the messages passed as argument, and the llm_params.

_exec

Executes the call to the LLM.

add_message

Interface to add a message to the chat history.

compile

Compiles a list of messages to be sent to the LLM by the operator.

send

Interface that allows the user to send a message to the LLM. It takes a raw string as input, and returns the parsed response from the LLM.

stream_cleanup

Add the combined response to the database and run any other cleanup logic.

stream_handler

A generator that yields each chunk from the stream and collects them in a buffer.

Attributes:

Name Type Description
conversation List[Message]

A list of messages exchanged in the chat, excluding the first system message.

llm_params LLMParamsType

Return the LLM parameters that are saved on the operator. These parameters are sent to the LLM when the task is executed.

llm_response LLMResponse

The response from the LLM

llm_stream_response Iterator[LLMResponse]

The response from the LLM when streaming

Source code in src/declarai/chat.py
def __init__(
    self,
    *,
    operator: BaseChatOperator,
    middlewares: List[Type[TaskMiddleware]] = None,
    chat_history: BaseChatMessageHistory = None,
    greeting: str = None,
    system: str = None,
    **kwargs,
):
    self.middlewares = middlewares
    self.operator = operator
    self._chat_history = chat_history or DEFAULT_CHAT_HISTORY()
    self.greeting = greeting or self.operator.greeting
    self.system = self.__set_system_prompt(system=system, **kwargs)
    self.__set_memory()

conversation property ¤

conversation: List[Message]

Returns:

Type Description
List[Message]

a list of messages exchanged in the chat. Keep in mind this list does not include the first system message.
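For example, after a few turns (a sketch, reusing the bot instance from the snippet at the top of the page; Message exposes role and message fields, as seen in add_message below):

bot.send("hello")
for msg in bot.conversation:
    print(msg.role, msg.message)
# The system prompt is not in this list; read it from bot.system instead.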

llm_params property ¤

llm_params: LLMParamsType

Return the LLM parameters that are saved on the operator. These parameters are sent to the LLM when the task is executed.

Returns:

Type Description
LLMParamsType

The LLM parameters

llm_response instance-attribute ¤

llm_response: LLMResponse

The response from the LLM

llm_stream_response class-attribute instance-attribute ¤

llm_stream_response: Iterator[LLMResponse] = None

The response from the LLM when streaming

__call__ ¤

__call__(
    *,
    messages: List[Message],
    llm_params: LLMParamsType = None
) -> Any

Executes the call to the LLM, based on the messages passed as argument, and the llm_params. The llm_params are passed as a dictionary and are used to override the default llm_params of the operator; they also take priority over the params that were used to initialize the chat within the decorator.

Parameters:

Name Type Description Default
messages List[Message]

The messages to pass to the LLM.

required
llm_params LLMParamsType

The llm_params to use for the call to the LLM.

None

Returns:

Type Description
Any

The parsed response from the LLM.

Source code in src/declarai/chat.py
def __call__(
    self, *, messages: List[Message], llm_params: LLMParamsType = None
) -> Any:
    """
    Executes the call to the LLM, based on the messages passed as argument, and the llm_params.
    The llm_params are passed as a dictionary, and they are used to override the default llm_params of the operator.
    The llm_params also have priority over the params that were used to initialize the chat within the decorator.
    Args:
        messages: The messages to pass to the LLM.
        llm_params: The llm_params to use for the call to the LLM.

    Returns:
        The parsed response from the LLM.

    """
    runtime_kwargs = dict(messages=messages)
    # Order is important! We prioritize runtime llm_params, passed at call
    # time, over the params saved on the operator.
    runtime_llm_params = llm_params or self.llm_params
    if runtime_llm_params:
        runtime_kwargs["llm_params"] = runtime_llm_params

    self._call_kwargs = runtime_kwargs
    return self._exec_middlewares(runtime_kwargs)

_exec ¤

_exec(kwargs) -> Any

Executes the call to the LLM.

Parameters:

Name Type Description Default
kwargs

Keyword arguments to pass to the LLM like temperature, max_tokens, etc.

required

Returns:

Type Description
Any

The raw response from the LLM, together with the metadata.

Source code in src/declarai/chat.py
def _exec(self, kwargs) -> Any:
    """
    Executes the call to the LLM.

    Args:
        kwargs: Keyword arguments to pass to the LLM like `temperature`, `max_tokens`, etc.

    Returns:
         The raw response from the LLM, together with the metadata.
    """
    if self.operator.streaming:
        # Use the stream_handler generator if streaming is enabled
        stream = self.stream_handler(self.operator.predict(**kwargs))
        self.llm_stream_response = stream
        return self.llm_stream_response
    else:
        self.llm_response = self.operator.predict(**kwargs)
        self.add_message(self.llm_response.response, role=MessageRole.assistant)
        if self.operator.parsed_send_func:
            return self.operator.parsed_send_func.parse(self.llm_response.response)
        return self.llm_response.response

add_message ¤

add_message(message: str, role: MessageRole) -> None

Interface to add a message to the chat history.

Parameters:

Name Type Description Default
message str

The message to add to the chat history.

required
role MessageRole

The role of the message (assistant, user, system, etc.)

required

Source code in src/declarai/chat.py
def add_message(self, message: str, role: MessageRole) -> None:
    """
    Interface to add a message to the chat history.
    Args:
        message (str): The message to add to the chat history.
        role (MessageRole): The role of the message (assistant, user, system, etc.)
    """
    self._chat_history.add_message(Message(message=message, role=role))

compile ¤

compile(**kwargs) -> List[Message]

Compiles a list of messages to be sent to the LLM by the operator. This is done by accessing the ._chat_history.history attribute. The kwargs passed to the compile method are only used to populate the system message prompt.

Parameters:

Name Type Description Default
**kwargs

System message prompt kwargs.

{}

Returns:

Type Description
List[Message]

The compiled messages that will be sent to the LLM.

Source code in src/declarai/chat.py
def compile(self, **kwargs) -> List[Message]:
    """
    Compiles a list of messages to be sent to the LLM by the operator.
    This is done by accessing the ._chat_history.history attribute.
    The kwargs that are passed to the compile method are only used to populate the system message prompt.
    Args:
        **kwargs: System message prompt kwargs.

    Returns: List[Message] - The compiled messages that will be sent to the LLM.

    """
    messages = kwargs.pop("messages", None) or self._chat_history.history
    compiled = self.operator.compile(messages=messages, **kwargs)
    return compiled

send ¤

send(
    message: str,
    llm_params: Union[LLMParamsType, Dict[str, Any]] = None,
    **kwargs: Union[LLMParamsType, Dict[str, Any]]
) -> Any

Interface that allows the user to send a message to the LLM. It takes a raw string as input and returns the parsed response from the LLM.

Parameters:

Name Type Description Default
message str

The raw user message to send.

required
llm_params Union[LLMParamsType, Dict[str, Any]]

LLM parameters to use for this call, overriding the operator defaults.

None
**kwargs

Additional keyword arguments forwarded to the call.

{}

Returns:

Type Description
Any

Final response from the LLM, after parsing.

Source code in src/declarai/chat.py
def send(
    self,
    message: str,
    llm_params: Union[LLMParamsType, Dict[str, Any]] = None,
    **kwargs,
) -> Any:
    """
    Interface that allows the user to send a message to the LLM. It takes a raw
    string as input and returns the parsed response from the LLM.
    Args:
        message: The raw user message to send.
        llm_params: LLM parameters to use for this call, overriding the operator defaults.
        **kwargs: Additional keyword arguments forwarded to the call.

    Returns:
        Final response from the LLM, after parsing.

    """
    self.add_message(message, role=MessageRole.user)
    return self(
        messages=self._chat_history.history, llm_params=llm_params, **kwargs
    )
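For example, pinning the temperature for a single turn (the override applies to this call only; the operator's saved params are untouched):

answer = bot.send("Summarize our conversation so far", llm_params={"temperature": 0})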

stream_cleanup ¤

stream_cleanup(last_chunk: LLMResponse) -> None

Add the combined response to the database and run any other cleanup logic.

Source code in src/declarai/chat.py
def stream_cleanup(self, last_chunk: LLMResponse) -> None:
    """
    Add the combined response to the database and run any other cleanup logic.
    """
    super().stream_cleanup(last_chunk)
    self.add_message(last_chunk.response, role=MessageRole.assistant)

stream_handler ¤

stream_handler(
    stream: Iterator[LLMResponse],
) -> Iterator[LLMResponse]

A generator that yields each chunk from the stream and collects them in a buffer. After the stream is exhausted, it runs the cleanup logic.

Source code in src/declarai/_base.py
def stream_handler(self, stream: Iterator[LLMResponse]) -> Iterator[LLMResponse]:
    """
    A generator that yields each chunk from the stream and collects them in a buffer.
    After the stream is exhausted, it runs the cleanup logic.
    """
    response_buffer = []
    for chunk in stream:
        response_buffer.append(chunk)
        yield chunk

    # After the stream is exhausted, run the cleanup logic
    self.stream_cleanup(response_buffer[-1])

ChatDecorator ¤

ChatDecorator(llm: LLM)

A decorator class that receives a user-defined chat class, applies the provided parameters, and returns a Chat object.

This class provides the chat method which acts as a decorator to create a Chat object.

Parameters:

Name Type Description Default
llm LLM

Resolved LLM object.

required

Attributes:

Name Type Description
llm LLM

Resolved LLM object.

Methods:

Name Description
chat

Decorator method that converts a class into a chat task class.

Source code in src/declarai/chat.py
def __init__(self, llm: LLM):
    self.llm = llm

chat ¤

chat(
    cls: Type = None,
    *,
    middlewares: List[TaskMiddleware] = None,
    llm_params: LLMParamsType = None,
    chat_history: BaseChatMessageHistory = None,
    greeting: str = None,
    system: str = None,
    streaming: bool = None
)

Decorator method that converts a class into a chat task class.

Parameters:

Name Type Description Default
cls Type

The original class that is being decorated.

None
middlewares List[TaskMiddleware]

Middlewares to use for every iteration of the chat. Defaults to None.

None
llm_params LLMParamsType

Parameters for the LLM. Defaults to None.

None
chat_history BaseChatMessageHistory

Chat history mechanism to use. Defaults to None.

None
greeting str

Greeting message to use. Defaults to None.

None
system str

System message to use. Defaults to None.

None
streaming bool

Whether to use streaming or not. Defaults to None.

None

Returns:

Type Description
Type[Chat]

A new Chat class that inherits from the original class and has chat capabilities.

Example
 @ChatDecorator.chat(llm_params={"temperature": 0.5})
 class MyChat:
    ...

 @ChatDecorator.chat
 class MyChat:
    ...
Source code in src/declarai/chat.py
def chat(
    self,
    cls: Type = None,
    *,
    middlewares: List[TaskMiddleware] = None,
    llm_params: LLMParamsType = None,
    chat_history: BaseChatMessageHistory = None,
    greeting: str = None,
    system: str = None,
    streaming: bool = None,
):
    """
    Decorator method that converts a class into a chat task class.

    Args:
        cls (Type, optional): The original class that is being decorated.
        middlewares (List[TaskMiddleware], optional): Middlewares to use for every iteration of the chat.
         Defaults to None.
        llm_params (LLMParamsType, optional): Parameters for the LLM. Defaults to None.
        chat_history (BaseChatMessageHistory, optional): Chat history mechanism to use. Defaults to None.
        greeting (str, optional): Greeting message to use. Defaults to None.
        system (str, optional): System message to use. Defaults to None.
        streaming (bool, optional): Whether to use streaming or not. Defaults to None.

    Returns:
        (Type[Chat]): A new Chat class that inherits from the original class and has chat capabilities.


    Example:
        ```python
         @ChatDecorator.chat(llm_params={"temperature": 0.5})
         class MyChat:
            ...

         @ChatDecorator.chat
         class MyChat:
            ...
        ```

    """
    operator_type = resolve_operator(self.llm, operator_type="chat")

    def wrap(cls) -> Type[Chat]:
        non_private_methods = {
            method_name: method
            for method_name, method in cls.__dict__.items()
            if not method_name.startswith("__") and callable(method)
        }
        if "send" in non_private_methods:
            non_private_methods.pop("send")

        parsed_cls = PythonParser(cls)

        _decorator_kwargs = dict(
            operator=operator_type(
                llm=self.llm,
                parsed=parsed_cls,
                llm_params=llm_params,
                streaming=streaming,
            ),
            middlewares=middlewares,
            chat_history=chat_history,
            greeting=greeting,
            system=system,
        )

        new_chat: Type[Chat] = type(cls.__name__, (Chat,), {})  # noqa
        new_chat.__name__ = cls.__name__
        new_chat._init_args = ()  # any positional arguments
        new_chat._init_kwargs = _decorator_kwargs
        for method_name, method in non_private_methods.items():
            if isinstance(method, Task):
                _method = method
            else:
                _method = partial(method, new_chat)
            setattr(new_chat, method_name, _method)
        return new_chat

    if cls is None:
        return wrap
    return wrap(cls)

ChatMeta ¤

Bases: type

Metaclass for Chat classes. Used to enable users to receive the chat instance when using the @chat decorator, and still be able to "instantiate" the class.

Methods:

Name Description
__call__

Initialize the Chat instance for the second time, after the decorator has been applied. The parameters are the same as the ones used for the decorator.

__call__ ¤

__call__(*args, **kwargs)

Initialize the Chat instance for the second time, after the decorator has been applied. The parameters are the same as the ones used for the decorator, but the ones passed at class initialization take precedence. Returns: Chat instance

Source code in src/declarai/chat.py
def __call__(cls, *args, **kwargs):
    """
    Initialize the Chat instance for the second time, after the decorator has been applied. The parameters are
    the same as the ones used for the decorator, but the ones used for the class initialization take precedence.
    Returns: Chat instance

    """
    # Determine which arguments to use for initialization
    final_args = args if args else cls._init_args
    final_kwargs = {**cls._init_kwargs, **kwargs}

    # Create and initialize the instance
    instance = super().__call__(*final_args, **final_kwargs)

    # Always set the __name__ attribute on the instance
    instance.__name__ = cls.__name__

    return instance
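A sketch of this precedence in practice (decorator arguments seed the instance; keyword arguments passed at instantiation win):

@gpt_35.experimental.chat(greeting="Hi, I am your SQL bot!")
class SQLBot:
    """You are a SQL assistant."""

bot = SQLBot(greeting="Welcome back!")  # this greeting takes precedence over the decorator's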