chat_operator
Chat implementation of OpenAI operator.
Classes:

| Name | Description |
|---|---|
| AzureOpenAIChatOperator | Chat implementation of OpenAI operator. This is a child of the BaseChatOperator class. See the BaseChatOperator class for further documentation. |
| OpenAIChatOperator | Chat implementation of OpenAI operator. This is a child of the BaseChatOperator class. See the BaseChatOperator class for further documentation. |
AzureOpenAIChatOperator
Bases: OpenAIChatOperator
Chat implementation of OpenAI operator. This is a child of the BaseChatOperator class. See the BaseChatOperator class for further documentation.
Attributes:

| Name | Type | Description |
|---|---|---|
| llm | AzureOpenAILLM | The underlying AzureOpenAILLM instance used by the operator. |
Methods:

| Name | Description |
|---|---|
| compile_template | Compiles the system prompt. |
| parse_output | Parses the raw output from the LLM into the desired format that was set in the parsed object. |
| predict | Executes prediction using the LLM. |
Attributes:

| Name | Type | Description |
|---|---|---|
| streaming | bool | Whether the operator streams its responses. |
compile_template

    compile_template() -> Message

Compiles the system prompt.

Returns: The compiled system message.

Source code in src/declarai/operators/operator.py
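Conceptually, compile_template fills the operator's system prompt template and wraps it as a single system Message. The sketch below illustrates that idea with a simplified stand-in Message type and a plain str.format template; it is an assumption for illustration, not Declarai's actual implementation.

```python
from dataclasses import dataclass


# Hypothetical, simplified stand-in for Declarai's Message type.
@dataclass
class Message:
    message: str
    role: str


def compile_template(system_template: str, **context: str) -> Message:
    # Fill the template placeholders and wrap the result as a system message.
    return Message(message=system_template.format(**context), role="system")


msg = compile_template(
    "You are a translator. Translate to {language}.", language="French"
)
print(msg.role, "->", msg.message)
# → system -> You are a translator. Translate to French.
```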
parse_output

    parse_output(output: str) -> Any

Parses the raw output from the LLM into the desired format that was set in the parsed object.

Args:
    output: The raw LLM string output.

Returns:

| Type | Description |
|---|---|
| Any | The parsed output. |

Source code in src/declarai/operators/operator.py
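parse_output bridges the raw LLM string and the return type declared on the task. A minimal sketch of that pattern, assuming JSON-formatted structured output with a plain-text fallback; this hypothetical parse_output is an illustration, not Declarai's real parser:

```python
import json
from typing import Any


def parse_output(output: str) -> Any:
    # Try to decode structured (JSON) output; fall back to the raw string
    # when the LLM returned unstructured text.
    try:
        return json.loads(output)
    except json.JSONDecodeError:
        return output


print(parse_output('{"city": "Paris", "population": 2148000}'))
print(parse_output("just plain prose"))
```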
predict

    predict(
        *,
        llm_params: Optional[LLMParamsType] = None,
        **kwargs: object
    ) -> Union[LLMResponse, Iterator[LLMResponse]]

Executes prediction using the LLM. It first compiles the prompts using the compile method, and then executes the LLM with the compiled prompts and the llm_params.

Args:
    llm_params: The parameters passed at runtime. If provided, they override the ones provided during initialization.
    **kwargs: The keyword arguments to pass to the compile method, used to format the prompt placeholders.

Returns:

| Type | Description |
|---|---|
| Union[LLMResponse, Iterator[LLMResponse]] | The response from the LLM. |

Source code in src/declarai/operators/operator.py
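The compile-then-execute flow, including the runtime llm_params override, can be sketched with hypothetical stand-in classes. FakeLLM, Operator, and LLMResponse below are illustrations under stated assumptions, not Declarai's real types:

```python
from typing import Optional


class LLMResponse:
    # Stand-in: the real response type carries more metadata.
    def __init__(self, response: str):
        self.response = response


class FakeLLM:
    # Echoes the compiled prompt so the flow is observable without an API key.
    def predict(self, prompt: str, **llm_params) -> LLMResponse:
        return LLMResponse(f"[{llm_params.get('temperature', 0)}] {prompt}")


class Operator:
    def __init__(self, llm: FakeLLM, template: str,
                 llm_params: Optional[dict] = None):
        self.llm = llm
        self.template = template
        self.llm_params = llm_params or {}

    def compile(self, **kwargs) -> str:
        # Format the prompt placeholders from the predict-time kwargs.
        return self.template.format(**kwargs)

    def predict(self, *, llm_params: Optional[dict] = None,
                **kwargs) -> LLMResponse:
        # Runtime llm_params override the ones given at initialization.
        params = {**self.llm_params, **(llm_params or {})}
        return self.llm.predict(self.compile(**kwargs), **params)


op = Operator(FakeLLM(), "Translate '{text}' to German.",
              llm_params={"temperature": 0})
print(op.predict(llm_params={"temperature": 0.7}, text="hello").response)
# → [0.7] Translate 'hello' to German.
```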
OpenAIChatOperator
Bases: BaseChatOperator
Chat implementation of OpenAI operator. This is a child of the BaseChatOperator class. See the BaseChatOperator class for further documentation.
Attributes:

| Name | Type | Description |
|---|---|---|
| llm | OpenAILLM | The underlying OpenAILLM instance used by the operator. |
Methods:

| Name | Description |
|---|---|
| compile_template | Compiles the system prompt. |
| parse_output | Parses the raw output from the LLM into the desired format that was set in the parsed object. |
| predict | Executes prediction using the LLM. |
Attributes:

| Name | Type | Description |
|---|---|---|
| streaming | bool | Whether the operator streams its responses. |
compile_template

    compile_template() -> Message

Compiles the system prompt.

Returns: The compiled system message.

Source code in src/declarai/operators/operator.py
parse_output

    parse_output(output: str) -> Any

Parses the raw output from the LLM into the desired format that was set in the parsed object.

Args:
    output: The raw LLM string output.

Returns:

| Type | Description |
|---|---|
| Any | The parsed output. |

Source code in src/declarai/operators/operator.py
predict

    predict(
        *,
        llm_params: Optional[LLMParamsType] = None,
        **kwargs: object
    ) -> Union[LLMResponse, Iterator[LLMResponse]]

Executes prediction using the LLM. It first compiles the prompts using the compile method, and then executes the LLM with the compiled prompts and the llm_params.

Args:
    llm_params: The parameters passed at runtime. If provided, they override the ones provided during initialization.
    **kwargs: The keyword arguments to pass to the compile method, used to format the prompt placeholders.

Returns:

| Type | Description |
|---|---|
| Union[LLMResponse, Iterator[LLMResponse]] | The response from the LLM. |

Source code in src/declarai/operators/operator.py
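Because predict returns Union[LLMResponse, Iterator[LLMResponse]], callers should handle both the non-streaming and streaming cases. A small sketch, using a hypothetical minimal LLMResponse with a single response field (not Declarai's real type):

```python
from typing import Iterator, Union


class LLMResponse:
    # Stand-in: only the text payload matters for this sketch.
    def __init__(self, response: str):
        self.response = response


def consume(result: Union[LLMResponse, Iterator[LLMResponse]]) -> str:
    # Collect the full text whether the operator streamed or not.
    if isinstance(result, LLMResponse):
        return result.response
    return "".join(chunk.response for chunk in result)


print(consume(LLMResponse("hello")))                            # → hello
print(consume(iter([LLMResponse("hel"), LLMResponse("lo")])))   # → hello
```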