task_operator

Task implementation for the OpenAI operator.

Classes:

- AzureOpenAITaskOperator: Task implementation for the OpenAI operator that uses Azure as the LLM provider.
- OpenAITaskOperator: Task implementation for the OpenAI operator. This is a child of the BaseOperator class; see BaseOperator for further documentation.

AzureOpenAITaskOperator

Bases: OpenAITaskOperator

Task implementation for the OpenAI operator that uses Azure as the LLM provider.

Attributes:

- llm (AzureOpenAILLM)

Methods:

- _compile_input_placeholder: Creates a placeholder for the input of the function.
- compile: Implements the compile method of the BaseOperator class.
- compile_template: Unique compilation method for the OpenAITaskOperator class.
- parse_output: Parses the raw output from the LLM into the desired format that was set in the parsed object.
- predict: Executes prediction using the LLM.

Attributes:

- streaming (bool): Whether the operator is streaming.

streaming property

streaming: bool

Returns whether the operator is streaming.

_compile_input_placeholder

_compile_input_placeholder() -> str

Creates a placeholder for the input of the function. The input format is based on the function input schema.

Example

For example, a function signature of:

    def foo(a: int, b: str, c: float = 1.0):

will result in the following placeholder:

    Inputs:
    a: {a}
    b: {b}
    c: {c}

Source code in src/declarai/operators/openai_operators/task_operator.py
def _compile_input_placeholder(self) -> str:
    """
    Creates a placeholder for the input of the function.
    The input format is based on the function input schema.

    !!! example
        For example, a function signature of:
            ```py
            def foo(a: int, b: str, c: float = 1.0):
            ```

        will result in the following placeholder:
        ```md
            Inputs:
            a: {a}
            b: {b}
            c: {c}
        ```
    """
    inputs = ""

    if not self.parsed.signature_kwargs.keys():
        return inputs

    for i, param in enumerate(self.parsed.signature_kwargs.keys()):
        if i == 0:
            inputs += INPUT_LINE_TEMPLATE.format(param=param)
            continue
        inputs += NEW_LINE_INPUT_LINE_TEMPLATE.format(param=param)

    return INPUTS_TEMPLATE.format(inputs=inputs)
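The placeholder logic above can be sketched standalone. The three template constants below are assumed stand-ins chosen so the output matches the docstring's example; they are not the library's actual definitions:

```python
# Assumed stand-ins for the library's template constants (illustrative only)
INPUT_LINE_TEMPLATE = "{param}: {{{param}}}"
NEW_LINE_INPUT_LINE_TEMPLATE = "\n{param}: {{{param}}}"
INPUTS_TEMPLATE = "Inputs:\n{inputs}"


def compile_input_placeholder(params: list) -> str:
    """Build an 'Inputs:' block with one `name: {name}` line per parameter."""
    inputs = ""
    if not params:
        return inputs
    for i, param in enumerate(params):
        if i == 0:
            inputs += INPUT_LINE_TEMPLATE.format(param=param)
        else:
            inputs += NEW_LINE_INPUT_LINE_TEMPLATE.format(param=param)
    return INPUTS_TEMPLATE.format(inputs=inputs)


print(compile_input_placeholder(["a", "b", "c"]))
# Inputs:
# a: {a}
# b: {b}
# c: {c}
```

The doubled braces in the templates escape `str.format`, so the curly-brace placeholders (`{a}`, `{b}`, ...) survive into the prompt and are filled in later by `compile`.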

compile

compile(**kwargs) -> CompiledTemplate

Implements the compile method of the BaseOperator class.

Parameters:

- **kwargs: Keyword arguments used to format the prompt placeholders.

Returns:

- CompiledTemplate (Dict[str, List[Message]]): A dictionary containing a list of messages.

Source code in src/declarai/operators/operator.py
def compile(self, **kwargs) -> CompiledTemplate:
    """
    Implements the compile method of the BaseOperator class.
    Args:
        **kwargs:

    Returns:
        Dict[str, List[Message]]: A dictionary containing a list of messages.

    """
    template = self.compile_template()
    if kwargs:
        template[-1].message = format_prompt_msg(
            _string=template[-1].message, **kwargs
        )
    return {"messages": template}
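As a sketch of the behavior above: when keyword arguments are supplied, only the last message in the template has its placeholders filled. The `Message` dataclass and `format_prompt_msg` below are simplified stand-ins for the library's own types, not the real implementations:

```python
from dataclasses import dataclass


@dataclass
class Message:
    """Simplified stand-in for the library's Message class."""
    message: str
    role: str


def format_prompt_msg(_string: str, **kwargs) -> str:
    # Simplified stand-in: plain str.format over the placeholders
    return _string.format(**kwargs)


template = [
    Message("Return only JSON.", "system"),
    Message("Summarize the text.\nInputs:\ntext: {text}", "user"),
]

# As in `compile`: only the final (user) message gets its placeholders filled
template[-1].message = format_prompt_msg(_string=template[-1].message, text="hello")
compiled = {"messages": template}
```

After this, `compiled["messages"][-1].message` reads `"Summarize the text.\nInputs:\ntext: hello"`, while the system message is untouched.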

compile_template

compile_template() -> CompiledTemplate

Unique compilation method for the OpenAITaskOperator class. Uses the InstructFunctionTemplate and StructuredOutputInstructionPrompt templates to create a message, and the _compile_input_placeholder method to create a placeholder for the input of the function.

Returns:

- Dict[str, List[Message]]: A dictionary containing a list of messages.

Source code in src/declarai/operators/openai_operators/task_operator.py
def compile_template(self) -> CompiledTemplate:
    """
    Unique compilation method for the OpenAITaskOperator class.
    Uses the InstructFunctionTemplate and StructuredOutputInstructionPrompt templates to create a message.
    And the _compile_input_placeholder method to create a placeholder for the input of the function.
    Returns:
        Dict[str, List[Message]]: A dictionary containing a list of messages.

    """
    instruction_template = InstructFunctionTemplate
    structured_template = StructuredOutputInstructionPrompt
    output_schema = self._compile_output_prompt(structured_template)

    messages = []
    if output_schema:
        messages.append(Message(message=output_schema, role=MessageRole.system))

    if not can_be_jinja(self.parsed.docstring_freeform):
        instruction_message = instruction_template.format(
            input_instructions=self.parsed.docstring_freeform,
            input_placeholder=self._compile_input_placeholder(),
        )
    else:
        instruction_message = self.parsed.docstring_freeform

    messages.append(Message(message=instruction_message, role=MessageRole.user))
    return messages

parse_output

parse_output(output: str) -> Any

Parses the raw output from the LLM into the desired format that was set in the parsed object.

Parameters:

- output: The raw LLM string output.

Returns:

- Any: The parsed output.

Source code in src/declarai/operators/operator.py
def parse_output(self, output: str) -> Any:
    """
    Parses the raw output from the LLM into the desired format that was set in the parsed object.
    Args:
        output: llm string output

    Returns:
        Any parsed output
    """
    return self.parsed.parse(output)

predict

predict(
    *,
    llm_params: Optional[LLMParamsType] = None,
    **kwargs: object
) -> Union[LLMResponse, Iterator[LLMResponse]]

Executes prediction using the LLM. It first compiles the prompts using the compile method, then executes the LLM with the compiled prompts and the llm_params.

Parameters:

- llm_params: Parameters passed at runtime. If provided, they override the ones provided during initialization.
- **kwargs: Keyword arguments passed to the compile method, used to format the prompt placeholders.

Returns:

- Union[LLMResponse, Iterator[LLMResponse]]: The response from the LLM.

Source code in src/declarai/operators/operator.py
def predict(
    self, *, llm_params: Optional[LLMParamsType] = None, **kwargs: object
) -> Union[LLMResponse, Iterator[LLMResponse]]:
    """
    Executes prediction using the LLM.
    It first compiles the prompts using the `compile` method, and then executes the LLM with the compiled prompts and the llm_params.
    Args:
        llm_params: The parameters that are passed during runtime. If provided, they will override the ones provided during initialization.
        **kwargs: The keyword arguments to pass to the `compile` method. Used to format the prompts placeholders.

    Returns:
        The response from the LLM
    """
    # Order is important: params provided at execution time override those
    # provided during initialization, and `stream` must be applied last.
    llm_params = llm_params or self.llm_params
    if self.streaming is not None:
        llm_params["stream"] = self.streaming
    return self.llm.predict(**self.compile(**kwargs), **llm_params)
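The parameter-precedence logic in `predict` can be illustrated with plain dictionaries. Note that, as written, the override is all-or-nothing: a runtime llm_params dict replaces the init-time dict entirely rather than merging with it:

```python
# Illustration of parameter precedence in `predict` (plain dicts only).
init_params = {"temperature": 0.2, "max_tokens": 100}  # set at initialization
runtime_params = {"temperature": 0.9}                  # passed to predict()

# `llm_params or self.llm_params`: runtime params replace init params wholesale
llm_params = runtime_params or init_params
streaming = True
if streaming is not None:
    llm_params["stream"] = streaming  # `stream` is applied last
print(llm_params)  # {'temperature': 0.9, 'stream': True}
```

Because the override is wholesale, the `max_tokens` value from initialization is dropped here rather than merged in.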

OpenAITaskOperator

Bases: BaseOperator

Task implementation for the OpenAI operator. This is a child of the BaseOperator class; see BaseOperator for further documentation. Implements the compile method, which compiles a parsed function into a message, and uses the OpenAILLM to generate a response based on the given template.

Attributes:

- llm (OpenAILLM)

Methods:

- _compile_input_placeholder: Creates a placeholder for the input of the function.
- compile: Implements the compile method of the BaseOperator class.
- compile_template: Unique compilation method for the OpenAITaskOperator class.
- parse_output: Parses the raw output from the LLM into the desired format that was set in the parsed object.
- predict: Executes prediction using the LLM.

Attributes:

- streaming (bool): Whether the operator is streaming.

streaming property

streaming: bool

Returns whether the operator is streaming.

_compile_input_placeholder

_compile_input_placeholder() -> str

Creates a placeholder for the input of the function. The input format is based on the function input schema.

Example

For example, a function signature of:

    def foo(a: int, b: str, c: float = 1.0):

will result in the following placeholder:

    Inputs:
    a: {a}
    b: {b}
    c: {c}

Source code in src/declarai/operators/openai_operators/task_operator.py
def _compile_input_placeholder(self) -> str:
    """
    Creates a placeholder for the input of the function.
    The input format is based on the function input schema.

    !!! example
        For example, a function signature of:
            ```py
            def foo(a: int, b: str, c: float = 1.0):
            ```

        will result in the following placeholder:
        ```md
            Inputs:
            a: {a}
            b: {b}
            c: {c}
        ```
    """
    inputs = ""

    if not self.parsed.signature_kwargs.keys():
        return inputs

    for i, param in enumerate(self.parsed.signature_kwargs.keys()):
        if i == 0:
            inputs += INPUT_LINE_TEMPLATE.format(param=param)
            continue
        inputs += NEW_LINE_INPUT_LINE_TEMPLATE.format(param=param)

    return INPUTS_TEMPLATE.format(inputs=inputs)

compile

compile(**kwargs) -> CompiledTemplate

Implements the compile method of the BaseOperator class.

Parameters:

- **kwargs: Keyword arguments used to format the prompt placeholders.

Returns:

- CompiledTemplate (Dict[str, List[Message]]): A dictionary containing a list of messages.

Source code in src/declarai/operators/operator.py
def compile(self, **kwargs) -> CompiledTemplate:
    """
    Implements the compile method of the BaseOperator class.
    Args:
        **kwargs:

    Returns:
        Dict[str, List[Message]]: A dictionary containing a list of messages.

    """
    template = self.compile_template()
    if kwargs:
        template[-1].message = format_prompt_msg(
            _string=template[-1].message, **kwargs
        )
    return {"messages": template}

compile_template

compile_template() -> CompiledTemplate

Unique compilation method for the OpenAITaskOperator class. Uses the InstructFunctionTemplate and StructuredOutputInstructionPrompt templates to create a message, and the _compile_input_placeholder method to create a placeholder for the input of the function.

Returns:

- Dict[str, List[Message]]: A dictionary containing a list of messages.

Source code in src/declarai/operators/openai_operators/task_operator.py
def compile_template(self) -> CompiledTemplate:
    """
    Unique compilation method for the OpenAITaskOperator class.
    Uses the InstructFunctionTemplate and StructuredOutputInstructionPrompt templates to create a message.
    And the _compile_input_placeholder method to create a placeholder for the input of the function.
    Returns:
        Dict[str, List[Message]]: A dictionary containing a list of messages.

    """
    instruction_template = InstructFunctionTemplate
    structured_template = StructuredOutputInstructionPrompt
    output_schema = self._compile_output_prompt(structured_template)

    messages = []
    if output_schema:
        messages.append(Message(message=output_schema, role=MessageRole.system))

    if not can_be_jinja(self.parsed.docstring_freeform):
        instruction_message = instruction_template.format(
            input_instructions=self.parsed.docstring_freeform,
            input_placeholder=self._compile_input_placeholder(),
        )
    else:
        instruction_message = self.parsed.docstring_freeform

    messages.append(Message(message=instruction_message, role=MessageRole.user))
    return messages
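The message assembly above can be sketched with stand-in templates. The two template strings below are illustrative assumptions; the library's actual InstructFunctionTemplate and StructuredOutputInstructionPrompt differ:

```python
# Assumed stand-ins for the library's prompt templates (illustrative only)
STRUCTURED_OUTPUT_PROMPT = "Answer only with JSON matching: {output_schema}"
INSTRUCT_FUNCTION_TEMPLATE = "{input_instructions}\n{input_placeholder}"


def compile_template(docstring: str, output_schema: str, input_placeholder: str):
    """Assemble the message list: optional system message, then user message."""
    messages = []
    if output_schema:
        messages.append({
            "role": "system",
            "message": STRUCTURED_OUTPUT_PROMPT.format(output_schema=output_schema),
        })
    messages.append({
        "role": "user",
        "message": INSTRUCT_FUNCTION_TEMPLATE.format(
            input_instructions=docstring,
            input_placeholder=input_placeholder,
        ),
    })
    return messages


msgs = compile_template(
    "Summarize the text.",
    '{"summary": "string"}',
    "Inputs:\ntext: {text}",
)
```

This mirrors the structure of the real method: the output-schema instruction becomes a system message only when a schema exists, and the docstring plus input placeholder become the user message. The `{text}` placeholder is deliberately left unfilled; `compile` fills it later.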

parse_output

parse_output(output: str) -> Any

Parses the raw output from the LLM into the desired format that was set in the parsed object.

Parameters:

- output: The raw LLM string output.

Returns:

- Any: The parsed output.

Source code in src/declarai/operators/operator.py
def parse_output(self, output: str) -> Any:
    """
    Parses the raw output from the LLM into the desired format that was set in the parsed object.
    Args:
        output: llm string output

    Returns:
        Any parsed output
    """
    return self.parsed.parse(output)
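Conceptually, `parse_output` delegates to the parsed object, which knows the task's declared return type. A minimal sketch, assuming JSON output for illustration (the library's actual parsing logic may differ):

```python
import json


def parse_output(output: str, return_type):
    """Convert a raw LLM string to the declared return type (JSON assumed)."""
    data = json.loads(output)
    # Coerce only when the decoded value is not already the declared type
    return data if isinstance(data, return_type) else return_type(data)


result = parse_output('{"summary": "short"}', dict)
print(result["summary"])  # short
```

The point is only the division of labor: `predict` returns raw text, and `parse_output` is the single place where that text becomes a typed Python value.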

predict

predict(
    *,
    llm_params: Optional[LLMParamsType] = None,
    **kwargs: object
) -> Union[LLMResponse, Iterator[LLMResponse]]

Executes prediction using the LLM. It first compiles the prompts using the compile method, then executes the LLM with the compiled prompts and the llm_params.

Parameters:

- llm_params: Parameters passed at runtime. If provided, they override the ones provided during initialization.
- **kwargs: Keyword arguments passed to the compile method, used to format the prompt placeholders.

Returns:

- Union[LLMResponse, Iterator[LLMResponse]]: The response from the LLM.

Source code in src/declarai/operators/operator.py
def predict(
    self, *, llm_params: Optional[LLMParamsType] = None, **kwargs: object
) -> Union[LLMResponse, Iterator[LLMResponse]]:
    """
    Executes prediction using the LLM.
    It first compiles the prompts using the `compile` method, and then executes the LLM with the compiled prompts and the llm_params.
    Args:
        llm_params: The parameters that are passed during runtime. If provided, they will override the ones provided during initialization.
        **kwargs: The keyword arguments to pass to the `compile` method. Used to format the prompts placeholders.

    Returns:
        The response from the LLM
    """
    # Order is important: params provided at execution time override those
    # provided during initialization, and `stream` must be applied last.
    llm_params = llm_params or self.llm_params
    if self.streaming is not None:
        llm_params["stream"] = self.streaming
    return self.llm.predict(**self.compile(**kwargs), **llm_params)