LLM Call

The llm_call function allows you to call any LLM provider configured in Otoroshi from a workflow.

  • Function name: extensions.com.cloud-apim.llm-extension.llm_call

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `provider` | string | yes | The LLM provider id |
| `openai_format` | boolean | no | Return the response in the OpenAI format (default: `true`) |
| `payload` | object | yes | The payload object |
| `payload.messages` | array | yes | The chat messages array |
| `payload.tool_functions` | array | no | List of tool function ids to use |
| `payload.mcp_connectors` | array | no | List of MCP connector ids to use |
| `payload.wasm_tools` | array | no | List of WASM tool ids to use |

Output

Returns the LLM response. When `openai_format` is `true` (the default), the response follows the OpenAI chat completion format; otherwise, the raw response from the extension is returned.
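For orientation, an OpenAI-format chat completion response has roughly the following shape. This is an illustrative sketch, not an exhaustive schema: field values are placeholders, and the exact set of fields depends on the provider.

```json
{
  "id": "chatcmpl-xxxxx",
  "object": "chat.completion",
  "model": "model-name",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 10,
    "completion_tokens": 9,
    "total_tokens": 19
  }
}
```

The generated text is typically read from `choices[0].message.content`.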

Example

```json
{
  "kind": "call",
  "function": "extensions.com.cloud-apim.llm-extension.llm_call",
  "args": {
    "provider": "provider_xxxxx",
    "openai_format": true,
    "payload": {
      "messages": [
        {
          "role": "user",
          "content": "Hello, world!"
        }
      ]
    }
  },
  "result": "llm_response"
}
```

Example with tool functions

```json
{
  "kind": "call",
  "function": "extensions.com.cloud-apim.llm-extension.llm_call",
  "args": {
    "provider": "provider_xxxxx",
    "payload": {
      "messages": [
        {
          "role": "user",
          "content": "What is the weather in Paris?"
        }
      ],
      "tool_functions": ["tool-function_xxxxx"],
      "mcp_connectors": ["mcp-connector_xxxxx"]
    }
  },
  "result": "llm_response"
}
```
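Example with WASM tools

The `payload.wasm_tools` parameter follows the same pattern as the other tool lists. A hypothetical call (the `wasm-tool_xxxxx` id is a placeholder for a real WASM tool id):

```json
{
  "kind": "call",
  "function": "extensions.com.cloud-apim.llm-extension.llm_call",
  "args": {
    "provider": "provider_xxxxx",
    "payload": {
      "messages": [
        {
          "role": "user",
          "content": "Summarize the attached report"
        }
      ],
      "wasm_tools": ["wasm-tool_xxxxx"]
    }
  },
  "result": "llm_response"
}
```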