# LLM Response endpoint
The LLM Response endpoint plugin acts as a backend that sends a predefined prompt to an LLM and returns the raw generations. Unlike the Response generator, this plugin supports Otoroshi Expression Language in the prompt, allowing you to dynamically inject request data, route information, API key metadata, and more into the prompt.
## How it works
- A request arrives at the route
- The plugin applies Expression Language substitution to the configured prompt, injecting request context (URI, headers, query params, etc.)
- The resolved prompt is sent to the LLM provider
- The raw LLM generations are returned as JSON
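The steps above can be sketched as a small Python function. This is an illustrative sketch, not the plugin's actual code: `call_provider` is a hypothetical stand-in for the real provider client, and only the `${req.uri}` placeholder is substituted for brevity.

```python
import json

def handle(config, request_uri, call_provider):
    # 1. Apply Expression Language substitution: inject request data into the
    #    configured prompt. Only ${req.uri} is handled here for brevity.
    prompt = config["prompt"].replace("${req.uri}", request_uri)
    # 2. Send the resolved prompt to the provider referenced by `ref`.
    generations = call_provider(config["ref"], prompt)
    # 3. Return the raw generations as the JSON response body.
    return json.dumps({"generations": generations})

# Stubbed provider that echoes the prompt back as the assistant message.
def fake_provider(ref, prompt):
    return [{"message": {"role": "assistant", "content": prompt}}]

config = {"ref": "provider-entity-id", "prompt": "The user is requesting ${req.uri}."}
body = handle(config, "/docs", fake_provider)
```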
## Plugin configuration

```json
{
  "enabled": true,
  "plugin": "cp:otoroshi_plugins.com.cloud.apim.otoroshi.extensions.aigateway.plugins.LlmResponseEndpoint",
  "config": {
    "ref": "provider-entity-id",
    "prompt": "The user is requesting ${req.uri}. Generate a helpful response based on this path.",
    "prompt_ref": null,
    "context_ref": null
  }
}
```
## Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `ref` | string | `""` | LLM Provider entity ID |
| `prompt` | string | `""` | The prompt to send to the LLM; supports Expression Language |
| `prompt_ref` | string | `null` | Reference to a stored prompt entity (EL is also applied) |
| `context_ref` | string | `null` | Reference to a stored context entity (EL is also applied) |
## Response format

```json
{
  "generations": [
    {
      "message": {
        "role": "assistant",
        "content": "..."
      }
    }
  ]
}
```
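A client consuming this endpoint only needs to walk the `generations` array. A minimal Python sketch, using a hardcoded body in place of a real HTTP response:

```python
import json

# Example response body as returned by the plugin.
raw = '{"generations": [{"message": {"role": "assistant", "content": "Hello!"}}]}'
data = json.loads(raw)

# Collect the assistant text from every generation.
texts = [g["message"]["content"] for g in data["generations"]]
```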
## Expression Language examples
You can use Otoroshi's Expression Language to inject dynamic values into the prompt:
| Expression | Description |
|---|---|
| `${req.uri}` | Request URI |
| `${req.method}` | HTTP method |
| `${req.header.X-Custom}` | Request header value |
| `${req.query.param}` | Query parameter value |
| `${route.id}` | Route ID |
| `${route.name}` | Route name |
| `${apikey.id}` | API key ID |
| `${apikey.name}` | API key name |
| `${apikey.metadata.key}` | API key metadata value |
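To illustrate how such expressions resolve, the dotted path inside each `${...}` can be looked up in a nested context built from the request. This is an illustrative re-implementation, not Otoroshi's actual EL engine, and the context values below are made up:

```python
import re

# Hypothetical request context; Otoroshi builds the real one from the live request.
context = {
    "req": {"uri": "/docs", "method": "GET", "header": {"X-Custom": "abc"}},
    "route": {"id": "route_1", "name": "my-api"},
    "apikey": {"metadata": {"plan": "pro"}},
}

def resolve(expr):
    # Walk the dotted path, e.g. "apikey.metadata.plan" -> "pro".
    node = context
    for part in expr.split("."):
        node = node[part]
    return str(node)

def render(prompt):
    # Replace every ${...} placeholder with its resolved value.
    return re.sub(r"\$\{([^}]+)\}", lambda m: resolve(m.group(1)), prompt)

rendered = render("User on plan ${apikey.metadata.plan} requested ${req.method} ${req.uri}")
```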
### Example: dynamic documentation endpoint

```json
{
  "ref": "provider_openai",
  "prompt": "Generate API documentation for the endpoint ${req.method} ${req.uri}. The API belongs to ${route.name}. Return the documentation in markdown format."
}
```
### Example: personalized response based on API key

```json
{
  "ref": "provider_openai",
  "prompt": "The user ${apikey.metadata.username} (plan: ${apikey.metadata.plan}) is requesting ${req.uri}. Generate a personalized welcome message."
}
```