Call prompt

Call the prompt with your variables in prompt_context. You can override the params saved in the prompt and specify file URLs.

Path Params
string
required
Body Params
string | null

Optional model override (e.g., 'openai/gpt-4o'). If not provided, the model specified in the prompt will be used.

prompt_context
object | null

Variables to inject into the prompt template.

Example

{
  "my_variable": "some_value"
}
params
object | null

Optional params override. If not provided, the default params saved in the prompt will be used.

These params are passed through to the LLM request. See the LLM chat docs for more details.

Example

{
  "temperature": 0.7,
  "max_tokens": 100
}
file_urls
array of strings

Optional list of URLs to images or other files to include with the prompt for multimodal models. Not all models support files. Overrides the file URLs set on the prompt.
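A minimal sketch of assembling the request body described above from Python. The base URL, prompt identifier, token, and the "model" field name are all hypothetical placeholders (the docs describe a model-override field but do not show its name); substitute your real values. Fields left as None are omitted so the defaults saved in the prompt apply:

```python
import json

# Hypothetical values -- substitute your real API host, prompt ID, and JWT.
BASE_URL = "https://api.example.com"
PROMPT_ID = "my-prompt"
API_TOKEN = "YOUR_JWT"

def build_body(prompt_context=None, model=None, params=None, file_urls=None):
    """Assemble the call-prompt request body, dropping any field left as
    None so the prompt's saved defaults are used instead."""
    body = {
        "prompt_context": prompt_context,
        "model": model,  # assumed field name for the model override
        "params": params,
        "file_urls": file_urls,
    }
    return {k: v for k, v in body.items() if v is not None}

body = build_body(
    prompt_context={"my_variable": "some_value"},
    params={"temperature": 0.7, "max_tokens": 100},
)
print(json.dumps(body, indent=2))

# The request itself would be a POST with a Bearer JWT, per the
# Credentials section, e.g. (hypothetical path, not executed here):
# requests.post(f"{BASE_URL}/prompts/{PROMPT_ID}/call",
#               headers={"Authorization": f"Bearer {API_TOKEN}"},
#               json=body)
```

Omitting a key entirely, rather than sending null, is one reasonable way to let the prompt's saved model, params, and file URLs take effect.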

Response

Credentials: Bearer (JWT)
application/json