Jobs
Get Available LLM Providers
GET /v2/available_llm_providers

Fetch a list of all available Large Language Model (LLM) providers that can be used for jobs.

Example request:

curl --request GET \
  --url http://127.0.0.1:9550/v2/available_llm_providers

Response
200 application/json
Successfully retrieved available LLM providers. Example response body:
[
{
"allowed_message_senders": [
"<string>"
],
"api_key": "<string>",
"external_url": "<string>",
"full_identity_name": {
"full_name": "<string>",
"node_name": "<string>",
"profile_name": "<string>",
"subidentity_name": "<string>",
"subidentity_type": "Agent"
},
"id": "<string>",
"model": {
"OpenAI": {
"model_type": "<string>"
}
},
"perform_locally": true,
"storage_bucket_permissions": [
"<string>"
],
"toolkit_permissions": [
"<string>"
]
}
]
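
For reference, below is a minimal TypeScript sketch that calls this endpoint and models the response shape shown above. The interface names, the getAvailableLLMProviders helper, and the default base URL (taken from the curl example) are illustrative assumptions rather than part of an official SDK, and any authentication headers your node may require are omitted.

// Illustrative types modelled on the example response above; the names are
// assumptions for this sketch, not official SDK types.
interface ProviderIdentity {
  full_name: string;
  node_name: string;
  profile_name: string;
  subidentity_name: string;
  subidentity_type: string; // "Agent" in the example above
}

interface LLMProvider {
  id: string;
  full_identity_name: ProviderIdentity;
  // Keyed by model family (e.g. "OpenAI"), as shown in the example response.
  model: Record<string, { model_type: string }>;
  external_url: string;
  api_key: string;
  perform_locally: boolean;
  allowed_message_senders: string[];
  storage_bucket_permissions: string[];
  toolkit_permissions: string[];
}

// Fetch the providers from a locally running node (default URL taken from
// the curl example; adjust if your node listens elsewhere).
async function getAvailableLLMProviders(
  baseUrl = "http://127.0.0.1:9550"
): Promise<LLMProvider[]> {
  const res = await fetch(`${baseUrl}/v2/available_llm_providers`);
  if (!res.ok) {
    throw new Error(`available_llm_providers failed: ${res.status}`);
  }
  return (await res.json()) as LLMProvider[];
}

// Example usage: log each provider's id and model family.
getAvailableLLMProviders()
  .then((providers) => {
    for (const p of providers) {
      console.log(p.id, Object.keys(p.model).join(", "));
    }
  })
  .catch(console.error);

The returned provider ids can then be used wherever a job expects an LLM provider, for example when creating a job or changing a job's LLM provider.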