POST /projects
Request:
- name: (required) project name
- embeddings: (optional) embeddings name to be used in the project [RAG]
- llm: LLM name to be used in the project
- type: project type (rag, vision, router, inference)
- vectorstore: vectorstore name to be used [RAG]

Response:
- project: project name
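A minimal sketch of creating a project with Python's requests library. The base URL and the API-key Authorization header are assumptions about the deployment, and the llm, embeddings and vectorstore names are placeholders; the field names come from the parameter list above.

```python
import requests

BASE_URL = "http://localhost:8000"                    # assumed deployment URL
HEADERS = {"Authorization": "Bearer <your_api_key>"}  # assumed auth scheme

# Create a RAG project; the llm, embeddings and vectorstore names are
# placeholders and must match entries already configured on the server.
payload = {
    "name": "support-docs",
    "type": "rag",
    "llm": "openai_gpt4",
    "embeddings": "openai_ada002",
    "vectorstore": "chroma",
}
resp = requests.post(f"{BASE_URL}/projects", json=payload, headers=HEADERS)
resp.raise_for_status()
print(resp.json()["project"])  # name of the created project
```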
GET /projects

Response:
- projects: array with all projects, following the same structure as Get a Project
GET /projects/{projectname}

Request:
- projectname: (required) Project name

Response:
- name: Project name, normalized, used to identify the project
- human_name: Human-friendly project name, unnormalized
- human_description: Human-friendly project description
- type: Project type
- llm: LLM to be used by the project
- chunks: How many chunks were ingested [RAG]
- embeddings: Embeddings model to be used [RAG]
- k: K value [RAG]
- score: Score cutoff [RAG]
- vectorstore: Vectorstore name [RAG]
- system: System message
- censorship: Censorship message to be sent when the retrieval process fails [RAG]
- llm_rerank: LLM rerank [RAG]
- colbert_rerank: ColBERT rerank [RAG]
- tools: Tools to be used in the question [AGENT]
- tables: Tables to be used in the question [RAGSQL]
- connection: Connection string to the database [RAGSQL]
- entrances: Array with routes [ROUTER]
- llm_type: LLM type
- llm_privacy: LLM privacy
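A short sketch of listing projects (GET /projects) and fetching a single one by name, under the same assumed base URL and Authorization header as the previous example; "support-docs" is a placeholder project name.

```python
import requests

BASE_URL = "http://localhost:8000"                    # assumed deployment URL
HEADERS = {"Authorization": "Bearer <your_api_key>"}  # assumed auth scheme

# List every project, then fetch the details of one of them by name.
projects = requests.get(f"{BASE_URL}/projects", headers=HEADERS).json()["projects"]
print([p["name"] for p in projects])

detail = requests.get(f"{BASE_URL}/projects/support-docs", headers=HEADERS).json()
print(detail["llm"], detail["chunks"], detail["k"], detail["score"])
```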
POST /users

Request:
- username: (required) username for the new user
- password: (required) password for the new user
- is_admin: (optional) boolean to specify if the user is an admin
- is_private: (optional) boolean to specify if the user is locked to local/private LLMs only

Response:
- username: username of the created user
- is_admin: whether the user is an admin
- is_private: whether the user is locked to local/private LLMs only
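A sketch of creating a user, with the same assumed base URL and API-key header; the use of an admin credential here is an assumption about the deployment's access rules.

```python
import requests

BASE_URL = "http://localhost:8000"                     # assumed deployment URL
HEADERS = {"Authorization": "Bearer <admin_api_key>"}  # assumed auth scheme

# Create a regular (non-admin) user restricted to local/private LLMs.
payload = {
    "username": "alice",
    "password": "change-me",
    "is_admin": False,
    "is_private": True,
}
resp = requests.post(f"{BASE_URL}/users", json=payload, headers=HEADERS)
resp.raise_for_status()
print(resp.json())
```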
GET /users/{username}

Request:
- username: (required) username

Response:
- username: username
- is_admin: whether the user is an admin
- is_private: whether the user is locked to local/private LLMs only
- projects: array with projects the user has access to
POST /users/{username}/apikey

Request:
- username: (required) username

Response:
- api_key: user's API key; keep it safe, as a lost key cannot be recovered and a new one will need to be generated
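A sketch of generating an API key for a user, under the same assumptions as above; because a lost key can only be replaced, the sketch stores the returned value immediately.

```python
import requests

BASE_URL = "http://localhost:8000"                     # assumed deployment URL
HEADERS = {"Authorization": "Bearer <admin_api_key>"}  # assumed auth scheme

# Generate an API key for the user "alice" and keep it somewhere safe;
# if it is lost, a new one will have to be generated.
resp = requests.post(f"{BASE_URL}/users/alice/apikey", headers=HEADERS)
resp.raise_for_status()
api_key = resp.json()["api_key"]
```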
GET /users

Response:
- users: array with all users, following the same structure as Get a User
GET /llms/{llmname}

Request:
- llmname: (required) LLM name

Response:
- name: LLM name
- class_name: Class name
- options: String containing the options used to initialize the LLM
- privacy: public or private (local)
- description: Description
- type: qa, chat, vision
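A sketch of inspecting a single LLM definition, with the same assumed base URL and header; "llama3" is a placeholder LLM name.

```python
import requests

BASE_URL = "http://localhost:8000"                    # assumed deployment URL
HEADERS = {"Authorization": "Bearer <your_api_key>"}  # assumed auth scheme

# Fetch one LLM definition by name ("llama3" is a placeholder).
llm = requests.get(f"{BASE_URL}/llms/llama3", headers=HEADERS).json()
print(llm["class_name"], llm["privacy"], llm["type"])
```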
GET /llms

Response:
- llms: array with all LLMs, following the same structure as Get an LLM
POST /projects/{projectName}/question

Request:
- projectName: (required) project name
- question: (required) user message
- system: (optional) system message; if not provided, the project's system message will be used
- stream: (optional) boolean to specify if this completion is streamed

Advanced parameters (optional):
- score: (optional) cutoff score; if not provided, the project's cutoff score will be used [RAG]
- k: (optional) k value; if not provided, the project's k value will be used [RAG]
- colbert_rerank: (optional) boolean to specify if ColBERT reranking is enabled [RAG]
- llm_rerank: (optional) boolean to specify if LLM reranking is enabled [RAG]
- eval: (optional) RAG evaluation [RAG]
- tables: (optional) tables to be used in the question [RAGSQL]
- negative: (optional) negative prompt [VISION]
- image: (optional) image [VISION]

Response:
- question: user question
- answer: answer
- sources: array with sources used to generate the answer
- type: type (inference, rag, vision, ragsql, etc.)
- tokens.input: number of tokens used in the question
- tokens.output: number of tokens used in the answer
- image: image [VISION] [ROUTER]
- evaluation.reason: if eval was provided, the reason for the score [RAG]
- evaluation.score: if eval was provided, the score [RAG]
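A sketch of asking a question against a RAG project while overriding a few of the advanced retrieval parameters for a single call; the base URL, Authorization header and the "support-docs" project name are assumptions.

```python
import requests

BASE_URL = "http://localhost:8000"                    # assumed deployment URL
HEADERS = {"Authorization": "Bearer <your_api_key>"}  # assumed auth scheme

# Ask a question against the (placeholder) "support-docs" RAG project,
# overriding k and the score cutoff and enabling LLM reranking for this call.
payload = {
    "question": "How do I reset my password?",
    "k": 4,
    "score": 0.3,
    "llm_rerank": True,
}
resp = requests.post(f"{BASE_URL}/projects/support-docs/question",
                     json=payload, headers=HEADERS)
resp.raise_for_status()
data = resp.json()
print(data["answer"])
print(data["sources"])  # sources used to generate the answer
```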
POST /image/{generator}/generate

Request:
- prompt: (required) prompt used to generate the image

Response:
- image: generated image
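A sketch of generating an image; the generator name "dalle" is a placeholder, and the available generator names can be listed with GET /image, documented next.

```python
import requests

BASE_URL = "http://localhost:8000"                    # assumed deployment URL
HEADERS = {"Authorization": "Bearer <your_api_key>"}  # assumed auth scheme

# Generate an image with a named generator ("dalle" is a placeholder).
payload = {"prompt": "a lighthouse at sunset, watercolor style"}
resp = requests.post(f"{BASE_URL}/image/dalle/generate",
                     json=payload, headers=HEADERS)
resp.raise_for_status()
image_data = resp.json()["image"]  # generated image as returned by the API
```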
GET /image

Response:
- generators: array with all image generator names