Model Access & Credentials
To integrate with your preferred ML/LLM provider, we’ll need the following details:
If using OpenAI:
- OpenAI API Key
- Preferred set of OpenAI Models
- Endpoint URL (if unknown, we’ll use the default — typically https://api.openai.com/v1/...)
Docs: OpenAI API Quickstart
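To show how the three details above fit together, here is a minimal sketch that assembles (but does not send) an OpenAI chat-completions request. The key, model name, and prompt are placeholder assumptions — substitute the values you share with us; the key is passed as a Bearer token in the Authorization header.

```python
import json
import urllib.request

# Placeholder values -- substitute the key and model you provide.
OPENAI_API_KEY = "sk-..."                                 # your OpenAI API key
ENDPOINT = "https://api.openai.com/v1/chat/completions"   # a typical default endpoint
MODEL = "gpt-4o"                                          # one of your preferred models

def build_request(prompt: str) -> urllib.request.Request:
    """Assemble (but do not send) a chat-completions request."""
    payload = {"model": MODEL, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {OPENAI_API_KEY}",  # key goes in a Bearer header
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Hello!")
```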
If using Azure OpenAI:
- Azure Resource Name (Optional)
- Deployment Name
- Azure API Key
- Azure Endpoint URL (e.g., https://<your-resource>.openai.azure.com/)
Docs: Azure OpenAI Quickstart
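The Azure details map onto a request a little differently: the deployment name is part of the URL path, and the key is sent in an api-key header rather than a Bearer token. The sketch below assembles (but does not send) such a request; the resource name, deployment name, and api-version value are hypothetical placeholders.

```python
import json
import urllib.request

# Hypothetical placeholder values -- substitute your resource, deployment,
# key, and api-version.
AZURE_ENDPOINT = "https://my-resource.openai.azure.com"  # built from the resource name
DEPLOYMENT = "my-gpt4-deployment"                        # your deployment name
AZURE_API_KEY = "..."                                    # your Azure API key
API_VERSION = "2024-02-01"                               # an example api-version

def build_azure_request(prompt: str) -> urllib.request.Request:
    """Assemble (but do not send) an Azure OpenAI chat-completions request.
    Azure routes by deployment name in the path and authenticates with an
    api-key header instead of a Bearer token."""
    url = (f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
           f"/chat/completions?api-version={API_VERSION}")
    payload = {"messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"api-key": AZURE_API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

azure_req = build_azure_request("Hello!")
```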
If using Custom Models (please reach out to discuss):
We’ll need to understand the following:
- Which inference API does the model expose? (e.g., OpenAI-compatible, Ollama, etc.)
- Where is it hosted, and how do we authenticate?
- Does it require any additional parameters at inference time (e.g., custom headers, query params)?
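To make the three questions above concrete, here is a sketch of how those answers would be used for an OpenAI-compatible custom endpoint. Every value is hypothetical — the base URL, the custom auth header, and the extra query parameter stand in for whatever your deployment actually requires; the request is assembled but not sent.

```python
import json
import urllib.parse
import urllib.request

# All values below are hypothetical -- they illustrate the details we ask about:
# where the model is hosted, how we authenticate, and any extra parameters.
BASE_URL = "https://models.example.internal/v1/chat/completions"
EXTRA_HEADERS = {"X-Team-Token": "..."}   # e.g., a custom auth header
EXTRA_QUERY = {"tenant": "acme"}          # e.g., a required query parameter

def build_custom_request(prompt: str) -> urllib.request.Request:
    """Assemble (but do not send) a request to an OpenAI-compatible custom
    endpoint, attaching the additional headers and query params it requires."""
    url = BASE_URL + "?" + urllib.parse.urlencode(EXTRA_QUERY)
    payload = {"model": "custom-model",  # placeholder model identifier
               "messages": [{"role": "user", "content": prompt}]}
    headers = {"Content-Type": "application/json", **EXTRA_HEADERS}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

custom_req = build_custom_request("Hello!")
```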