AI-Studio/app/MindWork AI Studio/Provider/SelfHosted
ChatRequest.cs Removed the MaxTokens parameter, because vLLM doesn't support setting the value to -1. 2025-07-24 17:23:30 +02:00
Host.cs Removed the MaxTokens parameter, because vLLM doesn't support setting the value to -1. 2025-07-24 17:23:30 +02:00
HostExtensions.cs Removed the MaxTokens parameter, because vLLM doesn't support setting the value to -1. 2025-07-24 17:23:30 +02:00
Message.cs Implement support for self-hosted and local LLMs (#20) 2024-07-03 20:31:04 +02:00
ModelsResponse.cs Implement support for self-hosted and local LLMs (#20) 2024-07-03 20:31:04 +02:00
ProviderSelfHosted.cs Removed the MaxTokens parameter, because vLLM doesn't support setting the value to -1. 2025-07-24 17:23:30 +02:00
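
The recurring commit message above points at a detail of OpenAI-compatible chat endpoints: vLLM rejects a max_tokens value of -1 (a convention some clients use to mean "no limit"), so the request omits the field entirely and lets the server apply its own default. The following is a minimal, hypothetical sketch of such a request DTO; the record and property names are assumptions for illustration and are not taken from ChatRequest.cs.

```csharp
// Hypothetical sketch of an OpenAI-compatible chat request without a
// max_tokens field. Names are illustrative, not the repository's actual types.
using System.Collections.Generic;
using System.Text.Json.Serialization;

public sealed record Message(
    // Chat role, e.g. "system", "user", or "assistant".
    [property: JsonPropertyName("role")] string Role,
    // Plain-text content of the message.
    [property: JsonPropertyName("content")] string Content);

public sealed record ChatRequest(
    // Model identifier as reported by the self-hosted server.
    [property: JsonPropertyName("model")] string Model,
    // Conversation history in OpenAI-compatible order.
    [property: JsonPropertyName("messages")] IList<Message> Messages,
    // Stream the completion token by token.
    [property: JsonPropertyName("stream")] bool Stream);
    // Note: no MaxTokens property. Serializing "max_tokens": -1 to mean
    // "unlimited" is rejected by vLLM, so the field is left out and the
    // server's default limit applies.
```

Leaving the property out (rather than sending it as null or -1) keeps the same request shape valid for vLLM and other OpenAI-compatible backends alike.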