Fixed several provider dialog issues (#629)

Thorsten Sommer 2026-01-18 17:15:18 +01:00 committed by GitHub
parent 0a504276d9
commit 3abc2bf0f1
22 changed files with 472 additions and 185 deletions

View File

@@ -2101,6 +2101,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T14695
 -- Add Embedding
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1738753945"] = "Add Embedding"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1760715963"] = "Uses the provider-configured model"
 -- Are you sure you want to delete the embedding provider '{0}'?
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1825371968"] = "Are you sure you want to delete the embedding provider '{0}'?"
@@ -2164,6 +2167,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T162847
 -- Description
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1725856265"] = "Description"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1760715963"] = "Uses the provider-configured model"
 -- Add Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1806589097"] = "Add Provider"
@@ -2206,9 +2212,6 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T291173
 -- Configured LLM Providers
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3019870540"] = "Configured LLM Providers"
--- as selected by provider
-UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3082210376"] = "as selected by provider"
 -- Edit
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3267849393"] = "Edit"
@@ -2266,6 +2269,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T14
 -- Add transcription provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T1645238629"] = "Add transcription provider"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T1760715963"] = "Uses the provider-configured model"
 -- Add Transcription Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T2066315685"] = "Add Transcription Provider"
@@ -3205,6 +3211,9 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T290547799"] = "Cur
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T416738168"] = "Model selection"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T504465522"] = "We are currently unable to communicate with the provider to load models. Please try again later."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T808120719"] = "Host"
@@ -3412,12 +3421,18 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3361153305"] = "Show Expert
 -- Show available models
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3763891899"] = "Show available models"
+-- This host uses the model configured at the provider level. No model selection is available.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3783329915"] = "This host uses the model configured at the provider level. No model selection is available."
 -- Currently, we cannot query the models for the selected provider and/or host. Therefore, please enter the model name manually.
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T4116737656"] = "Currently, we cannot query the models for the selected provider and/or host. Therefore, please enter the model name manually."
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T416738168"] = "Model selection"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T504465522"] = "We are currently unable to communicate with the provider to load models. Please try again later."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T808120719"] = "Host"
@@ -4633,9 +4648,15 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T2842060373"] =
 -- Please enter a transcription model name.
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T3703662664"] = "Please enter a transcription model name."
+-- This host uses the model configured at the provider level. No model selection is available.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T3783329915"] = "This host uses the model configured at the provider level. No model selection is available."
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T416738168"] = "Model selection"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T504465522"] = "We are currently unable to communicate with the provider to load models. Please try again later."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T808120719"] = "Host"

View File

@@ -35,7 +35,7 @@
 <MudTd>@context.Num</MudTd>
 <MudTd>@context.Name</MudTd>
 <MudTd>@context.UsedLLMProvider.ToName()</MudTd>
-<MudTd>@GetEmbeddingProviderModelName(context)</MudTd>
+<MudTd>@this.GetEmbeddingProviderModelName(context)</MudTd>
 <MudTd>
 <MudStack Row="true" Class="mb-2 mt-2" Spacing="1" Wrap="Wrap.Wrap">

View File

@@ -15,8 +15,12 @@ public partial class SettingsPanelEmbeddings : SettingsPanelBase
 [Parameter]
 public EventCallback<List<ConfigurationSelectData<string>>> AvailableEmbeddingProvidersChanged { get; set; }
-private static string GetEmbeddingProviderModelName(EmbeddingProvider provider)
+private string GetEmbeddingProviderModelName(EmbeddingProvider provider)
 {
+// For system models, return localized text:
+if (provider.Model.IsSystemModel)
+return T("Uses the provider-configured model");
 const int MAX_LENGTH = 36;
 var modelName = provider.Model.ToString();
 return modelName.Length > MAX_LENGTH ? "[...] " + modelName[^Math.Min(MAX_LENGTH, modelName.Length)..] : modelName;
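For readers less familiar with C# index-from-end ranges: the truncation above keeps only the trailing MAX_LENGTH characters of long model names. A small standalone check of that expression (the model name below is an arbitrary example, not taken from the repository):

const int MAX_LENGTH = 36;

var modelName = "a-vendor/some-very-long-model-identifier-v0.1-instruct";

// In the "too long" branch, Math.Min(MAX_LENGTH, modelName.Length) is always MAX_LENGTH,
// so modelName[^36..] slices the last 36 characters; short names pass through untouched.
var shortened = modelName.Length > MAX_LENGTH
    ? "[...] " + modelName[^Math.Min(MAX_LENGTH, modelName.Length)..]
    : modelName;

Console.WriteLine(shortened); // prints "[...] " followed by the last 36 characters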

View File

@@ -1,6 +1,5 @@
 @using AIStudio.Provider
 @using AIStudio.Settings
-@using AIStudio.Provider.SelfHosted
 @inherits SettingsPanelBase
 <ExpansionPanel HeaderIcon="@Icons.Material.Filled.Layers" HeaderText="@T("Configure LLM Providers")">
@@ -29,20 +28,7 @@
 <MudTd>@context.Num</MudTd>
 <MudTd>@context.InstanceName</MudTd>
 <MudTd>@context.UsedLLMProvider.ToName()</MudTd>
-<MudTd>
-@if (context.UsedLLMProvider is not LLMProviders.SELF_HOSTED)
-{
-@GetLLMProviderModelName(context)
-}
-else if (context.UsedLLMProvider is LLMProviders.SELF_HOSTED && context.Host is not Host.LLAMA_CPP)
-{
-@GetLLMProviderModelName(context)
-}
-else
-{
-@T("as selected by provider")
-}
-</MudTd>
+<MudTd>@this.GetLLMProviderModelName(context)</MudTd>
 <MudTd>
 <MudStack Row="true" Class="mb-2 mt-2" Spacing="1" Wrap="Wrap.Wrap">
 @if (context.IsEnterpriseConfiguration)

View File

@@ -134,8 +134,12 @@ public partial class SettingsPanelProviders : SettingsPanelBase
 await this.MessageBus.SendMessage<bool>(this, Event.CONFIGURATION_CHANGED);
 }
-private static string GetLLMProviderModelName(AIStudio.Settings.Provider provider)
+private string GetLLMProviderModelName(AIStudio.Settings.Provider provider)
 {
+// For system models, return localized text:
+if (provider.Model.IsSystemModel)
+return T("Uses the provider-configured model");
 const int MAX_LENGTH = 36;
 var modelName = provider.Model.ToString();
 return modelName.Length > MAX_LENGTH ? "[...] " + modelName[^Math.Min(MAX_LENGTH, modelName.Length)..] : modelName;

View File

@@ -32,7 +32,7 @@
 <MudTd>@context.Num</MudTd>
 <MudTd>@context.Name</MudTd>
 <MudTd>@context.UsedLLMProvider.ToName()</MudTd>
-<MudTd>@GetTranscriptionProviderModelName(context)</MudTd>
+<MudTd>@this.GetTranscriptionProviderModelName(context)</MudTd>
 <MudTd>
 <MudStack Row="true" Class="mb-2 mt-2" Spacing="1" Wrap="Wrap.Wrap">

View File

@@ -15,8 +15,12 @@ public partial class SettingsPanelTranscription : SettingsPanelBase
 [Parameter]
 public EventCallback<List<ConfigurationSelectData<string>>> AvailableTranscriptionProvidersChanged { get; set; }
-private static string GetTranscriptionProviderModelName(TranscriptionProvider provider)
+private string GetTranscriptionProviderModelName(TranscriptionProvider provider)
 {
+// For system models, return localized text:
+if (provider.Model.IsSystemModel)
+return T("Uses the provider-configured model");
 const int MAX_LENGTH = 36;
 var modelName = provider.Model.ToString();
 return modelName.Length > MAX_LENGTH ? "[...] " + modelName[^Math.Min(MAX_LENGTH, modelName.Length)..] : modelName;

View File

@@ -242,7 +242,7 @@ public partial class VoiceRecorder : MSGComponentBase
 {
 this.Logger.LogWarning(
 "The configured transcription provider '{ProviderName}' has a confidence level of '{ProviderLevel}', which is below the minimum required level of '{MinimumLevel}'.",
-transcriptionProviderSettings.Name,
+transcriptionProviderSettings.UsedLLMProvider,
 providerConfidence.Level,
 minimumLevel);
 await this.MessageBus.SendError(new(Icons.Material.Filled.VoiceChat, this.T("The configured transcription provider does not meet the minimum confidence level.")));
@@ -259,7 +259,7 @@ public partial class VoiceRecorder : MSGComponentBase
 }
 // Call the transcription API:
-this.Logger.LogInformation("Starting transcription with provider '{ProviderName}' and model '{ModelName}'.", transcriptionProviderSettings.Name, transcriptionProviderSettings.Model.DisplayName);
+this.Logger.LogInformation("Starting transcription with provider '{ProviderName}' and model '{ModelName}'.", transcriptionProviderSettings.UsedLLMProvider, transcriptionProviderSettings.Model.ToString());
 var transcribedText = await provider.TranscribeAudioAsync(transcriptionProviderSettings.Model, this.finalRecordingPath, this.SettingsManager);
 if (string.IsNullOrWhiteSpace(transcribedText))

View File

@@ -44,7 +44,7 @@
 @if (this.DataLLMProvider.IsHostNeeded())
 {
-<MudSelect @bind-Value="@this.DataHost" Label="@T("Host")" Class="mb-3" OpenIcon="@Icons.Material.Filled.ExpandMore" AdornmentColor="Color.Info" Adornment="Adornment.Start" Validation="@this.providerValidation.ValidatingHost">
+<MudSelect T="Host" Value="@this.DataHost" ValueChanged="@this.OnHostChanged" Label="@T("Host")" Class="mb-3" OpenIcon="@Icons.Material.Filled.ExpandMore" AdornmentColor="Color.Info" Adornment="Adornment.Start" Validation="@this.providerValidation.ValidatingHost">
 @foreach (Host host in Enum.GetValues(typeof(Host)))
 {
 if (host.IsEmbeddingSupported())
@@ -101,6 +101,12 @@
 }
 }
 </MudStack>
+@if (!string.IsNullOrWhiteSpace(this.dataLoadingModelsIssue))
+{
+<MudAlert Severity="Severity.Error" Class="mt-3">
+@this.dataLoadingModelsIssue
+</MudAlert>
+}
 </MudField>
 @* ReSharper disable once CSharpWarnings::CS8974 *@

View File

@@ -72,6 +72,9 @@ public partial class EmbeddingProviderDialog : MSGComponentBase, ISecretId
 [Inject]
 private RustService RustService { get; init; } = null!;
+[Inject]
+private ILogger<EmbeddingProviderDialog> Logger { get; init; } = null!;
 private static readonly Dictionary<string, object?> SPELLCHECK_ATTRIBUTES = new();
 /// <summary>
@@ -85,6 +88,7 @@ public partial class EmbeddingProviderDialog : MSGComponentBase, ISecretId
 private string dataManuallyModel = string.Empty;
 private string dataAPIKeyStorageIssue = string.Empty;
 private string dataEditingPreviousInstanceName = string.Empty;
+private string dataLoadingModelsIssue = string.Empty;
 // We get the form reference from Blazor code to validate it manually:
 private MudForm form = null!;
@@ -102,6 +106,7 @@ public partial class EmbeddingProviderDialog : MSGComponentBase, ISecretId
 GetPreviousInstanceName = () => this.dataEditingPreviousInstanceName,
 GetUsedInstanceNames = () => this.UsedInstanceNames,
 GetHost = () => this.DataHost,
+IsModelProvidedManually = () => this.DataLLMProvider is LLMProviders.SELF_HOSTED && this.DataHost is Host.OLLAMA,
 };
 }
@@ -209,6 +214,15 @@ public partial class EmbeddingProviderDialog : MSGComponentBase, ISecretId
 await this.form.Validate();
 this.dataAPIKeyStorageIssue = string.Empty;
+// Manually validate the model selection (needed when no models are loaded
+// and the MudSelect is not rendered):
+var modelValidationError = this.providerValidation.ValidatingModel(this.DataModel);
+if (!string.IsNullOrWhiteSpace(modelValidationError))
+{
+this.dataIssues = [..this.dataIssues, modelValidationError];
+this.dataIsValid = false;
+}
 // When the data is not valid, we don't store it:
 if (!this.dataIsValid)
 return;
@@ -251,13 +265,26 @@ public partial class EmbeddingProviderDialog : MSGComponentBase, ISecretId
 }
 }
+private void OnHostChanged(Host selectedHost)
+{
+// When the host changes, reset the model selection state:
+this.DataHost = selectedHost;
+this.DataModel = default;
+this.dataManuallyModel = string.Empty;
+this.availableModels.Clear();
+this.dataLoadingModelsIssue = string.Empty;
+}
 private async Task ReloadModels()
 {
+this.dataLoadingModelsIssue = string.Empty;
 var currentEmbeddingProviderSettings = this.CreateEmbeddingProviderSettings();
 var provider = currentEmbeddingProviderSettings.CreateProvider();
 if (provider is NoProvider)
 return;
+try
+{
 var models = await provider.GetEmbeddingModels(this.dataAPIKey);
 // Order descending by ID means that the newest models probably come first:
@@ -266,6 +293,12 @@ public partial class EmbeddingProviderDialog : MSGComponentBase, ISecretId
 this.availableModels.Clear();
 this.availableModels.AddRange(orderedModels);
 }
+catch (Exception e)
+{
+this.Logger.LogError($"Failed to load models from provider '{this.DataLLMProvider}' (host={this.DataHost}, hostname='{this.DataHostname}'): {e.Message}");
+this.dataLoadingModelsIssue = T("We are currently unable to communicate with the provider to load models. Please try again later.");
+}
+}
 private string APIKeyText => this.DataLLMProvider switch
 {
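All three dialogs now hand an IsModelProvidedManually delegate to the shared validation object. The validation class itself is not part of this diff; the snippet below is only a hedged sketch of how such a hook could be consumed, with a hypothetical minimal Model record so it stands on its own.

// Hypothetical sketch only: the real ProviderValidation and Model types live elsewhere
// in the repository and may differ from this.
public readonly record struct Model(string Id, string? DisplayName);

public sealed class ProviderValidationSketch
{
    // Supplied by the dialog, e.g. () => DataLLMProvider is LLMProviders.SELF_HOSTED && DataHost is Host.OLLAMA
    public required Func<bool> IsModelProvidedManually { get; init; }

    public string? ValidatingModel(Model model)
    {
        // When the model name is typed in manually, the model dropdown is not rendered,
        // so an empty selection is not treated as an error here:
        if (this.IsModelProvidedManually())
            return null;

        return string.IsNullOrWhiteSpace(model.Id)
            ? "Please select a model."
            : null;
    }
}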

View File

@@ -41,7 +41,7 @@
 @if (this.DataLLMProvider.IsHostNeeded())
 {
-<MudSelect @bind-Value="@this.DataHost" Label="@T("Host")" Class="mb-3" OpenIcon="@Icons.Material.Filled.ExpandMore" AdornmentColor="Color.Info" Adornment="Adornment.Start" Validation="@this.providerValidation.ValidatingHost">
+<MudSelect T="Host" Value="@this.DataHost" ValueChanged="@this.OnHostChanged" Label="@T("Host")" Class="mb-3" OpenIcon="@Icons.Material.Filled.ExpandMore" AdornmentColor="Color.Info" Adornment="Adornment.Start" Validation="@this.providerValidation.ValidatingHost">
 @foreach (Host host in Enum.GetValues(typeof(Host)))
 {
 @if (host.IsChatSupported())
@@ -71,6 +71,8 @@
 @* ReSharper restore Asp.Entity *@
 }
+@if (!this.DataLLMProvider.IsLLMModelSelectionHidden(this.DataHost))
+{
 <MudField FullWidth="true" Label="@T("Model selection")" Variant="Variant.Outlined" Class="mb-3">
 <MudStack Row="@true" AlignItems="AlignItems.Center" StretchItems="StretchItems.End">
 @if (this.DataLLMProvider.IsLLMModelProvidedManually())
@@ -116,7 +118,22 @@
 }
 }
 </MudStack>
+@if (!string.IsNullOrWhiteSpace(this.dataLoadingModelsIssue))
+{
+<MudAlert Severity="Severity.Error" Class="mt-3">
+@this.dataLoadingModelsIssue
+</MudAlert>
+}
 </MudField>
+}
+else
+{
+<MudField FullWidth="true" Label="@T("Model selection")" Variant="Variant.Outlined" Class="mb-3">
+<MudText Typo="Typo.body1">
+@T("This host uses the model configured at the provider level. No model selection is available.")
+</MudText>
+</MudField>
+}
 @* ReSharper disable once CSharpWarnings::CS8974 *@
 <MudTextField

View File

@@ -84,6 +84,9 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
 [Inject]
 private RustService RustService { get; init; } = null!;
+[Inject]
+private ILogger<ProviderDialog> Logger { get; init; } = null!;
 private static readonly Dictionary<string, object?> SPELLCHECK_ATTRIBUTES = new();
 /// <summary>
@@ -97,6 +100,7 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
 private string dataManuallyModel = string.Empty;
 private string dataAPIKeyStorageIssue = string.Empty;
 private string dataEditingPreviousInstanceName = string.Empty;
+private string dataLoadingModelsIssue = string.Empty;
 private bool showExpertSettings;
 // We get the form reference from Blazor code to validate it manually:
@@ -115,25 +119,36 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
 GetPreviousInstanceName = () => this.dataEditingPreviousInstanceName,
 GetUsedInstanceNames = () => this.UsedInstanceNames,
 GetHost = () => this.DataHost,
+IsModelProvidedManually = () => this.DataLLMProvider.IsLLMModelProvidedManually(),
 };
 }
 private AIStudio.Settings.Provider CreateProviderSettings()
 {
 var cleanedHostname = this.DataHostname.Trim();
+// Determine the model based on the provider and host configuration:
+Model model;
+if (this.DataLLMProvider.IsLLMModelSelectionHidden(this.DataHost))
+{
+// Use system model placeholder for hosts that don't support model selection (e.g., llama.cpp):
+model = Model.SYSTEM_MODEL;
+}
+else if (this.DataLLMProvider is LLMProviders.FIREWORKS or LLMProviders.HUGGINGFACE)
+{
+// These providers require manual model entry:
+model = new Model(this.dataManuallyModel, null);
+}
+else
+model = this.DataModel;
 return new()
 {
 Num = this.DataNum,
 Id = this.DataId,
 InstanceName = this.DataInstanceName,
 UsedLLMProvider = this.DataLLMProvider,
-Model = this.DataLLMProvider switch
-{
-LLMProviders.FIREWORKS or LLMProviders.HUGGINGFACE => new Model(this.dataManuallyModel, null),
-_ => this.DataModel
-},
+Model = model,
 IsSelfHosted = this.DataLLMProvider is LLMProviders.SELF_HOSTED,
 IsEnterpriseConfiguration = false,
 Hostname = cleanedHostname.EndsWith('/') ? cleanedHostname[..^1] : cleanedHostname,
@@ -223,6 +238,15 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
 if (!string.IsNullOrWhiteSpace(this.dataAPIKeyStorageIssue))
 this.dataAPIKeyStorageIssue = string.Empty;
+// Manually validate the model selection (needed when no models are loaded
+// and the MudSelect is not rendered):
+var modelValidationError = this.providerValidation.ValidatingModel(this.DataModel);
+if (!string.IsNullOrWhiteSpace(modelValidationError))
+{
+this.dataIssues = [..this.dataIssues, modelValidationError];
+this.dataIsValid = false;
+}
 // When the data is not valid, we don't store it:
 if (!this.dataIsValid)
 return;
@@ -265,13 +289,26 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
 }
 }
+private void OnHostChanged(Host selectedHost)
+{
+// When the host changes, reset the model selection state:
+this.DataHost = selectedHost;
+this.DataModel = default;
+this.dataManuallyModel = string.Empty;
+this.availableModels.Clear();
+this.dataLoadingModelsIssue = string.Empty;
+}
 private async Task ReloadModels()
 {
+this.dataLoadingModelsIssue = string.Empty;
 var currentProviderSettings = this.CreateProviderSettings();
 var provider = currentProviderSettings.CreateProvider();
 if (provider is NoProvider)
 return;
+try
+{
 var models = await provider.GetTextModels(this.dataAPIKey);
 // Order descending by ID means that the newest models probably come first:
@@ -280,6 +317,12 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
 this.availableModels.Clear();
 this.availableModels.AddRange(orderedModels);
 }
+catch (Exception e)
+{
+this.Logger.LogError($"Failed to load models from provider '{this.DataLLMProvider}' (host={this.DataHost}, hostname='{this.DataHostname}'): {e.Message}");
+this.dataLoadingModelsIssue = T("We are currently unable to communicate with the provider to load models. Please try again later.");
+}
+}
 private string APIKeyText => this.DataLLMProvider switch
 {
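Model.SYSTEM_MODEL and the IsSystemModel flag checked by the settings panels are defined outside the files shown in this diff. Purely as an assumption of the shape such a type needs for this change to work, here is a minimal sketch; the actual record in the repository may well differ.

// Hypothetical sketch only; not the repository's actual Model definition.
public readonly record struct Model(string Id, string? DisplayName)
{
    // Sentinel meaning "the host/provider decides the model" (e.g., llama.cpp, whisper.cpp):
    public static readonly Model SYSTEM_MODEL = new("__SYSTEM_MODEL__", null);

    // True when this instance is the sentinel rather than a user-selected model:
    public bool IsSystemModel => this.Id == SYSTEM_MODEL.Id;

    public override string ToString() => string.IsNullOrWhiteSpace(this.DisplayName) ? this.Id : this.DisplayName;
}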

View File

@@ -44,7 +44,7 @@
 @if (this.DataLLMProvider.IsHostNeeded())
 {
-<MudSelect @bind-Value="@this.DataHost" Label="@T("Host")" Class="mb-3" OpenIcon="@Icons.Material.Filled.ExpandMore" AdornmentColor="Color.Info" Adornment="Adornment.Start" Validation="@this.providerValidation.ValidatingHost">
+<MudSelect T="Host" Value="@this.DataHost" ValueChanged="@this.OnHostChanged" Label="@T("Host")" Class="mb-3" OpenIcon="@Icons.Material.Filled.ExpandMore" AdornmentColor="Color.Info" Adornment="Adornment.Start" Validation="@this.providerValidation.ValidatingHost">
 @foreach (Host host in Enum.GetValues(typeof(Host)))
 {
 if (host.IsTranscriptionSupported())
@@ -57,6 +57,8 @@
 </MudSelect>
 }
+@if (!this.DataLLMProvider.IsTranscriptionModelSelectionHidden(this.DataHost))
+{
 <MudField FullWidth="true" Label="@T("Model selection")" Variant="Variant.Outlined" Class="mb-3">
 <MudStack Row="@true" AlignItems="AlignItems.Center" StretchItems="StretchItems.End">
 @if (this.DataLLMProvider.IsTranscriptionModelProvidedManually(this.DataHost))
@@ -101,7 +103,22 @@
 }
 }
 </MudStack>
+@if (!string.IsNullOrWhiteSpace(this.dataLoadingModelsIssue))
+{
+<MudAlert Severity="Severity.Error" Class="mt-3">
+@this.dataLoadingModelsIssue
+</MudAlert>
+}
 </MudField>
+}
+else
+{
+<MudField FullWidth="true" Label="@T("Model selection")" Variant="Variant.Outlined" Class="mb-3">
+<MudText Typo="Typo.body1">
+@T("This host uses the model configured at the provider level. No model selection is available.")
+</MudText>
+</MudField>
+}
 @* ReSharper disable once CSharpWarnings::CS8974 *@
 <MudTextField

View File

@@ -72,6 +72,9 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 [Inject]
 private RustService RustService { get; init; } = null!;
+[Inject]
+private ILogger<TranscriptionProviderDialog> Logger { get; init; } = null!;
 private static readonly Dictionary<string, object?> SPELLCHECK_ATTRIBUTES = new();
 /// <summary>
@@ -85,6 +88,7 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 private string dataManuallyModel = string.Empty;
 private string dataAPIKeyStorageIssue = string.Empty;
 private string dataEditingPreviousInstanceName = string.Empty;
+private string dataLoadingModelsIssue = string.Empty;
 // We get the form reference from Blazor code to validate it manually:
 private MudForm form = null!;
@@ -102,14 +106,22 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 GetPreviousInstanceName = () => this.dataEditingPreviousInstanceName,
 GetUsedInstanceNames = () => this.UsedInstanceNames,
 GetHost = () => this.DataHost,
+IsModelProvidedManually = () => this.DataLLMProvider.IsTranscriptionModelProvidedManually(this.DataHost),
 };
 }
 private TranscriptionProvider CreateTranscriptionProviderSettings()
 {
 var cleanedHostname = this.DataHostname.Trim();
-Model model = default;
-if(this.DataLLMProvider is LLMProviders.SELF_HOSTED)
+// Determine the model based on the provider and host configuration:
+Model model;
+if (this.DataLLMProvider.IsTranscriptionModelSelectionHidden(this.DataHost))
+{
+// Use system model placeholder for hosts that don't support model selection (e.g., whisper.cpp):
+model = Model.SYSTEM_MODEL;
+}
+else if (this.DataLLMProvider is LLMProviders.SELF_HOSTED)
 {
 switch (this.DataHost)
 {
@@ -119,7 +131,7 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 case Host.VLLM:
 case Host.LM_STUDIO:
-case Host.WHISPER_CPP:
+default:
 model = this.DataModel;
 break;
 }
@@ -217,6 +229,15 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 await this.form.Validate();
 this.dataAPIKeyStorageIssue = string.Empty;
+// Manually validate the model selection (needed when no models are loaded
+// and the MudSelect is not rendered):
+var modelValidationError = this.providerValidation.ValidatingModel(this.DataModel);
+if (!string.IsNullOrWhiteSpace(modelValidationError))
+{
+this.dataIssues = [..this.dataIssues, modelValidationError];
+this.dataIsValid = false;
+}
 // When the data is not valid, we don't store it:
 if (!this.dataIsValid)
 return;
@@ -259,13 +280,26 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 }
 }
+private void OnHostChanged(Host selectedHost)
+{
+// When the host changes, reset the model selection state:
+this.DataHost = selectedHost;
+this.DataModel = default;
+this.dataManuallyModel = string.Empty;
+this.availableModels.Clear();
+this.dataLoadingModelsIssue = string.Empty;
+}
 private async Task ReloadModels()
 {
+this.dataLoadingModelsIssue = string.Empty;
 var currentTranscriptionProviderSettings = this.CreateTranscriptionProviderSettings();
 var provider = currentTranscriptionProviderSettings.CreateProvider();
 if (provider is NoProvider)
 return;
+try
+{
 var models = await provider.GetTranscriptionModels(this.dataAPIKey);
 // Order descending by ID means that the newest models probably come first:
@@ -274,6 +308,12 @@ public partial class TranscriptionProviderDialog : MSGComponentBase, ISecretId
 this.availableModels.Clear();
 this.availableModels.AddRange(orderedModels);
 }
+catch (Exception e)
+{
+this.Logger.LogError($"Failed to load models from provider '{this.DataLLMProvider}' (host={this.DataHost}, hostname='{this.DataHostname}'): {e.Message}");
+this.dataLoadingModelsIssue = T("We are currently unable to communicate with the provider to load models. Please try again later.");
+}
+}
 private string APIKeyText => this.DataLLMProvider switch
 {

View File

@@ -2103,6 +2103,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T14695
 -- Add Embedding
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1738753945"] = "Einbettung hinzufügen"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1760715963"] = "Verwendet das vom Anbieter konfigurierte Modell"
 -- Are you sure you want to delete the embedding provider '{0}'?
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1825371968"] = "Sind Sie sicher, dass Sie den Einbettungsanbieter '{0}' löschen möchten?"
@@ -2166,11 +2169,14 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T162847
 -- Description
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1725856265"] = "Beschreibung"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1760715963"] = "Verwendet das vom Anbieter konfigurierte Modell"
 -- Add Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1806589097"] = "Anbieter hinzufügen"
 -- Configure LLM Providers
-UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1810190350"] = "Anbieter für LLM konfigurieren"
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1810190350"] = "Anbieter für LLMs konfigurieren"
 -- Edit LLM Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1868766523"] = "LLM-Anbieter bearbeiten"
@@ -2206,10 +2212,7 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T284206
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T2911731076"] = "Noch keine Anbieter konfiguriert."
 -- Configured LLM Providers
-UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3019870540"] = "Konfigurierte Anbieter für LLM"
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3019870540"] = "Konfigurierte Anbieter für LLMs"
--- as selected by provider
-UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3082210376"] = "wie vom Anbieter ausgewählt"
 -- Edit
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3267849393"] = "Bearbeiten"
@@ -2268,6 +2271,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T14
 -- Add transcription provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T1645238629"] = "Anbieter für Transkriptionen hinzufügen"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T1760715963"] = "Verwendet das vom Anbieter konfigurierte Modell"
 -- Add Transcription Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T2066315685"] = "Anbieter für Transkriptionen hinzufügen"
@@ -2292,6 +2298,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T40
 -- Configured Transcription Providers
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T4210863523"] = "Konfigurierte Anbieter für Transkriptionen"
+-- With the support of transcription models, MindWork AI Studio can convert human speech into text. This is useful, for example, when you need to dictate text. You can choose from dedicated transcription models, but not multimodal LLMs (large language models) that can handle both speech and text. The configuration of multimodal models is done in the 'Configure LLM providers' section.
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T584860404"] = "Mit Unterstützung von Modellen für Transkriptionen kann MindWork AI Studio menschliche Sprache in Text umwandeln. Das ist zum Beispiel hilfreich, wenn Sie Texte diktieren möchten. Sie können aus speziellen Modellen für Transkriptionen wählen, jedoch nicht aus multimodalen LLMs (Large Language Models), die sowohl Sprache als auch Text verarbeiten können. Die Einrichtung multimodaler Modelle erfolgt im Abschnitt „Anbieter für LLMs konfigurieren“."
 -- This transcription provider is managed by your organization.
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T756131076"] = "Dieser Anbieter für Transkriptionen wird von Ihrer Organisation verwaltet."
@@ -2301,9 +2310,6 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T78
 -- Are you sure you want to delete the transcription provider '{0}'?
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T789660305"] = "Möchten Sie den Anbieter für Transkriptionen „{0}“ wirklich löschen?"
--- With the support of transcription models, MindWork AI Studio can convert human speech into text. This is useful, for example, when you need to dictate text. You can choose from dedicated transcription models, but not multimodal LLMs (large language models) that can handle both speech and text. The configuration of multimodal models is done in the \"Configure providers\" section.
-UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T584860404"] = "Mit Unterstützung von Modellen für Transkriptionen kann MindWork AI Studio menschliche Sprache in Text umwandeln. Das ist zum Beispiel hilfreich, wenn Sie Texte diktieren möchten. Sie können aus speziellen Modellen für Transkriptionen wählen, jedoch nicht aus multimodalen LLMs (Large Language Models), die sowohl Sprache als auch Text verarbeiten können. Die Einrichtung multimodaler Modelle erfolgt im Abschnitt „Anbieter für LLM konfigurieren“."
 -- Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T900237532"] = "Anbieter"
@@ -3207,6 +3213,9 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T290547799"] = "Der
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T416738168"] = "Modellauswahl"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T504465522"] = "Wir können derzeit nicht mit dem Anbieter kommunizieren, um Modelle zu laden. Bitte versuchen Sie es später erneut."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T808120719"] = "Host"
@@ -3414,12 +3423,18 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3361153305"] = "Experten-Ei
 -- Show available models
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3763891899"] = "Verfügbare Modelle anzeigen"
+-- This host uses the model configured at the provider level. No model selection is available.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3783329915"] = "Dieser Host verwendet das auf Anbieterebene konfigurierte Modell. Es ist keine Modellauswahl verfügbar."
 -- Currently, we cannot query the models for the selected provider and/or host. Therefore, please enter the model name manually.
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T4116737656"] = "Derzeit können wir die Modelle für den ausgewählten Anbieter und/oder Host nicht abfragen. Bitte geben Sie daher den Modellnamen manuell ein."
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T416738168"] = "Modellauswahl"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T504465522"] = "Wir können derzeit nicht mit dem Anbieter kommunizieren, um Modelle zu laden. Bitte versuchen Sie es später erneut."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T808120719"] = "Host"
@@ -4635,9 +4650,15 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T2842060373"] =
 -- Please enter a transcription model name.
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T3703662664"] = "Bitte geben Sie den Namen eines Transkriptionsmodells ein."
+-- This host uses the model configured at the provider level. No model selection is available.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T3783329915"] = "Dieser Host verwendet das auf Anbieterebene konfigurierte Modell. Eine Modellauswahl ist nicht verfügbar."
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T416738168"] = "Modellauswahl"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T504465522"] = "Wir können derzeit nicht mit dem Anbieter kommunizieren, um Modelle zu laden. Bitte versuchen Sie es später erneut."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T808120719"] = "Host"

View File

@@ -2103,6 +2103,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T14695
 -- Add Embedding
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1738753945"] = "Add Embedding"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1760715963"] = "Uses the provider-configured model"
 -- Are you sure you want to delete the embedding provider '{0}'?
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELEMBEDDINGS::T1825371968"] = "Are you sure you want to delete the embedding provider '{0}'?"
@@ -2166,6 +2169,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T162847
 -- Description
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1725856265"] = "Description"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1760715963"] = "Uses the provider-configured model"
 -- Add Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T1806589097"] = "Add Provider"
@@ -2208,9 +2214,6 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T291173
 -- Configured LLM Providers
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3019870540"] = "Configured LLM Providers"
--- as selected by provider
-UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3082210376"] = "as selected by provider"
 -- Edit
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELPROVIDERS::T3267849393"] = "Edit"
@@ -2268,6 +2271,9 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T14
 -- Add transcription provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T1645238629"] = "Add transcription provider"
+-- Uses the provider-configured model
+UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T1760715963"] = "Uses the provider-configured model"
 -- Add Transcription Provider
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T2066315685"] = "Add Transcription Provider"
@@ -2292,7 +2298,7 @@ UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T40
 -- Configured Transcription Providers
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T4210863523"] = "Configured Transcription Providers"
--- With the support of transcription models, MindWork AI Studio can convert human speech into text. This is useful, for example, when you need to dictate text. You can choose from dedicated transcription models, but not multimodal LLMs (large language models) that can handle both speech and text. The configuration of multimodal models is done in the 'Configure providers' section.
+-- With the support of transcription models, MindWork AI Studio can convert human speech into text. This is useful, for example, when you need to dictate text. You can choose from dedicated transcription models, but not multimodal LLMs (large language models) that can handle both speech and text. The configuration of multimodal models is done in the 'Configure LLM providers' section.
 UI_TEXT_CONTENT["AISTUDIO::COMPONENTS::SETTINGS::SETTINGSPANELTRANSCRIPTION::T584860404"] = "With the support of transcription models, MindWork AI Studio can convert human speech into text. This is useful, for example, when you need to dictate text. You can choose from dedicated transcription models, but not multimodal LLMs (large language models) that can handle both speech and text. The configuration of multimodal models is done in the 'Configure LLM providers' section."
 -- This transcription provider is managed by your organization.
@@ -3207,6 +3213,9 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T290547799"] = "Cur
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T416738168"] = "Model selection"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T504465522"] = "We are currently unable to communicate with the provider to load models. Please try again later."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::EMBEDDINGPROVIDERDIALOG::T808120719"] = "Host"
@@ -3414,12 +3423,18 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3361153305"] = "Show Expert
 -- Show available models
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3763891899"] = "Show available models"
+-- This host uses the model configured at the provider level. No model selection is available.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T3783329915"] = "This host uses the model configured at the provider level. No model selection is available."
 -- Currently, we cannot query the models for the selected provider and/or host. Therefore, please enter the model name manually.
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T4116737656"] = "Currently, we cannot query the models for the selected provider and/or host. Therefore, please enter the model name manually."
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T416738168"] = "Model selection"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T504465522"] = "We are currently unable to communicate with the provider to load models. Please try again later."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::PROVIDERDIALOG::T808120719"] = "Host"
@@ -4635,9 +4650,15 @@ UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T2842060373"] =
 -- Please enter a transcription model name.
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T3703662664"] = "Please enter a transcription model name."
+-- This host uses the model configured at the provider level. No model selection is available.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T3783329915"] = "This host uses the model configured at the provider level. No model selection is available."
 -- Model selection
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T416738168"] = "Model selection"
+-- We are currently unable to communicate with the provider to load models. Please try again later.
+UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T504465522"] = "We are currently unable to communicate with the provider to load models. Please try again later."
 -- Host
 UI_TEXT_CONTENT["AISTUDIO::DIALOGS::TRANSCRIPTIONPROVIDERDIALOG::T808120719"] = "Host"

View File

@ -554,10 +554,22 @@ public abstract class BaseProvider : IProvider, ISecretId
await using var fileStream = File.OpenRead(audioFilePath); await using var fileStream = File.OpenRead(audioFilePath);
using var fileContent = new StreamContent(fileStream); using var fileContent = new StreamContent(fileStream);
// Set the content type based on the file extension:
fileContent.Headers.ContentType = new MediaTypeHeaderValue(mimeType); fileContent.Headers.ContentType = new MediaTypeHeaderValue(mimeType);
// Add the file content to the form data:
form.Add(fileContent, "file", Path.GetFileName(audioFilePath)); form.Add(fileContent, "file", Path.GetFileName(audioFilePath));
form.Add(new StringContent(transcriptionModel.Id), "model");
//
// Add the model name to the form data. Ensure that a model name is always provided.
// Otherwise, the StringContent constructor would throw for a null model ID.
//
var modelName = transcriptionModel.Id;
if (string.IsNullOrWhiteSpace(modelName))
modelName = "placeholder";
form.Add(new StringContent(modelName), "model");
using var request = new HttpRequestMessage(HttpMethod.Post, host.TranscriptionURL()); using var request = new HttpRequestMessage(HttpMethod.Post, host.TranscriptionURL());
request.Content = form; request.Content = form;
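
For context, a minimal standalone sketch of the same upload pattern, assuming an OpenAI-compatible /v1/audio/transcriptions endpoint; the endpoint URL, file path, and the configuredModelId variable are illustrative assumptions, not values taken from the change above.

```csharp
// Sketch only: mirrors the model-name guard above, outside the provider class.
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

string? configuredModelId = null;    // assumption: empty when the host (e.g., whisper.cpp) picks the model itself
var audioFilePath = "recording.wav"; // assumption: any local audio file
var endpoint = new Uri("http://localhost:8080/v1/audio/transcriptions"); // assumption

using var form = new MultipartFormDataContent();

await using var fileStream = File.OpenRead(audioFilePath);
using var fileContent = new StreamContent(fileStream);
fileContent.Headers.ContentType = new MediaTypeHeaderValue("audio/wav");
form.Add(fileContent, "file", Path.GetFileName(audioFilePath));

// The guard from the change above: StringContent(null) throws, so fall back
// to a harmless placeholder when no model ID is configured.
var modelName = string.IsNullOrWhiteSpace(configuredModelId) ? "placeholder" : configuredModelId;
form.Add(new StringContent(modelName), "model");

using var client = new HttpClient();
using var response = await client.PostAsync(endpoint, form);
Console.WriteLine(await response.Content.ReadAsStringAsync());
```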

View File

@ -327,6 +327,32 @@ public static class LLMProvidersExtensions
_ => false, _ => false,
}; };
/// <summary>
/// Determines if the model selection should be completely hidden for LLM providers.
/// This is the case when the host does not support model selection (e.g., llama.cpp).
/// </summary>
/// <param name="provider">The provider.</param>
/// <param name="host">The host for self-hosted providers.</param>
/// <returns>True if model selection should be hidden; otherwise, false.</returns>
public static bool IsLLMModelSelectionHidden(this LLMProviders provider, Host host) => provider switch
{
LLMProviders.SELF_HOSTED => host is Host.LLAMA_CPP,
_ => false,
};
/// <summary>
/// Determines if the model selection should be completely hidden for transcription providers.
/// This is the case when the host does not support model selection (e.g., whisper.cpp).
/// </summary>
/// <param name="provider">The provider.</param>
/// <param name="host">The host for self-hosted providers.</param>
/// <returns>True if model selection should be hidden; otherwise, false.</returns>
public static bool IsTranscriptionModelSelectionHidden(this LLMProviders provider, Host host) => provider switch
{
LLMProviders.SELF_HOSTED => host is Host.WHISPER_CPP,
_ => false,
};
public static bool IsHostNeeded(this LLMProviders provider) => provider switch public static bool IsHostNeeded(this LLMProviders provider) => provider switch
{ {
LLMProviders.SELF_HOSTED => true, LLMProviders.SELF_HOSTED => true,
@ -391,13 +417,13 @@ public static class LLMProvidersExtensions
{ {
case Host.NONE: case Host.NONE:
case Host.LLAMA_CPP: case Host.LLAMA_CPP:
case Host.WHISPER_CPP:
default: default:
return false; return false;
case Host.OLLAMA: case Host.OLLAMA:
case Host.LM_STUDIO: case Host.LM_STUDIO:
case Host.VLLM: case Host.VLLM:
case Host.WHISPER_CPP:
return true; return true;
} }
} }
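
A sketch of how a settings dialog might consume the new helpers; the dialog itself is not part of this hunk, so the control flow below is illustrative (the two output strings do exist in the localization changes above).

```csharp
// Illustrative branching only; the real dialog code is not shown in this diff.
var provider = LLMProviders.SELF_HOSTED;
var host = Host.LLAMA_CPP;

if (provider.IsLLMModelSelectionHidden(host))
{
    // llama.cpp serves whatever model it was started with, so the dialog can
    // hide the model selection entirely and show a short hint instead:
    Console.WriteLine("Uses the provider-configured model");
}
else
{
    // All other combinations keep the usual model selection or manual entry.
    Console.WriteLine("Model selection");
}
```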

View File

@ -9,6 +9,22 @@ namespace AIStudio.Provider;
/// <param name="DisplayName">The model's display name.</param> /// <param name="DisplayName">The model's display name.</param>
public readonly record struct Model(string Id, string? DisplayName) public readonly record struct Model(string Id, string? DisplayName)
{ {
/// <summary>
/// Special model ID used when the model is selected by the system/host
/// and cannot be changed by the user (e.g., llama.cpp, whisper.cpp).
/// </summary>
private const string SYSTEM_MODEL_ID = "::system::";
/// <summary>
/// The system-configured model placeholder.
/// </summary>
public static readonly Model SYSTEM_MODEL = new(SYSTEM_MODEL_ID, null);
/// <summary>
/// Checks if this model is the system-configured placeholder.
/// </summary>
public bool IsSystemModel => this == SYSTEM_MODEL;
private static string TB(string fallbackEN) => I18N.I.T(fallbackEN, typeof(Model).Namespace, nameof(Model)); private static string TB(string fallbackEN) => I18N.I.T(fallbackEN, typeof(Model).Namespace, nameof(Model));
#region Overrides of ValueType #region Overrides of ValueType
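
A short sketch of how this placeholder might flow through once model selection is hidden; how the dialogs actually store and later detect it is not shown in this hunk, so treat the flow as an assumption.

```csharp
// Illustrative only: keep a well-known placeholder instead of an empty model.
var selectedModel = Model.SYSTEM_MODEL;

// Before building a request, the placeholder can be detected so that the
// literal "::system::" ID is never sent to the API as a real model name:
if (selectedModel.IsSystemModel)
    Console.WriteLine("The host decides the model; omit or substitute the model field.");
else
    Console.WriteLine($"Requesting model '{selectedModel.Id}'.");
```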

View File

@ -149,31 +149,30 @@ public sealed class ProviderSelfHosted(Host host, string hostname) : BaseProvide
} }
/// <inheritdoc /> /// <inheritdoc />
public override Task<IEnumerable<Provider.Model>> GetTranscriptionModels(string? apiKeyProvisional = null, CancellationToken token = default) public override async Task<IEnumerable<Provider.Model>> GetTranscriptionModels(string? apiKeyProvisional = null, CancellationToken token = default)
{ {
try try
{ {
switch (host) switch (host)
{ {
case Host.WHISPER_CPP: case Host.WHISPER_CPP:
return Task.FromResult<IEnumerable<Provider.Model>>( return new List<Provider.Model>
new List<Provider.Model>
{ {
new("loaded-model", TB("Model as configured by whisper.cpp")), new("loaded-model", TB("Model as configured by whisper.cpp")),
}); };
case Host.OLLAMA: case Host.OLLAMA:
case Host.VLLM: case Host.VLLM:
return this.LoadModels(SecretStoreType.TRANSCRIPTION_PROVIDER, [], [], token, apiKeyProvisional); return await this.LoadModels(SecretStoreType.TRANSCRIPTION_PROVIDER, [], [], token, apiKeyProvisional);
default: default:
return Task.FromResult(Enumerable.Empty<Provider.Model>()); return [];
} }
} }
catch (Exception e) catch (Exception e)
{ {
LOGGER.LogError(e, "Failed to load transcription models from self-hosted provider."); LOGGER.LogError($"Failed to load transcription models from self-hosted provider: {e.Message}");
return Task.FromResult(Enumerable.Empty<Provider.Model>()); return [];
} }
} }
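
A sketch of a caller benefiting from the now-awaited method: the failure path degrades to an empty list instead of a faulted task, so a dialog can show a friendly message. The hostname is an assumption (default local Ollama port); the message text is one of the localized strings added above.

```csharp
// Illustrative caller; ProviderSelfHosted(Host, string) is the primary constructor shown above.
var provider = new ProviderSelfHosted(Host.OLLAMA, "http://localhost:11434"); // assumed local host
var models = (await provider.GetTranscriptionModels()).ToList();

if (models.Count == 0)
    Console.WriteLine("We are currently unable to communicate with the provider to load models. Please try again later.");
else
    Console.WriteLine($"Loaded {models.Count} transcription model(s).");
```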

View File

@ -20,6 +20,8 @@ public sealed class ProviderValidation
public Func<Host> GetHost { get; init; } = () => Host.NONE; public Func<Host> GetHost { get; init; } = () => Host.NONE;
public Func<bool> IsModelProvidedManually { get; init; } = () => false;
public string? ValidatingHostname(string hostname) public string? ValidatingHostname(string hostname)
{ {
if(this.GetProvider() != LLMProviders.SELF_HOSTED) if(this.GetProvider() != LLMProviders.SELF_HOSTED)
@ -70,7 +72,17 @@ public sealed class ProviderValidation
public string? ValidatingModel(Model model) public string? ValidatingModel(Model model)
{ {
if(this.GetProvider() is LLMProviders.SELF_HOSTED && this.GetHost() == Host.LLAMA_CPP) // For NONE providers, no validation is needed:
if (this.GetProvider() is LLMProviders.NONE)
return null;
// For self-hosted llama.cpp or whisper.cpp, no model selection needed
// (model is loaded at startup):
if (this.GetProvider() is LLMProviders.SELF_HOSTED && this.GetHost() is Host.LLAMA_CPP or Host.WHISPER_CPP)
return null;
// For manually entered models, this validation doesn't apply:
if (this.IsModelProvidedManually())
return null; return null;
if (model == default) if (model == default)
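
A sketch of how the new validation paths might be exercised; GetProvider is declared elsewhere in the class (not shown in this hunk), and the dialog state feeding these funcs is illustrative.

```csharp
// Illustrative wiring; in the real dialogs these funcs point at dialog state.
var whisperValidation = new ProviderValidation
{
    GetProvider = () => LLMProviders.SELF_HOSTED,
    GetHost = () => Host.WHISPER_CPP,
};

// whisper.cpp loads its model at startup, so an unset model is accepted:
Console.WriteLine(whisperValidation.ValidatingModel(default) ?? "valid");

// When the user typed the model name manually, the dropdown check is skipped too:
var manualValidation = new ProviderValidation
{
    GetProvider = () => LLMProviders.SELF_HOSTED,
    GetHost = () => Host.OLLAMA,
    IsModelProvidedManually = () => true,
};
Console.WriteLine(manualValidation.ValidatingModel(default) ?? "valid");
```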

View File

@ -1,5 +1,10 @@
# v26.1.2, build 232 (2026-01-xx xx:xx UTC) # v26.1.2, build 232 (2026-01-xx xx:xx UTC)
- Added the option to hide specific assistants by configuration plugins. This is useful for enterprise environments in organizations. - Added the option to hide specific assistants by configuration plugins. This is useful for enterprise environments in organizations.
- Improved error handling for model loading in provider dialogs (LLMs, embeddings, transcriptions).
- Improved the microphone handling (transcription preview) so that all sound effects and the voice recording are processed without interruption. - Improved the microphone handling (transcription preview) so that all sound effects and the voice recording are processed without interruption.
- Improved the handling of self-hosted providers in the configuration dialogs (LLMs, embeddings, and transcriptions) when the host cannot provide a list of models.
- Fixed a logging bug that prevented log events from being recorded in some cases. - Fixed a logging bug that prevented log events from being recorded in some cases.
- Fixed a bug that allowed adding a provider (LLM, embedding, or transcription) without selecting a model.
- Fixed a bug with local transcription providers by handling errors correctly when the local provider is unavailable.
- Fixed a bug with local transcription providers by correctly handling empty model IDs.
- Fixed a bug affecting the transcription preview: previously, when you stopped music or other media, recorded or dictated text, and then tried to resume playback, the media wouldn't resume as expected. This behavior is now fixed.