Added support for Perplexity (#545)

Co-authored-by: Thorsten Sommer <mail@tsommer.org>
Peer Schütt 2025-08-31 14:27:35 +02:00 committed by GitHub
parent adf61a1384
commit 2a01d3bb6e
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
39 changed files with 398 additions and 39 deletions


@@ -1333,6 +1333,9 @@ UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1603883875"] = "Yes, re
-- Yes, remove it
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1820166585"] = "Yes, remove it"
-- Number of sources
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1848978959"] = "Number of sources"
-- Do you really want to edit this message? In order to edit this message, the AI response will be deleted.
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T2018431076"] = "Do you really want to edit this message? In order to edit this message, the AI response will be deleted."
@@ -4615,6 +4618,9 @@ UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T1702902297"] = "Introduction"
-- Vision
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T1892426825"] = "Vision"
-- You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2183503084"] = "You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities."
-- Let's get started
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2331588413"] = "Let's get started"
@@ -4624,9 +4630,6 @@ UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2348849647"] = "Last Changelog"
-- Choose the provider and model best suited for your current task.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2588488920"] = "Choose the provider and model best suited for your current task."
-- You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT4o, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2900280782"] = "You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT4o, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities."
-- Quick Start Guide
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T3002014720"] = "Quick Start Guide"
@@ -4861,6 +4864,9 @@ UI_TEXT_CONTENT["AISTUDIO::PROVIDER::LLMPROVIDERSEXTENSIONS::T3424652889"] = "Un
-- no model selected
UI_TEXT_CONTENT["AISTUDIO::PROVIDER::MODEL::T2234274832"] = "no model selected"
-- Sources
UI_TEXT_CONTENT["AISTUDIO::PROVIDER::SOURCEEXTENSIONS::T2730980305"] = "Sources"
-- Use no chat template
UI_TEXT_CONTENT["AISTUDIO::SETTINGS::CHATTEMPLATE::T4258819635"] = "Use no chat template"


@@ -1,6 +1,7 @@
@using AIStudio.Tools
@using MudBlazor
@using AIStudio.Components
@using AIStudio.Provider
@inherits AIStudio.Components.MSGComponentBase
<MudCard Class="@this.CardClasses" Outlined="@true">
<MudCardHeader>
@@ -15,6 +16,14 @@
</MudText>
</CardHeaderContent>
<CardHeaderActions>
@if (this.Content.Sources.Count > 0)
{
<MudTooltip Text="@T("Number of sources")" Placement="Placement.Bottom">
<MudBadge Content="@this.Content.Sources.Count" Color="Color.Primary" Overlap="true" BadgeClass="sources-card-header">
<MudIconButton Icon="@Icons.Material.Filled.Link" />
</MudBadge>
</MudTooltip>
}
@if (this.IsSecondToLastBlock && this.Role is ChatRole.USER && this.EditLastUserBlockFunc is not null)
{
<MudTooltip Text="@T("Edit")" Placement="Placement.Bottom">
@@ -72,6 +81,10 @@
else
{
<MudMarkdown Value="@textContent.Text.RemoveThinkTags().Trim()" Props="Markdown.DefaultConfig" Styling="@this.MarkdownStyling" />
@if (textContent.Sources.Count > 0)
{
<MudMarkdown Value="@textContent.Sources.ToMarkdown()" Props="Markdown.DefaultConfig" Styling="@this.MarkdownStyling" />
}
}
}
}
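The block above appends `textContent.Sources.ToMarkdown()` below the rendered answer. That extension is not part of this excerpt; a minimal sketch of what it might look like, assuming it renders a numbered reference list under a "Sources" heading (the heading matches the new `SOURCEEXTENSIONS::T2730980305` translation string; the `Source` record shape here is an assumption):

```csharp
using System.Collections.Generic;
using System.Text;

namespace AIStudio.Provider;

// Hypothetical source record; the real Source type is not shown in this diff.
public sealed record Source(string Title, string URL);

public static class SourceExtensions
{
    /// <summary>
    /// Renders the sources as a markdown section. Sketch only: the real
    /// ToMarkdown implementation is not part of this diff.
    /// </summary>
    public static string ToMarkdown(this IList<Source> sources)
    {
        var sb = new StringBuilder();
        sb.AppendLine("## Sources");
        for (var i = 0; i < sources.Count; i++)
            sb.AppendLine($"{i + 1}. [{sources[i].Title}]({sources[i].URL})");

        return sb.ToString();
    }
}
```

Rendering the list through a second `MudMarkdown` element, as the diff does, keeps source links clickable without custom components.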


@@ -27,6 +27,9 @@ public sealed class ContentImage : IContent, IImageSource
[JsonIgnore]
public Func<Task> StreamingEvent { get; set; } = () => Task.CompletedTask;
/// <inheritdoc />
public List<Source> Sources { get; set; } = [];
/// <inheritdoc />
public Task<ChatThread> CreateFromProviderAsync(IProvider provider, Model chatModel, IContent? lastPrompt, ChatThread? chatChatThread, CancellationToken token = default)
{


@@ -24,7 +24,7 @@ public sealed class ContentText : IContent
public bool InitialRemoteWait { get; set; }
/// <inheritdoc />
// [JsonIgnore]
[JsonIgnore]
public bool IsStreaming { get; set; }
/// <inheritdoc />
@@ -35,6 +35,9 @@ public sealed class ContentText : IContent
[JsonIgnore]
public Func<Task> StreamingEvent { get; set; } = () => Task.CompletedTask;
/// <inheritdoc />
public List<Source> Sources { get; set; } = [];
/// <inheritdoc />
public async Task<ChatThread> CreateFromProviderAsync(IProvider provider, Model chatModel, IContent? lastPrompt, ChatThread? chatThread, CancellationToken token = default)
{
@@ -80,7 +83,7 @@ public sealed class ContentText : IContent
this.InitialRemoteWait = true;
// Iterate over the responses from the AI:
await foreach (var deltaText in provider.StreamChatCompletion(chatModel, chatThread, settings, token))
await foreach (var contentStreamChunk in provider.StreamChatCompletion(chatModel, chatThread, settings, token))
{
// When the user cancels the request, we stop the loop:
if (token.IsCancellationRequested)
@@ -91,7 +94,10 @@ public sealed class ContentText : IContent
this.IsStreaming = true;
// Add the response to the text:
this.Text += deltaText;
this.Text += contentStreamChunk;
// Merge the sources:
this.Sources.MergeSources(contentStreamChunk.Sources);
// Notify the UI that the content has changed,
// depending on the energy saving mode:
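The streaming loop above now accumulates sources per chunk via `MergeSources`. The diff does not include that helper; a hedged sketch of how such a merge might deduplicate sources across chunks (the method name comes from the diff, its body and the `Source` shape are assumptions):

```csharp
using System.Collections.Generic;
using System.Linq;

namespace AIStudio.Provider;

// Hypothetical source record; the real Source type is not shown in this diff.
public sealed record Source(string Title, string URL);

public static class SourceListExtensions
{
    /// <summary>
    /// Merges newly streamed sources into the existing list, skipping
    /// entries whose URL is already present. Sketch only: the real
    /// MergeSources implementation is not part of this diff.
    /// </summary>
    public static void MergeSources(this List<Source> target, IList<Source> incoming)
    {
        foreach (var source in incoming)
        {
            if (target.All(existing => existing.URL != source.URL))
                target.Add(source);
        }
    }
}
```

Deduplicating by URL matters because providers that cite sources tend to repeat the same citation across many stream chunks.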


@@ -37,6 +37,12 @@ public interface IContent
/// </summary>
[JsonIgnore]
public Func<Task> StreamingDone { get; set; }
/// <summary>
/// The provided sources, if any.
/// </summary>
[JsonIgnore]
public List<Source> Sources { get; set; }
/// <summary>
/// Uses the provider to create the content.


@@ -126,12 +126,13 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
Id = this.DataId,
InstanceName = this.DataInstanceName,
UsedLLMProvider = this.DataLLMProvider,
Model = this.DataLLMProvider switch
{
LLMProviders.FIREWORKS => new Model(this.dataManuallyModel, null),
LLMProviders.FIREWORKS or LLMProviders.HUGGINGFACE => new Model(this.dataManuallyModel, null),
LLMProviders.HUGGINGFACE => new Model(this.dataManuallyModel, null),
_ => this.DataModel
},
IsSelfHosted = this.DataLLMProvider is LLMProviders.SELF_HOSTED,
IsEnterpriseConfiguration = false,
Hostname = cleanedHostname.EndsWith('/') ? cleanedHostname[..^1] : cleanedHostname,
@@ -158,7 +159,7 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
this.dataEditingPreviousInstanceName = this.DataInstanceName.ToLowerInvariant();
// When using Fireworks or Hugging Face, we must copy the model name:
if (this.DataLLMProvider is LLMProviders.FIREWORKS or LLMProviders.HUGGINGFACE)
if (this.DataLLMProvider.IsLLMModelProvidedManually())
this.dataManuallyModel = this.DataModel.Id;
//
@@ -241,7 +242,7 @@ public partial class ProviderDialog : MSGComponentBase, ISecretId
private string? ValidateManuallyModel(string manuallyModel)
{
if ((this.DataLLMProvider is LLMProviders.FIREWORKS or LLMProviders.HUGGINGFACE) && string.IsNullOrWhiteSpace(manuallyModel))
if (this.DataLLMProvider.IsLLMModelProvidedManually() && string.IsNullOrWhiteSpace(manuallyModel))
return T("Please enter a model name.");
return null;
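Both call sites above now route through `IsLLMModelProvidedManually()`, which centralizes the "model name is typed in manually" check instead of repeating the provider list. The extension itself is not in this excerpt; inferred from the two replaced conditions, it presumably reads roughly as follows (it may well cover additional providers in the real codebase):

```csharp
namespace AIStudio.Provider;

public static class LLMProvidersExtensions
{
    /// <summary>
    /// Whether the user must enter the model name manually for this provider.
    /// Sketch inferred from the replaced conditions in ProviderDialog; the
    /// actual extension may include more providers.
    /// </summary>
    public static bool IsLLMModelProvidedManually(this LLMProviders provider) =>
        provider is LLMProviders.FIREWORKS or LLMProviders.HUGGINGFACE;
}
```

With this in place, adding another manually-configured provider only requires touching the extension, not every dialog.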


@@ -31,7 +31,7 @@ public partial class Home : MSGComponentBase
{
this.itemsAdvantages = [
new(this.T("Free of charge"), this.T("The app is free to use, both for personal and commercial purposes.")),
new(this.T("Independence"), this.T("You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT4o, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.")),
new(this.T("Independence"), this.T("You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.")),
new(this.T("Assistants"), this.T("You just want to quickly translate a text? AI Studio has so-called assistants for such and other tasks. No prompting is necessary when working with these assistants.")),
new(this.T("Unrestricted usage"), this.T("Unlike services like ChatGPT, which impose limits after intensive use, MindWork AI Studio offers unlimited usage through the providers API.")),
new(this.T("Cost-effective"), this.T("You only pay for what you use, which can be cheaper than monthly subscription services like ChatGPT Plus, especially if used infrequently. But beware, here be dragons: For extremely intensive usage, the API costs can be significantly higher. Unfortunately, providers currently do not offer a way to display current costs in the app. Therefore, check your account with the respective provider to see how your costs are developing. When available, use prepaid and set a cost limit.")),


@@ -1335,6 +1335,9 @@ UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1603883875"] = "Ja, neu
-- Yes, remove it
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1820166585"] = "Ja, entferne es"
-- Number of sources
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1848978959"] = "Anzahl der Quellen"
-- Do you really want to edit this message? In order to edit this message, the AI response will be deleted.
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T2018431076"] = "Möchten Sie diese Nachricht wirklich bearbeiten? Um die Nachricht zu bearbeiten, wird die Antwort der KI gelöscht."
@@ -4617,6 +4620,9 @@ UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T1702902297"] = "Einführung"
-- Vision
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T1892426825"] = "Vision"
-- You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2183503084"] = "Sie sind an keinen einzelnen Anbieter gebunden. Stattdessen können Sie den Anbieter wählen, der am besten zu ihren Bedürfnissen passt. Derzeit unterstützen wir OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face und selbst gehostete Modelle mit vLLM, llama.cpp, ollama, LM Studio, Groq oder Fireworks. Für Wissenschaftler und Mitarbeiter von Forschungseinrichtungen unterstützen wir auch die KI-Dienste von Helmholtz und GWDG. Diese sind über föderierte Anmeldungen wie eduGAIN für alle 18 Helmholtz-Zentren, die Max-Planck-Gesellschaft, die meisten deutschen und viele internationale Universitäten verfügbar."
-- Let's get started
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2331588413"] = "Los geht's"
@@ -4626,9 +4632,6 @@ UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2348849647"] = "Letztes Änderungsproto
-- Choose the provider and model best suited for your current task.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2588488920"] = "Wählen Sie den Anbieter und das Modell aus, die am besten zu ihrer aktuellen Aufgabe passen."
-- You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2900280782"] = "Sie sind an keinen einzelnen Anbieter gebunden. Stattdessen können Sie den Anbieter wählen, der am besten zu ihren Bedürfnissen passt. Derzeit unterstützen wir OpenAI (GPT5, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face und selbst gehostete Modelle mit vLLM, llama.cpp, ollama, LM Studio, Groq oder Fireworks. Für Wissenschaftler und Mitarbeiter von Forschungseinrichtungen unterstützen wir auch die KI-Dienste von Helmholtz und GWDG. Diese sind über föderierte Anmeldungen wie eduGAIN für alle 18 Helmholtz-Zentren, die Max-Planck-Gesellschaft, die meisten deutschen und viele internationale Universitäten verfügbar."
-- Quick Start Guide
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T3002014720"] = "Schnellstart-Anleitung"
@@ -4863,6 +4866,9 @@ UI_TEXT_CONTENT["AISTUDIO::PROVIDER::LLMPROVIDERSEXTENSIONS::T3424652889"] = "Un
-- no model selected
UI_TEXT_CONTENT["AISTUDIO::PROVIDER::MODEL::T2234274832"] = "Kein Modell ausgewählt"
-- Sources
UI_TEXT_CONTENT["AISTUDIO::PROVIDER::SOURCEEXTENSIONS::T2730980305"] = "Quellen"
-- Use no chat template
UI_TEXT_CONTENT["AISTUDIO::SETTINGS::CHATTEMPLATE::T4258819635"] = "Keine Chat-Vorlage verwenden"


@@ -1335,6 +1335,9 @@ UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1603883875"] = "Yes, re
-- Yes, remove it
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1820166585"] = "Yes, remove it"
-- Number of sources
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T1848978959"] = "Number of sources"
-- Do you really want to edit this message? In order to edit this message, the AI response will be deleted.
UI_TEXT_CONTENT["AISTUDIO::CHAT::CONTENTBLOCKCOMPONENT::T2018431076"] = "Do you really want to edit this message? In order to edit this message, the AI response will be deleted."
@@ -4617,6 +4620,9 @@ UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T1702902297"] = "Introduction"
-- Vision
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T1892426825"] = "Vision"
-- You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2183503084"] = "You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Perplexity, Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities."
-- Let's get started
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2331588413"] = "Let's get started"
@@ -4626,9 +4632,6 @@ UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2348849647"] = "Last Changelog"
-- Choose the provider and model best suited for your current task.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2588488920"] = "Choose the provider and model best suited for your current task."
-- You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities.
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T2900280782"] = "You are not tied to any single provider. Instead, you might choose the provider that best suits your needs. Right now, we support OpenAI (GPT5, o1, etc.), Mistral, Anthropic (Claude), Google Gemini, xAI (Grok), DeepSeek, Alibaba Cloud (Qwen), Hugging Face, and self-hosted models using vLLM, llama.cpp, ollama, LM Studio, Groq, or Fireworks. For scientists and employees of research institutions, we also support Helmholtz and GWDG AI services. These are available through federated logins like eduGAIN to all 18 Helmholtz Centers, the Max Planck Society, most German, and many international universities."
-- Quick Start Guide
UI_TEXT_CONTENT["AISTUDIO::PAGES::HOME::T3002014720"] = "Quick Start Guide"
@@ -4863,6 +4866,9 @@ UI_TEXT_CONTENT["AISTUDIO::PROVIDER::LLMPROVIDERSEXTENSIONS::T3424652889"] = "Un
-- no model selected
UI_TEXT_CONTENT["AISTUDIO::PROVIDER::MODEL::T2234274832"] = "no model selected"
-- Sources
UI_TEXT_CONTENT["AISTUDIO::PROVIDER::SOURCEEXTENSIONS::T2730980305"] = "Sources"
-- Use no chat template
UI_TEXT_CONTENT["AISTUDIO::SETTINGS::CHATTEMPLATE::T4258819635"] = "Use no chat template"


@@ -21,7 +21,7 @@ public sealed class ProviderAlibabaCloud(ILogger logger) : BaseProvider("https:/
public override string InstanceName { get; set; } = "AlibabaCloud";
/// <inheritdoc />
public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
{
// Get the API key:
var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -18,7 +18,7 @@ public sealed class ProviderAnthropic(ILogger logger) : BaseProvider("https://ap
public override string InstanceName { get; set; } = "Anthropic";
/// <inheritdoc />
public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
{
// Get the API key:
var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -13,7 +13,7 @@ public readonly record struct ResponseStreamLine(string Type, int Index, Delta D
public bool ContainsContent() => this != default && !string.IsNullOrWhiteSpace(this.Delta.Text);
/// <inheritdoc />
public string GetContent() => this.Delta.Text;
public ContentStreamChunk GetContent() => new(this.Delta.Text, []);
}
/// <summary>


@@ -63,7 +63,7 @@ public abstract class BaseProvider : IProvider, ISecretId
public abstract string InstanceName { get; set; }
/// <inheritdoc />
public abstract IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, CancellationToken token = default);
public abstract IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, CancellationToken token = default);
/// <inheritdoc />
public abstract IAsyncEnumerable<ImageURL> StreamImageCompletion(Model imageModel, string promptPositive, string promptNegative = FilterOperator.String.Empty, ImageURL referenceImageURL = default, CancellationToken token = default);
@@ -96,7 +96,7 @@ public abstract class BaseProvider : IProvider, ISecretId
/// <param name="requestBuilder">A function that builds the request.</param>
/// <param name="token">The cancellation token.</param>
/// <returns>The status object of the request.</returns>
protected async Task<HttpRateLimitedStreamResult> SendRequest(Func<Task<HttpRequestMessage>> requestBuilder, CancellationToken token = default)
private async Task<HttpRateLimitedStreamResult> SendRequest(Func<Task<HttpRequestMessage>> requestBuilder, CancellationToken token = default)
{
const int MAX_RETRIES = 6;
const double RETRY_DELAY_SECONDS = 4;
@@ -189,7 +189,7 @@ public abstract class BaseProvider : IProvider, ISecretId
return new HttpRateLimitedStreamResult(true, false, string.Empty, response);
}
protected async IAsyncEnumerable<string> StreamChatCompletionInternal<T>(string providerName, Func<Task<HttpRequestMessage>> requestBuilder, [EnumeratorCancellation] CancellationToken token = default) where T : struct, IResponseStreamLine
protected async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletionInternal<T>(string providerName, Func<Task<HttpRequestMessage>> requestBuilder, [EnumeratorCancellation] CancellationToken token = default) where T : struct, IResponseStreamLine
{
StreamReader? streamReader = null;
try


@@ -0,0 +1,16 @@
namespace AIStudio.Provider;

/// <summary>
/// A chunk of content from a content stream, along with its associated sources.
/// </summary>
/// <param name="Content">The text content of the chunk.</param>
/// <param name="Sources">The list of sources associated with the chunk.</param>
public sealed record ContentStreamChunk(string Content, IList<ISource> Sources)
{
    /// <summary>
    /// Implicit conversion to string.
    /// </summary>
    /// <param name="chunk">The content stream chunk.</param>
    /// <returns>The text content of the chunk.</returns>
    public static implicit operator string(ContentStreamChunk chunk) => chunk.Content;
}


@@ -20,7 +20,7 @@ public sealed class ProviderDeepSeek(ILogger logger) : BaseProvider("https://api
     public override string InstanceName { get; set; } = "DeepSeek";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -19,7 +19,7 @@ public class ProviderFireworks(ILogger logger) : BaseProvider("https://api.firew
     public override string InstanceName { get; set; } = "Fireworks.ai";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -14,7 +14,7 @@ public readonly record struct ResponseStreamLine(string Id, string Object, uint
     public bool ContainsContent() => this != default && this.Choices.Count > 0;
     /// <inheritdoc />
-    public string GetContent() => this.Choices[0].Delta.Content;
+    public ContentStreamChunk GetContent() => new(this.Choices[0].Delta.Content, []);
 }
 /// <summary>


@@ -20,7 +20,7 @@ public sealed class ProviderGWDG(ILogger logger) : BaseProvider("https://chat-ai
     public override string InstanceName { get; set; } = "GWDG SAIA";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -20,7 +20,7 @@ public class ProviderGoogle(ILogger logger) : BaseProvider("https://generativela
     public override string InstanceName { get; set; } = "Google Gemini";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Provider.Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Provider.Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -20,7 +20,7 @@ public class ProviderGroq(ILogger logger) : BaseProvider("https://api.groq.com/o
     public override string InstanceName { get; set; } = "Groq";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -20,7 +20,7 @@ public sealed class ProviderHelmholtz(ILogger logger) : BaseProvider("https://ap
     public override string InstanceName { get; set; } = "Helmholtz Blablador";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -25,7 +25,7 @@ public sealed class ProviderHuggingFace : BaseProvider
     public override string InstanceName { get; set; } = "HuggingFace";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -27,7 +27,7 @@ public interface IProvider
     /// <param name="settingsManager">The settings manager instance to use.</param>
     /// <param name="token">The cancellation token.</param>
     /// <returns>The chat completion stream.</returns>
-    public IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, CancellationToken token = default);
+    public IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, CancellationToken token = default);
     /// <summary>
     /// Starts an image completion stream.


@@ -12,5 +12,17 @@ public interface IResponseStreamLine
     /// Gets the content of the response line.
     /// </summary>
     /// <returns>The content of the response line.</returns>
-    public string GetContent();
+    public ContentStreamChunk GetContent();
+
+    /// <summary>
+    /// Checks if the response line contains any sources.
+    /// </summary>
+    /// <returns>True when the response line contains sources, false otherwise.</returns>
+    public bool ContainsSources() => false;
+
+    /// <summary>
+    /// Gets the sources of the response line.
+    /// </summary>
+    /// <returns>The sources of the response line.</returns>
+    public IList<ISource> GetSources() => [];
 }
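Since `ContainsSources` and `GetSources` are default interface members, providers without citation support only need to implement the two content members. A hypothetical minimal implementer:

```csharp
// Hypothetical stream line type for a provider without citations:
// ContainsSources() and GetSources() fall back to the interface defaults
// (false and an empty list, respectively).
public readonly record struct PlainTextLine(string Text) : IResponseStreamLine
{
    public bool ContainsContent() => !string.IsNullOrWhiteSpace(this.Text);

    public ContentStreamChunk GetContent() => new(this.Text, []);
}
```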


@@ -0,0 +1,17 @@
namespace AIStudio.Provider;

/// <summary>
/// Data model for a source used in the response.
/// </summary>
public interface ISource
{
    /// <summary>
    /// The title of the source.
    /// </summary>
    public string Title { get; }

    /// <summary>
    /// The URL of the source.
    /// </summary>
    public string URL { get; }
}


@@ -14,6 +14,7 @@ public enum LLMProviders
     X = 8,
     DEEP_SEEK = 11,
     ALIBABA_CLOUD = 12,
+    PERPLEXITY = 14,
     FIREWORKS = 5,
     GROQ = 6,


@@ -9,6 +9,7 @@ using AIStudio.Provider.Helmholtz;
 using AIStudio.Provider.HuggingFace;
 using AIStudio.Provider.Mistral;
 using AIStudio.Provider.OpenAI;
+using AIStudio.Provider.Perplexity;
 using AIStudio.Provider.SelfHosted;
 using AIStudio.Provider.X;
 using AIStudio.Settings;
@@ -38,6 +39,7 @@ public static class LLMProvidersExtensions
         LLMProviders.X => "xAI",
         LLMProviders.DEEP_SEEK => "DeepSeek",
         LLMProviders.ALIBABA_CLOUD => "Alibaba Cloud",
+        LLMProviders.PERPLEXITY => "Perplexity",
         LLMProviders.GROQ => "Groq",
         LLMProviders.FIREWORKS => "Fireworks.ai",
@@ -86,6 +88,8 @@ public static class LLMProvidersExtensions
         LLMProviders.DEEP_SEEK => Confidence.CHINA_NO_TRAINING.WithRegion("Asia").WithSources("https://cdn.deepseek.com/policies/en-US/deepseek-open-platform-terms-of-service.html").WithLevel(settingsManager.GetConfiguredConfidenceLevel(llmProvider)),
         LLMProviders.ALIBABA_CLOUD => Confidence.CHINA_NO_TRAINING.WithRegion("Asia").WithSources("https://www.alibabacloud.com/help/en/model-studio/support/faq-about-alibaba-cloud-model-studio").WithLevel(settingsManager.GetConfiguredConfidenceLevel(llmProvider)),
+        LLMProviders.PERPLEXITY => Confidence.USA_NO_TRAINING.WithRegion("America, U.S.").WithSources("https://www.perplexity.ai/hub/legal/perplexity-api-terms-of-service").WithLevel(settingsManager.GetConfiguredConfidenceLevel(llmProvider)),
         LLMProviders.SELF_HOSTED => Confidence.SELF_HOSTED.WithLevel(settingsManager.GetConfiguredConfidenceLevel(llmProvider)),
@@ -121,6 +125,7 @@ public static class LLMProvidersExtensions
         LLMProviders.GWDG => false,
         LLMProviders.DEEP_SEEK => false,
         LLMProviders.HUGGINGFACE => false,
+        LLMProviders.PERPLEXITY => false,
         //
         // Self-hosted providers are treated as a special case anyway.
@@ -165,6 +170,7 @@ public static class LLMProvidersExtensions
         LLMProviders.X => new ProviderX(logger) { InstanceName = instanceName },
         LLMProviders.DEEP_SEEK => new ProviderDeepSeek(logger) { InstanceName = instanceName },
         LLMProviders.ALIBABA_CLOUD => new ProviderAlibabaCloud(logger) { InstanceName = instanceName },
+        LLMProviders.PERPLEXITY => new ProviderPerplexity(logger) { InstanceName = instanceName },
         LLMProviders.GROQ => new ProviderGroq(logger) { InstanceName = instanceName },
         LLMProviders.FIREWORKS => new ProviderFireworks(logger) { InstanceName = instanceName },
@@ -194,6 +200,7 @@ public static class LLMProvidersExtensions
         LLMProviders.X => "https://accounts.x.ai/sign-up",
         LLMProviders.DEEP_SEEK => "https://platform.deepseek.com/sign_up",
         LLMProviders.ALIBABA_CLOUD => "https://account.alibabacloud.com/register/intl_register.htm",
+        LLMProviders.PERPLEXITY => "https://www.perplexity.ai/account/api",
         LLMProviders.GROQ => "https://console.groq.com/",
         LLMProviders.FIREWORKS => "https://fireworks.ai/login",
@@ -216,6 +223,7 @@ public static class LLMProvidersExtensions
         LLMProviders.FIREWORKS => "https://fireworks.ai/account/billing",
         LLMProviders.DEEP_SEEK => "https://platform.deepseek.com/usage",
         LLMProviders.ALIBABA_CLOUD => "https://usercenter2-intl.aliyun.com/billing",
+        LLMProviders.PERPLEXITY => "https://www.perplexity.ai/account/api/",
         LLMProviders.HUGGINGFACE => "https://huggingface.co/settings/billing",
         _ => string.Empty,
@@ -232,6 +240,7 @@ public static class LLMProvidersExtensions
         LLMProviders.GOOGLE => true,
         LLMProviders.DEEP_SEEK => true,
         LLMProviders.ALIBABA_CLOUD => true,
+        LLMProviders.PERPLEXITY => true,
         LLMProviders.HUGGINGFACE => true,
         _ => false,
@@ -278,6 +287,7 @@ public static class LLMProvidersExtensions
         LLMProviders.X => true,
         LLMProviders.DEEP_SEEK => true,
         LLMProviders.ALIBABA_CLOUD => true,
+        LLMProviders.PERPLEXITY => true,
         LLMProviders.GROQ => true,
         LLMProviders.FIREWORKS => true,
@@ -299,6 +309,7 @@ public static class LLMProvidersExtensions
         LLMProviders.X => true,
         LLMProviders.DEEP_SEEK => true,
         LLMProviders.ALIBABA_CLOUD => true,
+        LLMProviders.PERPLEXITY => true,
         LLMProviders.GROQ => true,
         LLMProviders.FIREWORKS => true,


@@ -18,7 +18,7 @@ public sealed class ProviderMistral(ILogger logger) : BaseProvider("https://api.
     public override string InstanceName { get; set; } = "Mistral";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Provider.Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Provider.Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -19,7 +19,7 @@ public class NoProvider : IProvider
     public Task<IEnumerable<Model>> GetEmbeddingModels(string? apiKeyProvisional = null, CancellationToken token = default) => Task.FromResult<IEnumerable<Model>>([]);
-    public async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatChatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatChatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         await Task.FromResult(0);
         yield break;


@@ -22,7 +22,7 @@ public sealed class ProviderOpenAI(ILogger logger) : BaseProvider("https://api.o
     public override string InstanceName { get; set; } = "OpenAI";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -15,7 +15,7 @@ public readonly record struct ResponseStreamLine(string Id, string Object, uint
     public bool ContainsContent() => this != default && this.Choices.Count > 0;
     /// <inheritdoc />
-    public string GetContent() => this.Choices[0].Delta.Content;
+    public ContentStreamChunk GetContent() => new(this.Choices[0].Delta.Content, []);
 }
 /// <summary>


@@ -0,0 +1,148 @@
using System.Net.Http.Headers;
using System.Runtime.CompilerServices;
using System.Text;
using System.Text.Json;
using AIStudio.Chat;
using AIStudio.Provider.OpenAI;
using AIStudio.Settings;

namespace AIStudio.Provider.Perplexity;

public sealed class ProviderPerplexity(ILogger logger) : BaseProvider("https://api.perplexity.ai/", logger)
{
    private static readonly Model[] KNOWN_MODELS =
    [
        new("sonar", "Sonar"),
        new("sonar-pro", "Sonar Pro"),
        new("sonar-reasoning", "Sonar Reasoning"),
        new("sonar-reasoning-pro", "Sonar Reasoning Pro"),
        new("sonar-deep-research", "Sonar Deep Research"),
    ];

    #region Implementation of IProvider

    /// <inheritdoc />
    public override string Id => LLMProviders.PERPLEXITY.ToName();

    /// <inheritdoc />
    public override string InstanceName { get; set; } = "Perplexity";

    /// <inheritdoc />
    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
    {
        // Get the API key:
        var requestedSecret = await RUST_SERVICE.GetAPIKey(this);
        if(!requestedSecret.Success)
            yield break;

        // Prepare the system prompt:
        var systemPrompt = new Message
        {
            Role = "system",
            Content = chatThread.PrepareSystemPrompt(settingsManager, chatThread, this.logger),
        };

        // Prepare the Perplexity HTTP chat request:
        var perplexityChatRequest = JsonSerializer.Serialize(new ChatRequest
        {
            Model = chatModel.Id,

            // Build the messages:
            // - First of all the system prompt
            // - Then none-empty user and AI messages
            Messages = [systemPrompt, ..chatThread.Blocks.Where(n => n.ContentType is ContentType.TEXT && !string.IsNullOrWhiteSpace((n.Content as ContentText)?.Text)).Select(n => new Message
            {
                Role = n.Role switch
                {
                    ChatRole.USER => "user",
                    ChatRole.AI => "assistant",
                    ChatRole.AGENT => "assistant",
                    ChatRole.SYSTEM => "system",
                    _ => "user",
                },

                Content = n.Content switch
                {
                    ContentText text => text.Text,
                    _ => string.Empty,
                }
            }).ToList()],
            Stream = true,
        }, JSON_SERIALIZER_OPTIONS);

        async Task<HttpRequestMessage> RequestBuilder()
        {
            // Build the HTTP post request:
            var request = new HttpRequestMessage(HttpMethod.Post, "chat/completions");

            // Set the authorization header:
            request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", await requestedSecret.Secret.Decrypt(ENCRYPTION));

            // Set the content:
            request.Content = new StringContent(perplexityChatRequest, Encoding.UTF8, "application/json");
            return request;
        }

        await foreach (var content in this.StreamChatCompletionInternal<ResponseStreamLine>("Perplexity", RequestBuilder, token))
            yield return content;
    }

    #pragma warning disable CS1998 // Async method lacks 'await' operators and will run synchronously
    /// <inheritdoc />
    public override async IAsyncEnumerable<ImageURL> StreamImageCompletion(Model imageModel, string promptPositive, string promptNegative = FilterOperator.String.Empty, ImageURL referenceImageURL = default, [EnumeratorCancellation] CancellationToken token = default)
    {
        yield break;
    }
    #pragma warning restore CS1998 // Async method lacks 'await' operators and will run synchronously

    /// <inheritdoc />
    public override Task<IEnumerable<Model>> GetTextModels(string? apiKeyProvisional = null, CancellationToken token = default)
    {
        return this.LoadModels();
    }

    /// <inheritdoc />
    public override Task<IEnumerable<Model>> GetImageModels(string? apiKeyProvisional = null, CancellationToken token = default)
    {
        return Task.FromResult(Enumerable.Empty<Model>());
    }

    /// <inheritdoc />
    public override Task<IEnumerable<Model>> GetEmbeddingModels(string? apiKeyProvisional = null, CancellationToken token = default)
    {
        return Task.FromResult(Enumerable.Empty<Model>());
    }

    public override IReadOnlyCollection<Capability> GetModelCapabilities(Model model)
    {
        var modelName = model.Id.ToLowerInvariant().AsSpan();
        if(modelName.IndexOf("reasoning") is not -1 ||
           modelName.IndexOf("deep-research") is not -1)
            return
            [
                Capability.TEXT_INPUT,
                Capability.MULTIPLE_IMAGE_INPUT,
                Capability.TEXT_OUTPUT,
                Capability.IMAGE_OUTPUT,
                Capability.ALWAYS_REASONING,
            ];

        return
        [
            Capability.TEXT_INPUT,
            Capability.MULTIPLE_IMAGE_INPUT,
            Capability.TEXT_OUTPUT,
            Capability.IMAGE_OUTPUT,
        ];
    }

    #endregion

    private Task<IEnumerable<Model>> LoadModels() => Task.FromResult<IEnumerable<Model>>(KNOWN_MODELS);
}


@@ -0,0 +1,45 @@
namespace AIStudio.Provider.Perplexity;

/// <summary>
/// Data model for a line in the response stream, for streaming completions.
/// </summary>
/// <param name="Id">The id of the response.</param>
/// <param name="Object">The object describing the response.</param>
/// <param name="Created">The timestamp of the response.</param>
/// <param name="Model">The model used for the response.</param>
/// <param name="SystemFingerprint">The system fingerprint; together with the seed, this allows you to reproduce the response.</param>
/// <param name="Choices">The choices made by the AI.</param>
/// <param name="SearchResults">The search results used as sources for the response.</param>
public readonly record struct ResponseStreamLine(string Id, string Object, uint Created, string Model, string SystemFingerprint, IList<Choice> Choices, IList<SearchResult> SearchResults) : IResponseStreamLine
{
    /// <inheritdoc />
    public bool ContainsContent() => this != default && this.Choices.Count > 0;

    /// <inheritdoc />
    public ContentStreamChunk GetContent() => new(this.Choices[0].Delta.Content, this.GetSources());

    /// <inheritdoc />
    public bool ContainsSources() => this != default && this.SearchResults.Count > 0;

    /// <inheritdoc />
    public IList<ISource> GetSources() => this.SearchResults.Cast<ISource>().ToList();
}

/// <summary>
/// Data model for a choice made by the AI.
/// </summary>
/// <param name="Index">The index of the choice.</param>
/// <param name="Delta">The delta text of the choice.</param>
public readonly record struct Choice(int Index, Delta Delta);

/// <summary>
/// The delta text of a choice.
/// </summary>
/// <param name="Content">The content of the delta text.</param>
public readonly record struct Delta(string Content);

/// <summary>
/// Data model for a search result.
/// </summary>
/// <param name="Title">The title of the search result.</param>
/// <param name="URL">The URL of the search result.</param>
public sealed record SearchResult(string Title, string URL) : Source(Title, URL);
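For illustration, this is how a parsed Perplexity stream line carries both the delta text and its search results (all values below are made up; in the app, instances are deserialized from the SSE stream by `StreamChatCompletionInternal<T>`):

```csharp
// Hypothetical, hand-constructed stream line:
var line = new ResponseStreamLine(
    Id: "resp-123",
    Object: "chat.completion.chunk",
    Created: 1725100000,
    Model: "sonar",
    SystemFingerprint: "fp-demo",
    Choices: [new Choice(0, new Delta("Partial answer text"))],
    SearchResults: [new SearchResult("Example Source", "https://example.com/")]);

var chunk = line.GetContent();
// chunk.Content holds the delta text; chunk.Sources holds one ISource entry.
```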


@@ -18,7 +18,7 @@ public sealed class ProviderSelfHosted(ILogger logger, Host host, string hostnam
     public override string InstanceName { get; set; } = "Self-hosted";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Provider.Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Provider.Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this, isTrying: true);


@@ -0,0 +1,8 @@
namespace AIStudio.Provider;

/// <summary>
/// Data model for a source used in the response.
/// </summary>
/// <param name="Title">The title of the source.</param>
/// <param name="URL">The URL of the source.</param>
public record Source(string Title, string URL) : ISource;


@@ -0,0 +1,47 @@
using System.Text;
using AIStudio.Tools.PluginSystem;

namespace AIStudio.Provider;

public static class SourceExtensions
{
    private static string TB(string fallbackEN) => I18N.I.T(fallbackEN, typeof(SourceExtensions).Namespace, nameof(SourceExtensions));

    /// <summary>
    /// Converts a list of sources to a markdown-formatted string.
    /// </summary>
    /// <param name="sources">The list of sources to convert.</param>
    /// <returns>A markdown-formatted string representing the sources.</returns>
    public static string ToMarkdown(this IList<Source> sources)
    {
        var sb = new StringBuilder();
        sb.Append("## ");
        sb.AppendLine(TB("Sources"));

        var sourceNum = 0;
        foreach (var source in sources)
        {
            sb.Append($"- [{++sourceNum}] ");
            sb.Append('[');
            sb.Append(source.Title);
            sb.Append("](");
            sb.Append(source.URL);
            sb.AppendLine(")");
        }

        return sb.ToString();
    }

    /// <summary>
    /// Merges a list of added sources into an existing list of sources, avoiding duplicates based on URL and Title.
    /// </summary>
    /// <param name="sources">The existing list of sources to merge into.</param>
    /// <param name="addedSources">The list of sources to add.</param>
    public static void MergeSources(this IList<Source> sources, IList<ISource> addedSources)
    {
        foreach (var addedSource in addedSources)
            if (sources.All(s => s.URL != addedSource.URL && s.Title != addedSource.Title))
                sources.Add((Source)addedSource);
    }
}
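Putting the pieces together, a consumer could accumulate sources while chunks stream in and render them once at the end. A rough sketch (the variable names are illustrative, and `ToMarkdown` resolves its "Sources" heading through the I18N plugin system):

```csharp
// Hypothetical accumulation over streamed chunks:
var collected = new List<Source>();

IList<ISource> fromChunk = [new Source("Example Source", "https://example.com/")];
collected.MergeSources(fromChunk);
collected.MergeSources(fromChunk); // second merge is a no-op: same URL and title

// Render the block to append below the assistant's answer, e.g.:
// ## Sources
// - [1] [Example Source](https://example.com/)
var markdown = collected.ToMarkdown();
```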


@@ -20,7 +20,7 @@ public sealed class ProviderX(ILogger logger) : BaseProvider("https://api.x.ai/v
     public override string InstanceName { get; set; } = "xAI";
     /// <inheritdoc />
-    public override async IAsyncEnumerable<string> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
+    public override async IAsyncEnumerable<ContentStreamChunk> StreamChatCompletion(Model chatModel, ChatThread chatThread, SettingsManager settingsManager, [EnumeratorCancellation] CancellationToken token = default)
     {
         // Get the API key:
         var requestedSecret = await RUST_SERVICE.GetAPIKey(this);


@@ -140,4 +140,9 @@
 .no-elevation {
     box-shadow: none !important;
 }
+
+.sources-card-header {
+    top: 0em !important;
+    left: 2.2em !important;
+}


@@ -2,6 +2,8 @@
 - Added support for predefined chat templates in configuration plugins to help enterprises roll out consistent templates across the organization.
 - Added the ability to choose between automatic and manual update installation to the app settings (default is manual).
 - Added the ability to control the update installation behavior by configuration plugins.
+- Added the option for LLM providers to stream citations or sources.
+- Added support for citations to the chat interface. This feature is invisible unless an LLM model is streaming citations or sources.
 - Improved memory usage in several areas of the app.
 - Improved plugin management for configuration plugins so that hot reload detects when a provider or chat template has been removed.
 - Improved the dialog for naming chats and workspaces to ensure valid inputs are entered.