Fixed OpenAI Responses API by not discarding whitespace (#661)

Co-authored-by: Thorsten Sommer <SommerEngineering@users.noreply.github.com>
Oliver Kunc 2026-02-13 21:37:36 +01:00 committed by GitHub
parent ea4e3f0199
commit 48f8cb3285
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
2 changed files with 4 additions and 3 deletions


@@ -7,15 +7,15 @@ namespace AIStudio.Provider.OpenAI;
 /// <param name="Delta">The delta content of the response.</param>
 public record ResponsesDeltaStreamLine(
     string Type,
-    string Delta) : IResponseStreamLine
+    string? Delta) : IResponseStreamLine
 {
     #region Implementation of IResponseStreamLine
 
     /// <inheritdoc />
-    public bool ContainsContent() => !string.IsNullOrWhiteSpace(this.Delta);
+    public bool ContainsContent() => this.Delta is not null;
 
     /// <inheritdoc />
-    public ContentStreamChunk GetContent() => new(this.Delta, this.GetSources());
+    public ContentStreamChunk GetContent() => new(this.Delta ?? string.Empty, this.GetSources());
 
     //
     // Please note that there are multiple options where LLM providers might stream sources:
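The effect of this change, as a minimal standalone sketch (only the filtering logic is shown; `ContentStreamChunk`, `IResponseStreamLine`, and AI Studio's streaming loop are left out): with the old `IsNullOrWhiteSpace` check, a delta consisting only of a space or a newline was treated as "no content" and dropped, so adjacent tokens were glued together. With the new null check, such whitespace-only deltas pass through unchanged.

using System;
using System.Linq;

// Minimal sketch of the behavioral difference, not the AI Studio implementation:
// a few streamed deltas, including whitespace-only chunks between tokens.
string?[] deltas = { "Hello", ",", " ", "world", "\n" };

// Old check: whitespace-only chunks (" ", "\n") are considered empty and skipped.
var oldResult = string.Concat(deltas.Where(d => !string.IsNullOrWhiteSpace(d)));

// New check: every non-null delta counts as content, so spacing and line breaks survive.
var newResult = string.Concat(deltas.Where(d => d is not null));

Console.WriteLine(oldResult); // Hello,world
Console.WriteLine(newResult); // Hello, world  (followed by the streamed newline)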


@@ -7,4 +7,5 @@
 - Improved the workspaces experience by using a different color for the delete button to avoid confusion.
 - Improved the plugins page by adding an action to open the plugin source link. The action opens website URLs in an external browser and supports `mailto:` links for direct email composition.
 - Fixed an issue where manually saving chats in workspace manual-storage mode could appear unreliable during response streaming. The save button is now disabled while streaming to prevent partial saves.
+- Fixed a bug in the Responses API of our OpenAI provider implementation where streamed whitespace-only chunks were discarded. We thank Oliver Kunc (`OliverKunc`) for resolving this issue in his first contribution. We appreciate your help, Oliver.
 - Upgraded dependencies.