# Platform - 0.65.0
## Highlights
- Major refactor of the AI Chats API to improve reliability, extensibility, and alignment with the OpenAI API
- Streaming AI responses now supported for real-time content generation
- Backend now supports feature flags for safer rollouts
- New APIs added to update job names and notes
## New Features

- **Streaming AI responses**: AI-generated content can now be streamed as it is produced, enabling more responsive user experiences.
- **Client + server tool support**: The AI API now supports client-side tools in addition to server-side tools, unlocking more flexible AI workflows.
- **Job metadata API**: New endpoints allow updating job names and notes.
- **Backend feature flags**: Feature flag support has been added to the platform (in addition to the Workshop), allowing features to be progressively enabled or disabled without redeploying.
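As a rough illustration of the streaming feature, the sketch below shows the consumption pattern: an async iterable of text chunks assembled incrementally. The `StreamChunk` shape and `streamCompletion` function are assumptions for demonstration, not the actual client API; here the stream is simulated locally.

```typescript
// Hypothetical sketch -- the real client API and chunk shape may differ.
interface StreamChunk {
  delta: string; // the next piece of generated text
  done: boolean; // true on the final chunk
}

// Stand-in for the platform's streaming endpoint (assumption):
// simulates a streamed AI response as an async iterable of chunks.
async function* streamCompletion(_prompt: string): AsyncGenerator<StreamChunk> {
  const pieces = ["Stream", "ing ", "works"];
  for (let i = 0; i < pieces.length; i++) {
    yield { delta: pieces[i], done: i === pieces.length - 1 };
  }
}

// Consume the stream, accumulating text as chunks arrive.
// A real UI would render each delta incrementally instead.
async function collectStream(prompt: string): Promise<string> {
  let text = "";
  for await (const chunk of streamCompletion(prompt)) {
    text += chunk.delta;
  }
  return text;
}
```

The same `for await` loop works whether the chunks come from a local generator, a fetch response body, or a server-sent-events wrapper.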
## Improvements

- **AI API aligned with OpenAI's canonical format**: The AI Chats API now uses a messages/roles structure consistent with the OpenAI API, simplifying integrations and future compatibility.
- **Improved editor context handling**: Editor context is now explicitly passed to the model and correctly processed.
- **Usage tracking per conversation**
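A minimal sketch of what an OpenAI-style messages/roles request looks like. The `ChatRequest` type and field names are assumptions for illustration; consult the actual API contracts for the real shapes.

```typescript
// Hypothetical request shape, assuming the OpenAI-style messages format.
type Role = "system" | "user" | "assistant" | "tool";

interface ChatMessage {
  role: Role;
  content: string;
}

interface ChatRequest {
  messages: ChatMessage[];
  stream?: boolean; // opt in to streaming responses
}

// The system prompt is carried as the first message in the ordered list
// rather than as a separate field.
const exampleRequest: ChatRequest = {
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Summarize this job's notes." },
  ],
  stream: true,
};
```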
## Fixes
- Streaming endpoints now return errors correctly instead of silently failing.
- Improved request lifecycle handling to prevent orphaned or hanging AI requests.
## Internal / Technical Changes
- Refactored AI tools container injection into a dedicated module.
- Added graceful cancellation of in-flight AI requests when the API shuts down (via `AsyncDisposable` callbacks).
- Client package updated to match the new AI API behavior and contracts.
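The graceful-cancellation idea can be sketched as below. The actual implementation uses `AsyncDisposable` callbacks; this approximation uses an `AbortController` registry with a plain `dispose()` method, and the `AiRequestTracker` name is invented for illustration.

```typescript
// Sketch only: tracks in-flight requests so shutdown can cancel them all.
class AiRequestTracker {
  private controllers = new Set<AbortController>();

  // Register a new in-flight request; its signal is passed to the request.
  track(): AbortController {
    const controller = new AbortController();
    this.controllers.add(controller);
    return controller;
  }

  // Unregister a request that completed normally.
  done(controller: AbortController): void {
    this.controllers.delete(controller);
  }

  // Called on API shutdown (the AsyncDisposable callback in the real code):
  // abort everything still in flight so no request is left orphaned.
  async dispose(): Promise<void> {
    for (const controller of this.controllers) {
      controller.abort();
    }
    this.controllers.clear();
  }
}
```

Tying cancellation to shutdown this way is what prevents the orphaned or hanging AI requests mentioned under Fixes.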
## Breaking Changes
- The AI Chats API has been significantly refactored:
  - Request and response formats have changed
  - System prompt behavior is different
  - Streaming behavior is now first-class
- **Warning:** This will break the existing AI Chat feature in production until the Workshop is updated.
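A purely illustrative before/after migration sketch. Both shapes here are assumptions (the previous format in particular is hypothetical); consult the actual API contracts when updating the Workshop.

```typescript
// Old style (hypothetical): a single prompt string, implicit system prompt.
const legacyRequest = {
  prompt: "Summarize this job's notes.",
};

// New style: explicit role-tagged messages, consistent with the OpenAI API,
// with streaming opted into on the request itself.
const migratedRequest = {
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: legacyRequest.prompt },
  ],
  stream: true,
};
```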