Platform - 0.65.0

🚀 Highlights

  • Major refactor of the AI Chats API to improve reliability, extensibility, and alignment with the OpenAI API
  • Streaming AI responses now supported for real-time content generation
  • Backend now supports feature flags for safer rollouts
  • New APIs added to update job names and notes

✨ New Features

  • Streaming AI responses
    AI-generated content can now be streamed as it is produced, enabling more responsive user experiences.
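
    As a sketch of consuming such a stream, assuming an SSE-style framing with `data:` lines and a `delta` field (illustrative assumptions, not the documented wire format):

    ```typescript
    // Sketch: accumulate streamed AI output from SSE-style "data:" lines.
    // The "data:" framing, "[DONE]" sentinel, and "delta" field are assumptions.
    export function parseSseChunk(chunk: string): string[] {
      return chunk
        .split("\n")
        .filter((line) => line.startsWith("data: "))
        .map((line) => line.slice("data: ".length))
        .filter((payload) => payload !== "[DONE]");
    }

    // Collect deltas from an async iterable of raw chunks into the full message.
    export async function collectStream(
      chunks: AsyncIterable<string>
    ): Promise<string> {
      let text = "";
      for await (const chunk of chunks) {
        for (const payload of parseSseChunk(chunk)) {
          const event = JSON.parse(payload) as { delta?: string };
          text += event.delta ?? "";
        }
      }
      return text;
    }
    ```

    Any streaming-capable HTTP client can feed raw chunks into `collectStream` as they arrive, rendering partial text to the user along the way.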

  • Client + server tool support
    The AI API now supports client-side tools in addition to server-side tools, unlocking more flexible AI workflows.
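
    A minimal sketch of what client-side tool dispatch could look like, assuming a hypothetical registry and an OpenAI-style tool-call shape (none of these names are the platform's actual contract):

    ```typescript
    // Sketch of client-side tool dispatch. The ToolCall shape and registry
    // API are illustrative assumptions, not the platform's contract.
    type ToolCall = { name: string; args: Record<string, unknown> };
    type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

    const clientTools = new Map<string, ToolHandler>();

    export function registerClientTool(name: string, handler: ToolHandler): void {
      clientTools.set(name, handler);
    }

    // When a response contains a tool call, run it locally and return the
    // result so it can be sent back to the model as a tool-role message.
    export async function dispatchToolCall(call: ToolCall): Promise<string> {
      const handler = clientTools.get(call.name);
      if (!handler) throw new Error(`Unknown client tool: ${call.name}`);
      return handler(call.args);
    }
    ```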

  • Job metadata API
    New endpoints allow updating job names and notes.
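
    For illustration, a hypothetical request builder for these endpoints (the `/api/jobs/...` path, `PATCH` method, and field names are assumptions):

    ```typescript
    // Sketch: building a job-metadata update. The endpoint path, HTTP method,
    // and field names are assumptions for illustration only.
    export interface JobMetadataUpdate {
      name?: string;
      notes?: string;
    }

    export function buildJobUpdateRequest(
      jobId: string,
      update: JobMetadataUpdate
    ): { url: string; method: string; body: string } {
      return {
        url: `/api/jobs/${encodeURIComponent(jobId)}`,
        method: "PATCH",
        body: JSON.stringify(update),
      };
    }
    ```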

  • Backend feature flags
    Feature flag support added in the platform (in addition to the Workshop) to progressively enable or disable features without redeploying.
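
    A minimal sketch of flag-gated code, assuming a hypothetical in-memory store; the real platform presumably resolves flags from configuration rather than a hard-coded map:

    ```typescript
    // Sketch of a backend feature-flag check. Flag names and the store
    // interface are hypothetical, not the platform's actual API.
    export interface FlagStore {
      isEnabled(flag: string): boolean;
    }

    export class InMemoryFlagStore implements FlagStore {
      constructor(private flags: Record<string, boolean> = {}) {}
      isEnabled(flag: string): boolean {
        return this.flags[flag] ?? false; // unknown flags default to off
      }
    }

    // Gate a code path without redeploying: flip the flag in configuration.
    export function withFlag<T>(
      store: FlagStore,
      flag: string,
      on: () => T,
      off: () => T
    ): T {
      return store.isEnabled(flag) ? on() : off();
    }
    ```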


🛠 Improvements

  • AI API aligned with OpenAI’s canonical format
    The AI Chats API now uses a messages/roles structure consistent with the OpenAI API, simplifying integrations and future compatibility.
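
    A sketch of the messages/roles request shape, assuming it mirrors the OpenAI chat format (the model id, builder function, and any extra platform fields are illustrative):

    ```typescript
    // Sketch of an OpenAI-style messages/roles request. Field names follow
    // the OpenAI chat format; the model id here is a placeholder.
    type Role = "system" | "user" | "assistant" | "tool";

    export interface ChatMessage {
      role: Role;
      content: string;
    }

    export interface ChatRequest {
      model: string;
      messages: ChatMessage[];
      stream?: boolean;
    }

    export function buildChatRequest(
      system: string,
      user: string,
      stream = false
    ): ChatRequest {
      return {
        model: "platform-default", // placeholder, not a real model id
        messages: [
          { role: "system", content: system },
          { role: "user", content: user },
        ],
        stream,
      };
    }
    ```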

  • Improved editor context handling
    Editor context is now explicitly passed and correctly processed by the model.

  • Usage tracking per conversation
    Usage is now tracked on a per-conversation basis.


πŸ› Fixes​

  • Streaming endpoints now return errors correctly instead of silently failing.
  • Improved request lifecycle handling to prevent orphaned or hanging AI requests.

βš™οΈ Internal / Technical Changes​

  • Refactored AI tools container injection into a dedicated module.
  • Added graceful cancellation of in-flight AI requests when the API shuts down (via AsyncDisposable callbacks).
  • Client package updated to match the new AI API behavior and contracts.
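
    The cancellation behavior can be sketched with an `AbortController` tracked per request and aborted on shutdown; the `RequestTracker` name and API are hypothetical, standing in for the AsyncDisposable callbacks mentioned above:

    ```typescript
    // Sketch of graceful shutdown for in-flight AI requests: each request gets
    // an AbortController, and dispose() aborts whatever is still running.
    // The class name and API are illustrative, not the platform's internals.
    export class RequestTracker {
      private inflight = new Set<AbortController>();

      // Wrap a request so it is tracked and can be cancelled on shutdown.
      async run<T>(fn: (signal: AbortSignal) => Promise<T>): Promise<T> {
        const controller = new AbortController();
        this.inflight.add(controller);
        try {
          return await fn(controller.signal);
        } finally {
          this.inflight.delete(controller);
        }
      }

      // Called when the API shuts down (e.g. from an AsyncDisposable callback).
      async dispose(): Promise<void> {
        for (const controller of this.inflight) controller.abort();
        this.inflight.clear();
      }
    }
    ```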

⚠️ Breaking Changes

  • AI Chats API has been significantly refactored
    • Request and response formats have changed
    • System prompt behavior is different
    • Streaming behavior is now first-class

⚠️ This will break the existing AI Chat feature in production until the Workshop is updated.