Changelog
All notable changes to TurboStream will be documented in this file. The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
Version 0.3.1
Released on February 8, 2026
Added
Modern API Documentation
- Replaced Swagger UI with Scalar for modern, interactive API documentation
- New documentation endpoint at http://localhost:7210/docs
- Enhanced request/response visualization with better schema examples
- Improved browser compatibility with ad-blocker support
- Separate Content Security Policy for documentation and API endpoints
Enhanced Topic Extraction
- Root-level array indexing: [3] extracts the element at index 3 from root arrays
- Nested array access: data.path[-1] accesses the last element in nested structures
- Combined path expressions: field[0].subfield for complex data navigation (see the sketch after this list)
- Comprehensive logging for extraction debugging and troubleshooting
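As an illustration of the path syntax above, a minimal extractor over decoded JSON could look like the Go sketch below. The function name, error messages, and handling details are assumptions for illustration only, not TurboStream's actual extractor.

```go
package main

import (
	"encoding/json"
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

// segmentRe splits one path segment into an optional field name plus its
// [index] suffixes, e.g. "field[0]", "[3]", "path[-1]".
var segmentRe = regexp.MustCompile(`^([^\[\]]*)((?:\[-?\d+\])*)$`)

// indexRe pulls the individual indices out of the bracket suffix.
var indexRe = regexp.MustCompile(`\[(-?\d+)\]`)

// extractTopic resolves a path expression such as "[3]", "data.path[-1]" or
// "field[0].subfield" against decoded JSON. Negative indices count from the end.
func extractTopic(data interface{}, path string) (interface{}, error) {
	current := data
	for _, seg := range strings.Split(path, ".") {
		m := segmentRe.FindStringSubmatch(seg)
		if m == nil {
			return nil, fmt.Errorf("invalid path segment %q", seg)
		}
		if name := m[1]; name != "" {
			obj, ok := current.(map[string]interface{})
			if !ok {
				return nil, fmt.Errorf("expected object before %q", name)
			}
			if current, ok = obj[name]; !ok {
				return nil, fmt.Errorf("missing field %q", name)
			}
		}
		for _, idx := range indexRe.FindAllStringSubmatch(m[2], -1) {
			arr, ok := current.([]interface{})
			if !ok {
				return nil, fmt.Errorf("expected array in segment %q", seg)
			}
			i, _ := strconv.Atoi(idx[1])
			if i < 0 {
				i += len(arr) // negative index counts from the end
			}
			if i < 0 || i >= len(arr) {
				return nil, fmt.Errorf("index %s out of range in %q", idx[1], seg)
			}
			current = arr[i]
		}
	}
	return current, nil
}

func main() {
	var msg interface{}
	_ = json.Unmarshal([]byte(`{"data":{"path":["alpha","beta","gamma"]}}`), &msg)
	topic, err := extractTopic(msg, "data.path[-1]")
	fmt.Println(topic, err) // gamma <nil>
}
```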
API Improvements
- Complete OpenAPI specification for API key management endpoints
- Documented create, list, and revoke API key operations
- Consolidated topic prompt configuration in feed model
- Enhanced WebSocket authentication logging
Changed
Documentation Updates
- Renamed swagger.json to openapi.json for clarity
- Updated all documentation references and endpoints
- Added browser compatibility notes for ad-blockers
- Improved API reference structure and examples
Security Headers
- Context-aware Content Security Policy (CSP) enforcement (sketched below)
- Permissive CSP for documentation pages to enable API testing
- Strict CSP for API endpoints to prevent XSS and injection attacks
- Enhanced frame protection and content type validation
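A context-aware CSP like the one described above can be pictured as a small net/http middleware that switches policies on the request path. This is a hedged sketch: the policy strings, the /docs prefix, and the handler wiring are assumptions, not the project's actual middleware.

```go
package main

import (
	"net/http"
	"strings"
)

const (
	// Permissive policy so the interactive docs UI can load its scripts and call the API.
	docsCSP = "default-src 'self'; script-src 'self' 'unsafe-inline'; connect-src 'self'"
	// Strict policy for API responses, which never need to execute scripts.
	apiCSP = "default-src 'none'; frame-ancestors 'none'"
)

// securityHeaders picks a Content Security Policy based on whether the request
// targets the documentation UI or an API endpoint, and adds basic frame and
// content-type protections either way.
func securityHeaders(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if strings.HasPrefix(r.URL.Path, "/docs") {
			w.Header().Set("Content-Security-Policy", docsCSP)
		} else {
			w.Header().Set("Content-Security-Policy", apiCSP)
			w.Header().Set("X-Frame-Options", "DENY")
		}
		w.Header().Set("X-Content-Type-Options", "nosniff")
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/docs", func(w http.ResponseWriter, r *http.Request) { w.Write([]byte("docs UI")) })
	mux.HandleFunc("/api/feeds", func(w http.ResponseWriter, r *http.Request) { w.Write([]byte("[]")) })
	http.ListenAndServe(":7210", securityHeaders(mux))
}
```

Splitting the policy this way keeps API responses locked down while still letting the documentation page execute the scripts it needs for in-browser API testing.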
Fixed
- Fixed Content Security Policy blocking Scalar documentation functionality
- Resolved MongoDB index conflict errors during API key service initialization
- Corrected security header middleware for documentation pages
- Fixed topic extraction for root-level array data structures
Removed
- Removed redundant topic-prompt management endpoints (replaced by topicPrompts map)
- Consolidated deprecated route handlers
Version 0.3.0
Released on January 30, 2026
Added
Multi-Topic Architecture
- Topic-based message routing using configurable field identifiers
- Topic Router service for message distribution to topic-specific channels
- Per-topic context store with independent rolling windows (50 entries default)
- Per-topic analysis memory with conversation continuity (3 Q&A pairs default)
- Compound key architecture (feedID:topic) for complete data isolation (sketched below)
- Support for unlimited topics per feed through a single WebSocket connection
- Topic subscription filtering for selective intelligence delivery
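A minimal sketch of the compound key pattern, assuming a simple map-backed store; the names and sample data are illustrative only, with just the feedID:topic format taken from the entry above.

```go
package main

import "fmt"

// contextKey builds the compound key used for per-topic isolation,
// e.g. "feed-42:BTC-USD".
func contextKey(feedID, topic string) string {
	return feedID + ":" + topic
}

func main() {
	// Two topics on the same feed land in separate buckets, so their
	// context windows and analysis memory never mix.
	store := map[string][]string{}
	store[contextKey("feed-42", "BTC-USD")] = append(store[contextKey("feed-42", "BTC-USD")], `{"price":67250}`)
	store[contextKey("feed-42", "ETH-USD")] = append(store[contextKey("feed-42", "ETH-USD")], `{"price":3490}`)
	fmt.Println(len(store)) // 2 isolated buckets
}
```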
Stateful Intelligence System
- Continuous per-topic LLM analysis with configurable intervals (default: 10s)
- Topic LLM Scheduler service for automated query loop management (sketched below)
- State memory service with rolling window and state token compression
- Analysis memory for conversation continuity across queries
- TSLN (Time-Series Lean Notation) format integration for 40-60% token reduction
- Thread-safe concurrent access to context and memory stores
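The automated query loop can be imagined as one goroutine per feedID:topic pair driven by a time.Ticker at the configured interval (10s by default, per the entry above). This is a minimal sketch under that assumption; the analyze callback stands in for the real LLM query plus context and memory handling.

```go
package main

import (
	"context"
	"fmt"
	"time"
)

// topicScheduler runs a continuous analysis loop for a single feedID:topic pair
// until the context is cancelled (feed stopped or topic unsubscribed).
func topicScheduler(ctx context.Context, feedID, topic string, interval time.Duration,
	analyze func(feedID, topic string)) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			analyze(feedID, topic)
		}
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	go topicScheduler(ctx, "feed-42", "BTC-USD", time.Second, func(feedID, topic string) {
		fmt.Println("analyzing", feedID+":"+topic)
	})
	<-ctx.Done()
}
```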
Intelligence Broadcasting
- LLM intelligence delivery to consumers (not raw feed data)
- Topic-specific WebSocket room broadcasting
- Real-time intelligence streaming with the llm-intelligence message type (example payload below)
- Callback-based architecture for flexible delivery mechanisms
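For orientation, a consumer-side payload could be modelled roughly as below. Only the llm-intelligence message type comes from the entry above; every other field name and value is an assumption for illustration.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// intelligenceMessage is an illustrative shape for what a consumer might receive
// on a topic-specific WebSocket room.
type intelligenceMessage struct {
	Type      string    `json:"type"` // "llm-intelligence"
	FeedID    string    `json:"feedId"`
	Topic     string    `json:"topic"`
	Analysis  string    `json:"analysis"`
	Timestamp time.Time `json:"timestamp"`
}

func main() {
	msg := intelligenceMessage{
		Type:      "llm-intelligence",
		FeedID:    "feed-42",
		Topic:     "BTC-USD",
		Analysis:  "Price momentum is slowing over the last window.",
		Timestamp: time.Now().UTC(),
	}
	out, _ := json.MarshalIndent(msg, "", "  ")
	fmt.Println(string(out))
}
```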
API-First Foundation
- REST API framework for feed management
- WebSocket API for real-time intelligence streaming
- API key authentication infrastructure (sketched below)
- Multi-tenant namespace isolation
- Usage tracking and metering foundation
- Rate limiting framework
- Comprehensive API documentation at https://turboline.ai/docs/api
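A rough sketch of how an API key layer might sit in front of the REST endpoints, assuming an X-API-Key header and an in-memory key-to-namespace map; neither the header name nor the lookup is confirmed by this changelog, and a real deployment would back this with the database plus metering and rate limits.

```go
package main

import "net/http"

// validKeys stands in for the API key store; in practice keys would be looked up
// in the database and mapped to a customer namespace.
var validKeys = map[string]string{"ts_demo_key": "customer-a"}

// requireAPIKey rejects requests without a known API key and tags the request
// with the resolved namespace for downstream handlers.
func requireAPIKey(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ns, ok := validKeys[r.Header.Get("X-API-Key")]
		if !ok {
			http.Error(w, "invalid or missing API key", http.StatusUnauthorized)
			return
		}
		r.Header.Set("X-Namespace", ns) // usage tracking and rate limiting would key off this
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/api/feeds", func(w http.ResponseWriter, r *http.Request) { w.Write([]byte("[]")) })
	http.ListenAndServe(":7210", requireAPIKey(mux))
}
```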
Documentation
- Complete architecture documentation suite
- Per-topic implementation guides
- Client integration guides
- Performance optimization strategies
- Testing and validation procedures
Changed
Repository Structure
- Terminal UI extracted into separate standalone repository: https://github.com/turboline-ai/turbostream-tui
- Backend reorganized as pure Intelligence-as-a-Service platform
- Backend code consolidated in the root turbostream folder
- Clear client-server separation for independent development
Feed Model
- Extended with topicField for topic identification (see the sketch after this list)
- Added topics array for multi-topic configuration
- Added enableTopicRouting flag for feature control
- Backward compatible with single-topic feeds
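The extended model could look roughly like the struct below. Only topicField, topics, and enableTopicRouting come from this changelog; the remaining fields, tags, and values are assumptions for illustration.

```go
package main

import "fmt"

// Feed is an illustrative sketch of the extended feed model.
type Feed struct {
	ID                 string   `json:"id" bson:"_id"`
	Name               string   `json:"name" bson:"name"`
	URL                string   `json:"url" bson:"url"`
	TopicField         string   `json:"topicField" bson:"topicField"`                 // field or path used to identify the topic
	Topics             []string `json:"topics" bson:"topics"`                         // topics to route and analyze
	EnableTopicRouting bool     `json:"enableTopicRouting" bson:"enableTopicRouting"` // false keeps single-topic behavior
}

func main() {
	f := Feed{
		Name:               "crypto-ticker",
		TopicField:         "symbol",
		Topics:             []string{"BTC-USD", "ETH-USD"},
		EnableTopicRouting: true,
	}
	fmt.Printf("%+v\n", f)
}
```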
WebSocket Manager
- Refactored to support topic-based routing and broadcasting
- Integrated Topic LLM Scheduler lifecycle management
- Enhanced message handling for intelligence delivery
- Improved connection and reconnection logic
LLM Service
- Adapted context and memory stores for per-topic architecture
- Implemented compound key pattern for topic isolation
- Enhanced thread safety for concurrent topic access
- Optimized token usage through TSLN format
Fixed
- Thread safety issues in concurrent feed context access
- Memory leaks in topic router channel management
- State memory corruption in high-concurrency scenarios
- WebSocket reconnection race conditions
Deprecated
- Monolithic TUI-Backend coupling (superseded by client-server architecture)
- Single-topic-only architecture (superseded by multi-topic support)
- Direct TUI backend access (moving to API-mediated access)
Removed
- Terminal UI code from backend repository (moved to https://github.com/turboline-ai/turbostream-tui)
- Legacy single-topic-only code paths
- Deprecated internal TUI interfaces
Security
- Per-topic data isolation through compound keys
- Thread-safe concurrent access to shared resources
- API key authentication framework
- Customer namespace isolation foundation
Performance
- 95% infrastructure cost reduction through connection multiplexing
- 40-60% LLM token reduction through TSLN optimization
- Support for 100+ concurrent topics per feed
- 18,000 LLM queries/hour throughput (50 topics)
- 29 KB memory footprint per topic
Version 0.2.x - Previous Versions
Prior to v0.3.0, TurboStream operated as a monolithic application with Terminal UI and backend tightly coupled. Version history for 0.2.x releases focused on single-topic feed processing and direct TUI integration.
Migration Notes
Upgrading from 0.3.0 to 0.3.1
Backend Update:
- Pull latest changes: git pull origin main
- Update dependencies: go mod tidy
- Build and test: go build ./... && go test ./... -v
- Restart server to access new documentation at http://localhost:7210/docs
Breaking Changes:
None. All changes are backwards compatible. Existing feeds and API integrations continue to work without modification.
Documentation Access:
The Swagger UI endpoint (/swagger/index.html) redirects to the new Scalar endpoint (/docs). If using an ad-blocker, you may need to disable it for localhost:7210 to test API calls.
Upgrading from 0.2.x to 0.3.0
Backend Services:
- Back up existing feed configurations
- Update dependencies: go mod tidy
- Build and test: go build ./... && go test ./... -v
- Deploy the new binary
- Verify existing feeds still connect
Terminal UI:
- Clone standalone TUI repository: https://github.com/turboline-ai/turbostream-tui
- Configure backend API endpoint
- Build and run TUI client
- Verify monitoring capabilities
Single-Topic Feeds:
No changes required. Feeds continue to work with existing configuration.
Multi-Topic Migration:
Add topicField, topics, and enableTopicRouting fields to the feed configuration, as in the sketch below.
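A minimal sketch of that migration, assuming the feed configuration is plain JSON; the existing fields and the example values are illustrative only.

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Existing single-topic feed configuration (illustrative).
	raw := []byte(`{"name":"crypto-ticker","url":"wss://example.com/stream"}`)
	var feed map[string]interface{}
	_ = json.Unmarshal(raw, &feed)

	// Fields introduced in 0.3.0; the names come from the migration note above,
	// the values are examples only.
	feed["topicField"] = "symbol"
	feed["topics"] = []string{"BTC-USD", "ETH-USD"}
	feed["enableTopicRouting"] = true

	out, _ := json.MarshalIndent(feed, "", "  ")
	fmt.Println(string(out))
}
```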
Resources
- API Documentation: https://turboline.ai/docs/api
- TUI Repository: https://github.com/turboline-ai/turbostream-tui
- Issues: https://github.com/turboline-ai/turbostream/issues