- Added `StreamStartStep` and `StreamFinishStep` classes to manage the lifecycle of individual LLM API calls within a message.
- Updated `stream_chat_completion` to yield step events so consumers can visually separate multiple LLM calls within a single message (see the first sketch after this list).
- Refactored the handling of start and finish events to accommodate the new step lifecycle, improving state management during streaming.
- Adjusted the `stream_registry` to recognize and process the new step events (see the second sketch below).
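
A minimal sketch of how the step events and the producer side might fit together. The field names, the `call_llm_api` stub, and the exact shapes of `StreamStartStep`/`StreamFinishStep` are assumptions for illustration, not the actual implementation:

```python
from dataclasses import dataclass, field
from typing import AsyncIterator
import time
import uuid


@dataclass
class StreamStartStep:
    """Marks the start of one LLM API call within a streamed message (hypothetical fields)."""
    step_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    started_at: float = field(default_factory=time.time)


@dataclass
class StreamFinishStep:
    """Marks the end of the LLM API call identified by `step_id` (hypothetical fields)."""
    step_id: str
    finished_at: float = field(default_factory=time.time)


async def call_llm_api(messages: list[dict]) -> AsyncIterator[str]:
    # Stand-in for the real LLM client; streams a few fake token deltas.
    for token in ["Hello", ", ", "world"]:
        yield token


async def stream_chat_completion(messages: list[dict]) -> AsyncIterator[object]:
    """Yield step events around each LLM call, alongside the usual text deltas."""
    start = StreamStartStep()
    yield start                                     # a new LLM call is beginning
    async for delta in call_llm_api(messages):      # hypothetical LLM client call
        yield delta                                 # normal token/content deltas
    yield StreamFinishStep(step_id=start.step_id)   # closes this call's step
```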
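
On the consumer side, the registry then only needs to branch on the new event types. The `StreamRegistry` class below reuses the event classes from the sketch above; its attributes and method names are illustrative of one way the dispatch could look, not the project's actual API:

```python
class StreamRegistry:
    def __init__(self) -> None:
        # Maps step_id -> start timestamp for LLM calls that are still streaming.
        self.open_steps: dict[str, float] = {}

    def handle_event(self, event: object) -> None:
        if isinstance(event, StreamStartStep):
            # A new LLM call began: record it so the UI can draw a step boundary.
            self.open_steps[event.step_id] = event.started_at
        elif isinstance(event, StreamFinishStep):
            # The call finished: close out its step and optionally report duration.
            started_at = self.open_steps.pop(event.step_id, None)
            if started_at is not None:
                print(f"\n[step {event.step_id} took {event.finished_at - started_at:.2f}s]")
        else:
            # Text deltas and other events keep their existing handling.
            print(event, end="", flush=True)
```

Keeping start/finish as explicit events (rather than inferring boundaries from deltas) lets each LLM call in a multi-call message carry its own lifecycle state.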