Devin AI 9c5b68f1c5 feat: Add OpenAI Responses API integration
Implements native support for OpenAI's Responses API (/v1/responses) as a new
LLM provider in CrewAI. This addresses feature request #4152.

Key features:
- New OpenAIResponsesCompletion class extending BaseLLM
- Support for both explicit provider parameter and model prefix routing
- Message conversion from CrewAI format to Responses API format
- Tool/function calling support
- Streaming support (sync and async)
- Structured output via Pydantic models
- Token usage tracking
- Support for o-series reasoning models with reasoning_effort parameter
- Support for stateful conversations via previous_response_id
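The message-conversion step listed above can be sketched as follows. The Responses API takes an `input` list (rather than chat-style `messages`) and carries the system prompt in a separate `instructions` field; the function name and exact mapping below are illustrative assumptions, not the PR's actual implementation:

```python
def convert_messages(messages):
    """Sketch: split CrewAI-style chat messages into the Responses API's
    `instructions` string and `input` item list. Hypothetical helper."""
    instructions = None
    input_items = []
    for msg in messages:
        if msg["role"] == "system":
            # The Responses API carries the system prompt separately
            # as `instructions` instead of a system message.
            instructions = msg["content"]
        else:
            input_items.append({"role": msg["role"], "content": msg["content"]})
    return instructions, input_items
```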

Usage:
  # Option 1: Using provider parameter
  llm = LLM(model='gpt-4o', provider='openai_responses')

  # Option 2: Using model prefix
  llm = LLM(model='openai_responses/gpt-4o')
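The two routing options above suggest logic along these lines: an explicit `provider` wins, otherwise a recognized prefix is stripped from the model string. This is a minimal sketch with an assumed default, not the PR's real routing code:

```python
def resolve_provider(model, provider=None):
    """Sketch of provider routing: explicit provider takes precedence,
    then a 'provider/model' prefix, then a default. Illustrative only."""
    if provider:
        return provider, model
    if "/" in model:
        prefix, _, rest = model.partition("/")
        return prefix, rest
    return "openai", model  # assumed default provider
```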

Includes comprehensive test coverage for:
- Provider routing
- Message conversion
- Tool conversion
- API calls
- Parameter preparation
- Context window sizes
- Feature support methods
- Token usage extraction
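The token-usage extraction covered by the tests might look like the sketch below. The Responses API reports usage as `input_tokens`/`output_tokens` (unlike the chat completions API's `prompt_tokens`/`completion_tokens`); the normalized output shape here is an assumption about CrewAI's tracker, not taken from the PR:

```python
def extract_usage(response):
    """Sketch: normalize a Responses API usage payload into the
    chat-completions-style keys CrewAI's tracking might expect."""
    usage = response.get("usage", {})
    return {
        "prompt_tokens": usage.get("input_tokens", 0),
        "completion_tokens": usage.get("output_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }
```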

Co-Authored-By: João <joao@crewai.com>
2025-12-27 06:19:28 +00:00