AutoGPT/autogpt_platform/frontend/Dockerfile
Abhimanyu Yadav 7d4c020a9b feat(chat): implement AI SDK integration with custom streaming response handling (#11901)
### Changes 🏗️

- Added AI SDK integration for chat streaming with proper message
handling
- Implemented custom to_sse method in StreamToolOutputAvailable to
exclude non-spec fields (see the SSE framing sketch after this list)
- Modified stream_chat_completion to reuse message IDs for tool call
continuations
- Created new Copilot 2.0 UI with AI SDK React components
- Added streamdown and related packages for markdown rendering
- Built reusable conversation and message components for the chat
interface
- Added support for tool output display in the chat UI
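
For reference, the wire format produced by the custom `to_sse` method is plain
Server-Sent Events. The sketch below is not the PR's code (the real method lives
on the Python backend, and the field names here are assumptions); it only
illustrates the whitelist-then-frame step the `to_sse` bullet describes: keep the
fields the streaming spec defines, drop internal extras, and wrap the JSON payload
in a standard `data: ...` SSE frame.

```typescript
// Hypothetical illustration of SSE framing with a field whitelist.
// Field names (type, toolCallId, output) are assumptions, not the PR's schema.
interface ToolOutputEvent {
  type: "tool-output-available";
  toolCallId: string;
  output: unknown;
  // Internal bookkeeping fields a custom to_sse would strip before sending:
  [internal: string]: unknown;
}

function toSSE(event: ToolOutputEvent): string {
  // Keep only the fields the client-side stream parser expects.
  const payload = {
    type: event.type,
    toolCallId: event.toolCallId,
    output: event.output,
  };
  // An SSE frame is a "data:" line terminated by a blank line.
  return `data: ${JSON.stringify(payload)}\n\n`;
}

// Example frame:
// data: {"type":"tool-output-available","toolCallId":"call_1","output":{"ok":true}}
```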

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Start a new chat session and verify streaming works correctly
  - [x] Test tool calls and verify they display properly in the UI
  - [x] Verify message continuations don't create duplicate messages
  - [x] Test markdown rendering with code blocks and other formatting
  - [x] Verify the UI is responsive and scrolls correctly

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

---------

Co-authored-by: Lluis Agusti <hi@llu.lu>
Co-authored-by: Ubbe <hi@ubbe.dev>

# Base stage for both dev and prod
FROM node:21-alpine AS base
WORKDIR /app
RUN corepack enable
COPY autogpt_platform/frontend/package.json autogpt_platform/frontend/pnpm-lock.yaml ./
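# BuildKit cache mount keeps the pnpm store warm between builds so reinstalls avoid re-downloading packages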
RUN --mount=type=cache,target=/root/.local/share/pnpm pnpm install --frozen-lockfile

# Build stage for prod
FROM base AS build
COPY autogpt_platform/frontend/ .
# Allow CI to opt-in to Playwright test build-time flags
ARG NEXT_PUBLIC_PW_TEST="false"
ENV NEXT_PUBLIC_PW_TEST=$NEXT_PUBLIC_PW_TEST
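# Example build from the repo root (assumed context, implied by the COPY paths above):
#   docker build -f autogpt_platform/frontend/Dockerfile --build-arg NEXT_PUBLIC_PW_TEST=true .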
ENV NODE_ENV="production"
# Merge env files appropriately based on environment
RUN if [ -f .env.production ]; then \
# In CI/CD: merge defaults with production (production takes precedence)
cat .env.default .env.production > .env.merged && mv .env.merged .env.production; \
elif [ -f .env ]; then \
# Local with custom .env: merge defaults with .env
cat .env.default .env > .env.merged && mv .env.merged .env; \
else \
# Local without custom .env: use defaults
cp .env.default .env; \
fi
RUN pnpm run generate:api
# Disable source-map generation in Docker builds to halve webpack memory usage.
# Source maps are only useful when SENTRY_AUTH_TOKEN is set (Vercel deploys);
# the Docker image never uploads them, so generating them just wastes RAM.
ENV NEXT_PUBLIC_SOURCEMAPS="false"
# In CI, we want NEXT_PUBLIC_PW_TEST=true during build so Next.js inlines it
RUN if [ "$NEXT_PUBLIC_PW_TEST" = "true" ]; then NEXT_PUBLIC_PW_TEST=true NODE_OPTIONS="--max-old-space-size=8192" pnpm build; else NODE_OPTIONS="--max-old-space-size=8192" pnpm build; fi

# Prod stage, based on the Next.js reference Dockerfile:
# https://github.com/vercel/next.js/blob/64271354533ed16da51be5dce85f0dbd15f17517/examples/with-docker/Dockerfile
FROM node:21-alpine AS prod
ENV NODE_ENV=production
ENV HOSTNAME=0.0.0.0
WORKDIR /app
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
RUN mkdir .next
RUN chown nextjs:nodejs .next
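# The standalone server copied below is produced by Next.js' `output: "standalone"` mode (assumed to be set in next.config)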
COPY --from=build --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=build --chown=nextjs:nodejs /app/.next/static ./.next/static
COPY --from=build /app/public ./public
USER nextjs
EXPOSE 3000
CMD ["node", "server.js"]