Compare commits

13 Commits

Author SHA1 Message Date
Zamil Majdy
dfd7c64068 feat(backend): Implement node-specific auto-approval using key pattern
- Add auto-approval via special nodeExecId key pattern (auto_approve_{graph_exec_id}_{node_id})
- Create auto-approval records in PendingHumanReview when user approves with auto-approve flag
- Check for existing auto-approval before requiring human review
- Remove node_id parameter from get_or_create_human_review
- Load graph settings properly when resuming execution after review
2026-01-21 22:21:00 -05:00
Zamil Majdy
02089bc047 fix(frontend): Add polling for pending reviews badge to update in real-time
- Add refetchInterval to execution details query to poll while running/review
- Add polling support to usePendingReviewsForExecution hook
- Poll pending reviews every 2 seconds when execution is in REVIEW status
- This ensures the "X Reviews Pending" badge updates without page refresh
2026-01-21 21:08:10 -05:00
Zamil Majdy
bed7b356bb fix(frontend): Reset card data when auto-approve toggle changes
Include autoApproveFuture in the key prop to force PendingReviewCard
to remount when the toggle changes, which resets its internal state
to the original payload data.
2026-01-21 21:04:56 -05:00
Zamil Majdy
4efc0ff502 fix(migration): Correct migration to only drop FK constraint, not non-existent column
The nodeId column was never added to PendingHumanReview. The migration
should only drop the foreign key constraint linking nodeExecId to
AgentNodeExecution, not try to drop a column that doesn't exist.
2026-01-21 20:13:41 -05:00
Zamil Majdy
4ad0528257 feat(hitl): Simplify auto-approval with toggle UX and remove node_id storage
- Remove nodeId column from PendingHumanReview schema (use in-memory tracking)
- Remove foreign key relation from PendingHumanReview to AgentNodeExecution
- Use ExecutionContext.auto_approved_node_ids for auto-approval tracking
- Add auto-approve toggle in frontend (default off)
- When toggle enabled: disable editing and use original data
- Backend looks up agentNodeId from AgentNodeExecution when auto-approving
- Update tests to reflect schema changes
2026-01-21 19:57:11 -05:00
Zamil Majdy
2f440ee80a Merge branch 'dev' into feat/sensitive-action-features 2026-01-21 19:08:32 -05:00
Zamil Majdy
2a55923ec0 Merge dev to get GraphSettings fix 2026-01-21 09:31:17 -05:00
Zamil Majdy
ad50f57a2b chore: add migration for nodeId field in PendingHumanReview
Adds database migration to add the nodeId column which tracks
the node ID in the graph definition for auto-approval tracking.
2026-01-20 23:03:03 -05:00
Zamil Majdy
aebd961ef5 fix: implement node-specific auto-approval for human reviews
Instead of disabling all safe modes when approving all future actions,
now tracks specific node IDs that should be auto-approved. This means
clicking "Approve all future actions" will only auto-approve future
reviews from the same blocks, not all reviews.

Changes:
- Add nodeId field to PendingHumanReview schema
- Add auto_approved_node_ids set to ExecutionContext
- Update review helper to check auto_approved_node_ids
- Change API from disable_future_reviews to auto_approve_node_ids
- Update frontend to pass node_ids when bulk approving
- Address PR feedback: remove barrel file, JSDoc comments, and cleanup
2026-01-20 22:15:51 -05:00
Zamil Majdy
bcccaa16cc fix: remove unused props from AIAgentSafetyPopup component
Removes hasSensitiveAction and hasHumanInTheLoop props that were only
used by the hook, not the component itself, fixing ESLint unused vars error.
2026-01-20 21:05:39 -05:00
Zamil Majdy
d5ddc41b18 feat: add bulk approval option for human reviews
Add "Approve all future actions" button to the review UI that:
- Approves all current pending reviews
- Disables safe mode for the remainder of the execution run
- Shows helper text about turning auto-approval on/off in settings

Backend changes:
- Add disable_future_reviews flag to ReviewRequest model
- Pass ExecutionContext with disabled safe modes when resuming

Frontend changes:
- Add "Approve all future actions" button to PendingReviewsList
- Include helper text per PRD requirements

Implements SECRT-1795
2026-01-20 20:45:50 -05:00
Zamil Majdy
95eab5b7eb feat: add one-time safety popup for AI-generated agent runs
Show a one-time safety popup the first time a user runs an agent with
sensitive actions or human-in-the-loop blocks. The popup explains that
agents may take real-world actions and that safety checks are enabled.

- Add AI_AGENT_SAFETY_POPUP_SHOWN localStorage key
- Create AIAgentSafetyPopup component with hook
- Integrate popup into RunAgentModal before first run

Implements SECRT-1798
2026-01-20 20:40:18 -05:00
Zamil Majdy
832d6e1696 fix: correct safe mode checks for sensitive action blocks
- Add skip_safe_mode_check parameter to HITLReviewHelper to avoid
  checking the wrong safe mode flag for sensitive action blocks
- Simplify SafeModeToggle and FloatingSafeModeToggle by removing
  unnecessary intermediate variables and isHITLStateUndetermined checks
2026-01-20 20:33:55 -05:00
160 changed files with 2627 additions and 6746 deletions
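Taken together, the backend commits above implement auto-approval through a special record stored in the PendingHumanReview table. A minimal sketch of that mechanism follows; the in-memory dict is a stand-in for the Prisma-backed table used by the real `backend.data.human_review` module shown in the diffs below, and the helper names simply mirror the ones added there.

```python
# Minimal sketch of the auto-approval mechanism described above.
# The dict stands in for the PendingHumanReview table; the real code
# (backend.data.human_review, shown further down) uses Prisma.

def get_auto_approve_key(graph_exec_id: str, node_id: str) -> str:
    # Special nodeExecId used to record "approve all future actions"
    # for one node within a single execution run.
    return f"auto_approve_{graph_exec_id}_{node_id}"

_records: dict[str, dict] = {}

def create_auto_approval_record(graph_exec_id: str, node_id: str, payload: dict) -> None:
    # Written when the user approves with the auto-approve toggle enabled.
    _records[get_auto_approve_key(graph_exec_id, node_id)] = {
        "status": "APPROVED",
        "payload": payload,
    }

def is_auto_approved(graph_exec_id: str, node_id: str) -> bool:
    # Checked before creating a new human review for the same node.
    record = _records.get(get_auto_approve_key(graph_exec_id, node_id))
    return record is not None and record["status"] == "APPROVED"

create_auto_approval_record("exec-1", "node-a", {"data": "original"})
assert is_auto_approved("exec-1", "node-a")
assert not is_auto_approved("exec-1", "node-b")
```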

View File

@@ -16,32 +16,6 @@ See `docs/content/platform/getting-started.md` for setup instructions.
- Format Python code with `poetry run format`.
- Format frontend code using `pnpm format`.
## Frontend guidelines:
See `/frontend/CONTRIBUTING.md` for complete patterns. Quick reference:
1. **Pages**: Create in `src/app/(platform)/feature-name/page.tsx`
- Add `usePageName.ts` hook for logic
- Put sub-components in local `components/` folder
2. **Components**: Structure as `ComponentName/ComponentName.tsx` + `useComponentName.ts` + `helpers.ts`
- Use design system components from `src/components/` (atoms, molecules, organisms)
- Never use `src/components/__legacy__/*`
3. **Data fetching**: Use generated API hooks from `@/app/api/__generated__/endpoints/`
- Regenerate with `pnpm generate:api`
- Pattern: `use{Method}{Version}{OperationName}`
4. **Styling**: Tailwind CSS only, use design tokens, Phosphor Icons only
5. **Testing**: Add Storybook stories for new components, Playwright for E2E
6. **Code conventions**: Function declarations (not arrow functions) for components/handlers
- Component props should be `interface Props { ... }` (not exported) unless the interface needs to be used outside the component
- Separate render logic from business logic (component.tsx + useComponent.ts + helpers.ts)
- Colocate state when possible and avoid creating large components, use sub-components ( local `/components` folder next to the parent component ) when sensible
- Avoid large hooks, abstract logic into `helpers.ts` files when sensible
- Use function declarations for components, arrow functions only for callbacks
- No barrel files or `index.ts` re-exports
- Do not use `useCallback` or `useMemo` unless strictly needed
- Avoid comments at all times unless the code is very complex
## Testing
- Backend: `poetry run test` (runs pytest with a docker based postgres + prisma).

View File

@@ -201,7 +201,7 @@ If you get any pushback or hit complex block conditions check the new_blocks gui
3. Write tests alongside the route file
4. Run `poetry run test` to verify
### Frontend guidelines:
**Frontend feature development:**
See `/frontend/CONTRIBUTING.md` for complete patterns. Quick reference:
@@ -217,14 +217,6 @@ See `/frontend/CONTRIBUTING.md` for complete patterns. Quick reference:
4. **Styling**: Tailwind CSS only, use design tokens, Phosphor Icons only
5. **Testing**: Add Storybook stories for new components, Playwright for E2E
6. **Code conventions**: Function declarations (not arrow functions) for components/handlers
- Component props should be `interface Props { ... }` (not exported) unless the interface needs to be used outside the component
- Separate render logic from business logic (component.tsx + useComponent.ts + helpers.ts)
- Colocate state when possible and avoid creating large components, use sub-components ( local `/components` folder next to the parent component ) when sensible
- Avoid large hooks, abstract logic into `helpers.ts` files when sensible
- Use function declarations for components, arrow functions only for callbacks
- No barrel files or `index.ts` re-exports
- Do not use `useCallback` or `useMemo` unless strictly needed
- Avoid comments at all times unless the code is very complex
### Security Implementation

View File

@@ -290,11 +290,6 @@ async def _cache_session(session: ChatSession) -> None:
await async_redis.setex(redis_key, config.session_ttl, session.model_dump_json())
async def cache_chat_session(session: ChatSession) -> None:
"""Cache a chat session without persisting to the database."""
await _cache_session(session)
async def _get_session_from_db(session_id: str) -> ChatSession | None:
"""Get a chat session from the database."""
prisma_session = await chat_db.get_chat_session(session_id)

View File

@@ -172,12 +172,12 @@ async def get_session(
user_id: The optional authenticated user ID, or None for anonymous access.
Returns:
SessionDetailResponse: Details for the requested session, or None if not found.
SessionDetailResponse: Details for the requested session; raises NotFoundError if not found.
"""
session = await get_chat_session(session_id, user_id)
if not session:
raise NotFoundError(f"Session {session_id} not found.")
raise NotFoundError(f"Session {session_id} not found")
messages = [message.model_dump() for message in session.messages]
logger.info(
@@ -222,8 +222,6 @@ async def stream_chat_post(
session = await _validate_and_get_session(session_id, user_id)
async def event_generator() -> AsyncGenerator[str, None]:
chunk_count = 0
first_chunk_type: str | None = None
async for chunk in chat_service.stream_chat_completion(
session_id,
request.message,
@@ -232,26 +230,7 @@ async def stream_chat_post(
session=session, # Pass pre-fetched session to avoid double-fetch
context=request.context,
):
if chunk_count < 3:
logger.info(
"Chat stream chunk",
extra={
"session_id": session_id,
"chunk_type": str(chunk.type),
},
)
if not first_chunk_type:
first_chunk_type = str(chunk.type)
chunk_count += 1
yield chunk.to_sse()
logger.info(
"Chat stream completed",
extra={
"session_id": session_id,
"chunk_count": chunk_count,
"first_chunk_type": first_chunk_type,
},
)
# AI SDK protocol termination
yield "data: [DONE]\n\n"
@@ -296,8 +275,6 @@ async def stream_chat_get(
session = await _validate_and_get_session(session_id, user_id)
async def event_generator() -> AsyncGenerator[str, None]:
chunk_count = 0
first_chunk_type: str | None = None
async for chunk in chat_service.stream_chat_completion(
session_id,
message,
@@ -305,26 +282,7 @@ async def stream_chat_get(
user_id=user_id,
session=session, # Pass pre-fetched session to avoid double-fetch
):
if chunk_count < 3:
logger.info(
"Chat stream chunk",
extra={
"session_id": session_id,
"chunk_type": str(chunk.type),
},
)
if not first_chunk_type:
first_chunk_type = str(chunk.type)
chunk_count += 1
yield chunk.to_sse()
logger.info(
"Chat stream completed",
extra={
"session_id": session_id,
"chunk_count": chunk_count,
"first_chunk_type": first_chunk_type,
},
)
# AI SDK protocol termination
yield "data: [DONE]\n\n"

View File

@@ -1,20 +1,12 @@
import asyncio
import logging
import time
from asyncio import CancelledError
from collections.abc import AsyncGenerator
from typing import Any
import orjson
from langfuse import get_client, propagate_attributes
from langfuse.openai import openai # type: ignore
from openai import (
from openai import APIConnectionError, APIError, APIStatusError, RateLimitError
APIConnectionError,
APIError,
APIStatusError,
PermissionDeniedError,
RateLimitError,
)
from openai.types.chat import ChatCompletionChunk, ChatCompletionToolParam
from backend.data.understanding import (
@@ -29,7 +21,6 @@ from .model import (
ChatMessage,
ChatSession,
Usage,
cache_chat_session,
get_chat_session,
update_session_title,
upsert_chat_session,
@@ -305,10 +296,6 @@ async def stream_chat_completion(
content="", content="",
) )
accumulated_tool_calls: list[dict[str, Any]] = [] accumulated_tool_calls: list[dict[str, Any]] = []
has_saved_assistant_message = False
has_appended_streaming_message = False
last_cache_time = 0.0
last_cache_content_len = 0
# Wrap main logic in try/finally to ensure Langfuse observations are always ended
has_yielded_end = False
@@ -345,23 +332,6 @@ async def stream_chat_completion(
assert assistant_response.content is not None
assistant_response.content += delta
has_received_text = True
if not has_appended_streaming_message:
session.messages.append(assistant_response)
has_appended_streaming_message = True
current_time = time.monotonic()
content_len = len(assistant_response.content)
if (
current_time - last_cache_time >= 1.0
and content_len > last_cache_content_len
):
try:
await cache_chat_session(session)
except Exception as e:
logger.warning(
f"Failed to cache partial session {session.session_id}: {e}"
)
last_cache_time = current_time
last_cache_content_len = content_len
yield chunk
elif isinstance(chunk, StreamTextEnd):
# Emit text-end after text completes
@@ -420,42 +390,10 @@ async def stream_chat_completion(
if has_received_text and not text_streaming_ended:
yield StreamTextEnd(id=text_block_id)
text_streaming_ended = True
# Save assistant message before yielding finish to ensure it's persisted
# even if client disconnects immediately after receiving StreamFinish
if not has_saved_assistant_message:
messages_to_save_early: list[ChatMessage] = []
if accumulated_tool_calls:
assistant_response.tool_calls = (
accumulated_tool_calls
)
if not has_appended_streaming_message and (
assistant_response.content
or assistant_response.tool_calls
):
messages_to_save_early.append(assistant_response)
messages_to_save_early.extend(tool_response_messages)
if messages_to_save_early:
session.messages.extend(messages_to_save_early)
logger.info(
f"Saving assistant message before StreamFinish: "
f"content_len={len(assistant_response.content or '')}, "
f"tool_calls={len(assistant_response.tool_calls or [])}, "
f"tool_responses={len(tool_response_messages)}"
)
if (
messages_to_save_early
or has_appended_streaming_message
):
await upsert_chat_session(session)
has_saved_assistant_message = True
has_yielded_end = True
yield chunk
elif isinstance(chunk, StreamError):
has_yielded_error = True
yield chunk
elif isinstance(chunk, StreamUsage):
session.usage.append(
Usage(
@@ -475,27 +413,6 @@ async def stream_chat_completion(
langfuse.update_current_trace(output=str(tool_response_messages))
langfuse.update_current_span(output=str(tool_response_messages))
except CancelledError:
if not has_saved_assistant_message:
if accumulated_tool_calls:
assistant_response.tool_calls = accumulated_tool_calls
if assistant_response.content:
assistant_response.content = (
f"{assistant_response.content}\n\n[interrupted]"
)
else:
assistant_response.content = "[interrupted]"
if not has_appended_streaming_message:
session.messages.append(assistant_response)
if tool_response_messages:
session.messages.extend(tool_response_messages)
try:
await upsert_chat_session(session)
except Exception as e:
logger.warning(
f"Failed to save interrupted session {session.session_id}: {e}"
)
raise
except Exception as e:
logger.error(f"Error during stream: {e!s}", exc_info=True)
@@ -517,19 +434,14 @@ async def stream_chat_completion(
# Add assistant message if it has content or tool calls
if accumulated_tool_calls:
assistant_response.tool_calls = accumulated_tool_calls
if not has_appended_streaming_message and (
assistant_response.content or assistant_response.tool_calls
):
if assistant_response.content or assistant_response.tool_calls:
messages_to_save.append(assistant_response)
# Add tool response messages after assistant message
messages_to_save.extend(tool_response_messages)
if not has_saved_assistant_message:
if messages_to_save:
session.messages.extend(messages_to_save)
if messages_to_save or has_appended_streaming_message:
await upsert_chat_session(session)
session.messages.extend(messages_to_save)
await upsert_chat_session(session)
if not has_yielded_error:
error_message = str(e)
@@ -560,49 +472,38 @@ async def stream_chat_completion(
return # Exit after retry to avoid double-saving in finally block
# Normal completion path - save session and handle tool call continuation
# Only save if we haven't already saved when StreamFinish was received
if not has_saved_assistant_message:
logger.info(
f"Normal completion path: session={session.session_id}, "
f"current message_count={len(session.messages)}"
)
# Build the messages list in the correct order
messages_to_save: list[ChatMessage] = []
# Add assistant message with tool_calls if any
if accumulated_tool_calls:
assistant_response.tool_calls = accumulated_tool_calls
logger.info(
f"Added {len(accumulated_tool_calls)} tool calls to assistant message"
)
if not has_appended_streaming_message and (
assistant_response.content or assistant_response.tool_calls
):
messages_to_save.append(assistant_response)
logger.info(
f"Saving assistant message with content_len={len(assistant_response.content or '')}, tool_calls={len(assistant_response.tool_calls or [])}"
)
# Add tool response messages after assistant message
messages_to_save.extend(tool_response_messages)
logger.info(
f"Saving {len(tool_response_messages)} tool response messages, "
f"total_to_save={len(messages_to_save)}"
)
if messages_to_save:
session.messages.extend(messages_to_save)
logger.info(
f"Extended session messages, new message_count={len(session.messages)}"
)
if messages_to_save or has_appended_streaming_message:
await upsert_chat_session(session)
else:
logger.info(
"Assistant message already saved when StreamFinish was received, "
"skipping duplicate save"
)
logger.info(
f"Normal completion path: session={session.session_id}, "
f"current message_count={len(session.messages)}"
)
# Build the messages list in the correct order
messages_to_save: list[ChatMessage] = []
# Add assistant message with tool_calls if any
if accumulated_tool_calls:
assistant_response.tool_calls = accumulated_tool_calls
logger.info(
f"Added {len(accumulated_tool_calls)} tool calls to assistant message"
)
if assistant_response.content or assistant_response.tool_calls:
messages_to_save.append(assistant_response)
logger.info(
f"Saving assistant message with content_len={len(assistant_response.content or '')}, tool_calls={len(assistant_response.tool_calls or [])}"
)
# Add tool response messages after assistant message
messages_to_save.extend(tool_response_messages)
logger.info(
f"Saving {len(tool_response_messages)} tool response messages, "
f"total_to_save={len(messages_to_save)}"
)
session.messages.extend(messages_to_save)
logger.info(
f"Extended session messages, new message_count={len(session.messages)}"
)
await upsert_chat_session(session)
# If we did a tool call, stream the chat completion again to get the next response
if has_done_tool_call:
@@ -644,12 +545,6 @@ def _is_retryable_error(error: Exception) -> bool:
return False
def _is_region_blocked_error(error: Exception) -> bool:
if isinstance(error, PermissionDeniedError):
return "not available in your region" in str(error).lower()
return "not available in your region" in str(error).lower()
async def _stream_chat_chunks(
session: ChatSession,
tools: list[ChatCompletionToolParam],
@@ -842,18 +737,7 @@ async def _stream_chat_chunks(
f"Error in stream (not retrying): {e!s}", f"Error in stream (not retrying): {e!s}",
exc_info=True, exc_info=True,
) )
error_code = None error_response = StreamError(errorText=str(e))
error_text = str(e)
if _is_region_blocked_error(e):
error_code = "MODEL_NOT_AVAILABLE_REGION"
error_text = (
"This model is not available in your region. "
"Please connect via VPN and try again."
)
error_response = StreamError(
errorText=error_text,
code=error_code,
)
yield error_response
yield StreamFinish()
return

View File

@@ -179,6 +179,14 @@ class ReviewRequest(BaseModel):
reviews: List[ReviewItem] = Field(
description="All reviews with their approval status, data, and messages"
)
auto_approve_future_actions: bool = Field(
default=False,
description=(
"If true, future reviews from the same blocks (nodes) being approved "
"will be automatically approved for the remainder of this execution. "
"This only affects the current execution run."
),
)
@model_validator(mode="after")
def validate_review_completeness(self):

View File

@@ -490,3 +490,321 @@ def test_process_review_action_invalid_node_exec_id(
# Should be a 400 Bad Request, not 500 Internal Server Error
assert response.status_code == 400
assert "Invalid node execution ID format" in response.json()["detail"]
def test_process_review_action_auto_approve_creates_auto_approval_records(
mocker: pytest_mock.MockerFixture,
sample_pending_review: PendingHumanReviewModel,
test_user_id: str,
) -> None:
"""Test that auto_approve_future_actions flag creates auto-approval records"""
from backend.data.execution import ExecutionContext, NodeExecutionResult
from backend.data.graph import GraphSettings
# Mock process_all_reviews
mock_process_all_reviews = mocker.patch(
"backend.api.features.executions.review.routes.process_all_reviews_for_execution"
)
approved_review = PendingHumanReviewModel(
node_exec_id="test_node_123",
user_id=test_user_id,
graph_exec_id="test_graph_exec_456",
graph_id="test_graph_789",
graph_version=1,
payload={"data": "test payload"},
instructions="Please review",
editable=True,
status=ReviewStatus.APPROVED,
review_message="Approved",
was_edited=False,
processed=False,
created_at=FIXED_NOW,
updated_at=FIXED_NOW,
reviewed_at=FIXED_NOW,
)
mock_process_all_reviews.return_value = {"test_node_123": approved_review}
# Mock get_node_execution to return node_id
mock_get_node_execution = mocker.patch(
"backend.api.features.executions.review.routes.get_node_execution"
)
mock_node_exec = mocker.Mock(spec=NodeExecutionResult)
mock_node_exec.node_id = "test_node_def_456"
mock_get_node_execution.return_value = mock_node_exec
# Mock create_auto_approval_record
mock_create_auto_approval = mocker.patch(
"backend.api.features.executions.review.routes.create_auto_approval_record"
)
# Mock has_pending_reviews_for_graph_exec
mock_has_pending = mocker.patch(
"backend.api.features.executions.review.routes.has_pending_reviews_for_graph_exec"
)
mock_has_pending.return_value = False
# Mock get_graph_settings to return custom settings
mock_get_settings = mocker.patch(
"backend.api.features.executions.review.routes.get_graph_settings"
)
mock_get_settings.return_value = GraphSettings(
human_in_the_loop_safe_mode=True,
sensitive_action_safe_mode=True,
)
# Mock add_graph_execution
mock_add_execution = mocker.patch(
"backend.api.features.executions.review.routes.add_graph_execution"
)
request_data = {
"reviews": [
{
"node_exec_id": "test_node_123",
"approved": True,
"message": "Approved",
}
],
"auto_approve_future_actions": True,
}
response = client.post("/api/review/action", json=request_data)
assert response.status_code == 200
# Verify process_all_reviews_for_execution was called (without auto_approve param)
mock_process_all_reviews.assert_called_once()
# Verify create_auto_approval_record was called for the approved review
mock_create_auto_approval.assert_called_once_with(
user_id=test_user_id,
graph_exec_id="test_graph_exec_456",
graph_id="test_graph_789",
graph_version=1,
node_id="test_node_def_456",
payload={"data": "test payload"},
)
# Verify get_graph_settings was called with correct parameters
mock_get_settings.assert_called_once_with(
user_id=test_user_id, graph_id="test_graph_789"
)
# Verify add_graph_execution was called with proper ExecutionContext
mock_add_execution.assert_called_once()
call_kwargs = mock_add_execution.call_args.kwargs
execution_context = call_kwargs["execution_context"]
assert isinstance(execution_context, ExecutionContext)
assert execution_context.human_in_the_loop_safe_mode is True
assert execution_context.sensitive_action_safe_mode is True
def test_process_review_action_without_auto_approve_still_loads_settings(
mocker: pytest_mock.MockerFixture,
sample_pending_review: PendingHumanReviewModel,
test_user_id: str,
) -> None:
"""Test that execution context is created with settings even without auto-approve"""
from backend.data.execution import ExecutionContext
from backend.data.graph import GraphSettings
# Mock process_all_reviews
mock_process_all_reviews = mocker.patch(
"backend.api.features.executions.review.routes.process_all_reviews_for_execution"
)
approved_review = PendingHumanReviewModel(
node_exec_id="test_node_123",
user_id=test_user_id,
graph_exec_id="test_graph_exec_456",
graph_id="test_graph_789",
graph_version=1,
payload={"data": "test payload"},
instructions="Please review",
editable=True,
status=ReviewStatus.APPROVED,
review_message="Approved",
was_edited=False,
processed=False,
created_at=FIXED_NOW,
updated_at=FIXED_NOW,
reviewed_at=FIXED_NOW,
)
mock_process_all_reviews.return_value = {"test_node_123": approved_review}
# Mock create_auto_approval_record - should NOT be called when auto_approve is False
mock_create_auto_approval = mocker.patch(
"backend.api.features.executions.review.routes.create_auto_approval_record"
)
# Mock has_pending_reviews_for_graph_exec
mock_has_pending = mocker.patch(
"backend.api.features.executions.review.routes.has_pending_reviews_for_graph_exec"
)
mock_has_pending.return_value = False
# Mock get_graph_settings with sensitive_action_safe_mode enabled
mock_get_settings = mocker.patch(
"backend.api.features.executions.review.routes.get_graph_settings"
)
mock_get_settings.return_value = GraphSettings(
human_in_the_loop_safe_mode=False,
sensitive_action_safe_mode=True,
)
# Mock add_graph_execution
mock_add_execution = mocker.patch(
"backend.api.features.executions.review.routes.add_graph_execution"
)
# Request WITHOUT auto_approve_future_actions
request_data = {
"reviews": [
{
"node_exec_id": "test_node_123",
"approved": True,
"message": "Approved",
}
],
"auto_approve_future_actions": False,
}
response = client.post("/api/review/action", json=request_data)
assert response.status_code == 200
# Verify process_all_reviews_for_execution was called
mock_process_all_reviews.assert_called_once()
# Verify create_auto_approval_record was NOT called (auto_approve_future_actions=False)
mock_create_auto_approval.assert_not_called()
# Verify settings were loaded
mock_get_settings.assert_called_once()
# Verify ExecutionContext has proper settings
mock_add_execution.assert_called_once()
call_kwargs = mock_add_execution.call_args.kwargs
execution_context = call_kwargs["execution_context"]
assert isinstance(execution_context, ExecutionContext)
assert execution_context.human_in_the_loop_safe_mode is False
assert execution_context.sensitive_action_safe_mode is True
def test_process_review_action_auto_approve_only_applies_to_approved_reviews(
mocker: pytest_mock.MockerFixture,
test_user_id: str,
) -> None:
"""Test that auto_approve record is created only for approved reviews"""
from backend.data.execution import ExecutionContext, NodeExecutionResult
from backend.data.graph import GraphSettings
# Create two reviews - one approved, one rejected
approved_review = PendingHumanReviewModel(
node_exec_id="node_exec_approved",
user_id=test_user_id,
graph_exec_id="test_graph_exec_456",
graph_id="test_graph_789",
graph_version=1,
payload={"data": "approved"},
instructions="Review",
editable=True,
status=ReviewStatus.APPROVED,
review_message=None,
was_edited=False,
processed=False,
created_at=FIXED_NOW,
updated_at=FIXED_NOW,
reviewed_at=FIXED_NOW,
)
rejected_review = PendingHumanReviewModel(
node_exec_id="node_exec_rejected",
user_id=test_user_id,
graph_exec_id="test_graph_exec_456",
graph_id="test_graph_789",
graph_version=1,
payload={"data": "rejected"},
instructions="Review",
editable=True,
status=ReviewStatus.REJECTED,
review_message="Rejected",
was_edited=False,
processed=False,
created_at=FIXED_NOW,
updated_at=FIXED_NOW,
reviewed_at=FIXED_NOW,
)
# Mock process_all_reviews
mock_process_all_reviews = mocker.patch(
"backend.api.features.executions.review.routes.process_all_reviews_for_execution"
)
mock_process_all_reviews.return_value = {
"node_exec_approved": approved_review,
"node_exec_rejected": rejected_review,
}
# Mock get_node_execution to return node_id (only called for approved review)
mock_get_node_execution = mocker.patch(
"backend.api.features.executions.review.routes.get_node_execution"
)
mock_node_exec = mocker.Mock(spec=NodeExecutionResult)
mock_node_exec.node_id = "test_node_def_approved"
mock_get_node_execution.return_value = mock_node_exec
# Mock create_auto_approval_record
mock_create_auto_approval = mocker.patch(
"backend.api.features.executions.review.routes.create_auto_approval_record"
)
# Mock has_pending_reviews_for_graph_exec
mock_has_pending = mocker.patch(
"backend.api.features.executions.review.routes.has_pending_reviews_for_graph_exec"
)
mock_has_pending.return_value = False
# Mock get_graph_settings
mock_get_settings = mocker.patch(
"backend.api.features.executions.review.routes.get_graph_settings"
)
mock_get_settings.return_value = GraphSettings()
# Mock add_graph_execution
mock_add_execution = mocker.patch(
"backend.api.features.executions.review.routes.add_graph_execution"
)
request_data = {
"reviews": [
{"node_exec_id": "node_exec_approved", "approved": True},
{"node_exec_id": "node_exec_rejected", "approved": False},
],
"auto_approve_future_actions": True,
}
response = client.post("/api/review/action", json=request_data)
assert response.status_code == 200
# Verify process_all_reviews_for_execution was called
mock_process_all_reviews.assert_called_once()
# Verify create_auto_approval_record was called ONLY for the approved review
# (not for the rejected one)
mock_create_auto_approval.assert_called_once_with(
user_id=test_user_id,
graph_exec_id="test_graph_exec_456",
graph_id="test_graph_789",
graph_version=1,
node_id="test_node_def_approved",
payload={"data": "approved"},
)
# Verify get_node_execution was called only for approved review
mock_get_node_execution.assert_called_once_with("node_exec_approved")
# Verify ExecutionContext was created (auto-approval is now DB-based)
call_kwargs = mock_add_execution.call_args.kwargs
execution_context = call_kwargs["execution_context"]
assert isinstance(execution_context, ExecutionContext)

View File

@@ -5,8 +5,14 @@ import autogpt_libs.auth as autogpt_auth_lib
from fastapi import APIRouter, HTTPException, Query, Security, status
from prisma.enums import ReviewStatus
from backend.data.execution import get_graph_execution_meta
from backend.data.execution import (
ExecutionContext,
get_graph_execution_meta,
get_node_execution,
)
from backend.data.graph import get_graph_settings
from backend.data.human_review import (
create_auto_approval_record,
get_pending_reviews_for_execution,
get_pending_reviews_for_user,
has_pending_reviews_for_graph_exec,
@@ -128,14 +134,20 @@ async def process_review_action(
)
# Build review decisions map
# When auto_approve_future_actions is true, ignore any edited data
# (auto-approved reviews should use original data for consistency)
review_decisions = {}
for review in request.reviews:
review_status = (
ReviewStatus.APPROVED if review.approved else ReviewStatus.REJECTED
)
# If auto-approving future actions, don't allow data modifications
reviewed_data = (
None if request.auto_approve_future_actions else review.reviewed_data
)
review_decisions[review.node_exec_id] = (
review_status,
review.reviewed_data,
reviewed_data,
review.message,
)
@@ -145,6 +157,22 @@ async def process_review_action(
review_decisions=review_decisions,
)
# Create auto-approval records for approved reviews if requested
if request.auto_approve_future_actions:
for node_exec_id, review in updated_reviews.items():
if review.status == ReviewStatus.APPROVED:
# Look up the node_id from the node execution
node_exec = await get_node_execution(node_exec_id)
if node_exec:
await create_auto_approval_record(
user_id=user_id,
graph_exec_id=review.graph_exec_id,
graph_id=review.graph_id,
graph_version=review.graph_version,
node_id=node_exec.node_id,
payload=review.payload,
)
# Count results
approved_count = sum(
1
@@ -169,10 +197,24 @@ async def process_review_action(
if not still_has_pending:
# Resume execution
try:
# Load graph settings to create proper execution context
settings = await get_graph_settings(
user_id=user_id, graph_id=first_review.graph_id
)
# Create execution context with settings
# Note: auto-approval is now handled via database lookup in
# check_auto_approval(), no need to pass auto_approved_node_ids
execution_context = ExecutionContext(
human_in_the_loop_safe_mode=settings.human_in_the_loop_safe_mode,
sensitive_action_safe_mode=settings.sensitive_action_safe_mode,
)
await add_graph_execution(
graph_id=first_review.graph_id,
user_id=user_id,
graph_exec_id=graph_exec_id,
execution_context=execution_context,
)
logger.info(f"Resumed execution {graph_exec_id}")
except Exception as e:

View File

@@ -116,6 +116,7 @@ class PrintToConsoleBlock(Block):
input_schema=PrintToConsoleBlock.Input,
output_schema=PrintToConsoleBlock.Output,
test_input={"text": "Hello, World!"},
is_sensitive_action=True,
test_output=[
("output", "Hello, World!"),
("status", "printed"),

View File

@@ -10,7 +10,7 @@ from prisma.enums import ReviewStatus
from pydantic import BaseModel
from backend.data.execution import ExecutionContext, ExecutionStatus
from backend.data.human_review import ReviewResult
from backend.data.human_review import ReviewResult, check_auto_approval
from backend.executor.manager import async_update_node_execution_status
from backend.util.clients import get_database_manager_async_client
@@ -55,6 +55,7 @@ class HITLReviewHelper:
async def _handle_review_request(
input_data: Any,
user_id: str,
node_id: str,
node_exec_id: str,
graph_exec_id: str,
graph_id: str,
@@ -69,6 +70,7 @@ class HITLReviewHelper:
Args:
input_data: The input data to be reviewed
user_id: ID of the user requesting the review
node_id: ID of the node in the graph definition
node_exec_id: ID of the node execution
graph_exec_id: ID of the graph execution
graph_id: ID of the graph
@@ -83,15 +85,27 @@ class HITLReviewHelper:
Raises:
Exception: If review creation or status update fails
"""
# Skip review if safe mode is disabled - return auto-approved result
if not execution_context.human_in_the_loop_safe_mode:
# Note: Safe mode checks (human_in_the_loop_safe_mode, sensitive_action_safe_mode)
# are handled by the caller:
# - HITL blocks check human_in_the_loop_safe_mode in their run() method
# - Sensitive action blocks check sensitive_action_safe_mode in is_block_exec_need_review()
# This function only handles auto-approval for specific nodes.
# Check if this node has been auto-approved in a previous review
auto_approval = await check_auto_approval(
graph_exec_id=graph_exec_id,
node_id=node_id,
)
if auto_approval:
logger.info(
f"Block {block_name} skipping review for node {node_exec_id} - safe mode disabled"
f"Block {block_name} skipping review for node {node_exec_id} - "
f"node {node_id} has auto-approval from previous review"
)
# Return a new ReviewResult with the current node_exec_id but approved status
return ReviewResult(
data=input_data,
status=ReviewStatus.APPROVED,
message="Auto-approved (safe mode disabled)",
message="Auto-approved (user approved all future actions for this block)",
processed=True,
node_exec_id=node_exec_id,
)
@@ -129,6 +143,7 @@ class HITLReviewHelper:
async def handle_review_decision(
input_data: Any,
user_id: str,
node_id: str,
node_exec_id: str,
graph_exec_id: str,
graph_id: str,
@@ -143,6 +158,7 @@ class HITLReviewHelper:
Args:
input_data: The input data to be reviewed
user_id: ID of the user requesting the review
node_id: ID of the node in the graph definition
node_exec_id: ID of the node execution
graph_exec_id: ID of the graph execution
graph_id: ID of the graph
@@ -158,6 +174,7 @@ class HITLReviewHelper:
review_result = await HITLReviewHelper._handle_review_request(
input_data=input_data,
user_id=user_id,
node_id=node_id,
node_exec_id=node_exec_id,
graph_exec_id=graph_exec_id,
graph_id=graph_id,

View File

@@ -97,6 +97,7 @@ class HumanInTheLoopBlock(Block):
input_data: Input,
*,
user_id: str,
node_id: str,
node_exec_id: str,
graph_exec_id: str,
graph_id: str,
@@ -115,6 +116,7 @@ class HumanInTheLoopBlock(Block):
decision = await self.handle_review_decision(
input_data=input_data.data,
user_id=user_id,
node_id=node_id,
node_exec_id=node_exec_id,
graph_exec_id=graph_exec_id,
graph_id=graph_id,

View File

@@ -441,6 +441,7 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
static_output: bool = False,
block_type: BlockType = BlockType.STANDARD,
webhook_config: Optional[BlockWebhookConfig | BlockManualWebhookConfig] = None,
is_sensitive_action: bool = False,
):
"""
Initialize the block with the given schema.
@@ -473,8 +474,8 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
self.static_output = static_output
self.block_type = block_type
self.webhook_config = webhook_config
self.is_sensitive_action = is_sensitive_action
self.execution_stats: NodeExecutionStats = NodeExecutionStats()
self.is_sensitive_action: bool = False
if self.webhook_config:
if isinstance(self.webhook_config, BlockWebhookConfig):
@@ -622,6 +623,7 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
input_data: BlockInput,
*,
user_id: str,
node_id: str,
node_exec_id: str,
graph_exec_id: str,
graph_id: str,
@@ -648,6 +650,7 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
decision = await HITLReviewHelper.handle_review_decision(
input_data=input_data,
user_id=user_id,
node_id=node_id,
node_exec_id=node_exec_id,
graph_exec_id=graph_exec_id,
graph_id=graph_id,
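With this change, `is_sensitive_action` becomes a real constructor argument (previously it was hard-coded to `False` after initialization). A stand-in sketch of the resulting behaviour, using a demo class rather than the real `Block`:

```python
# Stand-in illustration of the constructor change; DemoBlock is not the
# real Block class, it only mirrors the new `is_sensitive_action` argument.
class DemoBlock:
    def __init__(self, *, is_sensitive_action: bool = False):
        # Previously this attribute was always set to False after __init__;
        # now blocks such as PrintToConsoleBlock can pass True (see the
        # earlier diff in this comparison).
        self.is_sensitive_action = is_sensitive_action

assert DemoBlock().is_sensitive_action is False
assert DemoBlock(is_sensitive_action=True).is_sensitive_action is True
```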

View File

@@ -32,6 +32,87 @@ class ReviewResult(BaseModel):
node_exec_id: str
def get_auto_approve_key(graph_exec_id: str, node_id: str) -> str:
"""Generate the special nodeExecId key for auto-approval records."""
return f"auto_approve_{graph_exec_id}_{node_id}"
async def check_auto_approval(
graph_exec_id: str,
node_id: str,
) -> Optional[ReviewResult]:
"""
Check if there's an existing auto-approval record for this node in this execution.
Auto-approval records are stored as PendingHumanReview entries with a special
nodeExecId pattern: "auto_approve_{graph_exec_id}_{node_id}"
Args:
graph_exec_id: ID of the graph execution
node_id: ID of the node definition (not execution)
Returns:
ReviewResult if auto-approval found, None otherwise
"""
auto_approve_key = get_auto_approve_key(graph_exec_id, node_id)
# Look for the auto-approval record by its special key
auto_approved_review = await PendingHumanReview.prisma().find_unique(
where={"nodeExecId": auto_approve_key},
)
if auto_approved_review and auto_approved_review.status == ReviewStatus.APPROVED:
logger.info(
f"Found auto-approval for node {node_id} in execution {graph_exec_id}"
)
return ReviewResult(
data=auto_approved_review.payload,
status=ReviewStatus.APPROVED,
message="Auto-approved (user approved all future actions for this block)",
processed=True,
node_exec_id=auto_approve_key,
)
return None
async def create_auto_approval_record(
user_id: str,
graph_exec_id: str,
graph_id: str,
graph_version: int,
node_id: str,
payload: SafeJsonData,
) -> None:
"""
Create an auto-approval record for a node in this execution.
This is stored as a PendingHumanReview with a special nodeExecId pattern
and status=APPROVED, so future executions of the same node can skip review.
"""
auto_approve_key = get_auto_approve_key(graph_exec_id, node_id)
await PendingHumanReview.prisma().upsert(
where={"nodeExecId": auto_approve_key},
data={
"create": {
"nodeExecId": auto_approve_key,
"userId": user_id,
"graphExecId": graph_exec_id,
"graphId": graph_id,
"graphVersion": graph_version,
"payload": SafeJson(payload),
"instructions": "Auto-approval record",
"editable": False,
"status": ReviewStatus.APPROVED,
"processed": True,
"reviewedAt": datetime.now(timezone.utc),
},
"update": {}, # Already exists, no update needed
},
)
async def get_or_create_human_review(
user_id: str,
node_exec_id: str,
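One design detail worth noting: `create_auto_approval_record` uses an upsert whose update payload is empty, so approving the same node repeatedly within one run is a no-op rather than an error. A stand-in sketch of that idempotency, with a dict in place of the Prisma table:

```python
# Stand-in for the idempotent upsert used by create_auto_approval_record:
# create the record if it is missing, otherwise leave it untouched.
table: dict[str, dict] = {}

def upsert_auto_approval(key: str, create: dict) -> None:
    # Mirrors prisma upsert(where=..., data={"create": ..., "update": {}}).
    table.setdefault(key, create)

upsert_auto_approval("auto_approve_exec1_nodeA", {"status": "APPROVED", "payload": {"x": 1}})
upsert_auto_approval("auto_approve_exec1_nodeA", {"status": "APPROVED", "payload": {"x": 2}})
assert table["auto_approve_exec1_nodeA"]["payload"] == {"x": 1}  # first write wins
```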

View File

@@ -46,8 +46,8 @@ async def test_get_or_create_human_review_new(
sample_db_review.status = ReviewStatus.WAITING
sample_db_review.processed = False
mock_upsert = mocker.patch("backend.data.human_review.PendingHumanReview.prisma")
mock_upsert.return_value.upsert = AsyncMock(return_value=sample_db_review)
mock_prisma = mocker.patch("backend.data.human_review.PendingHumanReview.prisma")
mock_prisma.return_value.upsert = AsyncMock(return_value=sample_db_review)
result = await get_or_create_human_review(
user_id="test-user-123",
@@ -75,8 +75,8 @@ async def test_get_or_create_human_review_approved(
sample_db_review.processed = False
sample_db_review.reviewMessage = "Looks good"
mock_upsert = mocker.patch("backend.data.human_review.PendingHumanReview.prisma")
mock_upsert.return_value.upsert = AsyncMock(return_value=sample_db_review)
mock_prisma = mocker.patch("backend.data.human_review.PendingHumanReview.prisma")
mock_prisma.return_value.upsert = AsyncMock(return_value=sample_db_review)
result = await get_or_create_human_review(
user_id="test-user-123",

View File

@@ -0,0 +1,7 @@
-- Remove NodeExecution foreign key from PendingHumanReview
-- The nodeExecId column remains as the primary key, but we remove the FK constraint
-- to AgentNodeExecution since PendingHumanReview records can persist after node
-- execution records are deleted.
-- Drop foreign key constraint that linked PendingHumanReview.nodeExecId to AgentNodeExecution.id
ALTER TABLE "platform"."PendingHumanReview" DROP CONSTRAINT IF EXISTS "PendingHumanReview_nodeExecId_fkey";

View File

@@ -517,8 +517,6 @@ model AgentNodeExecution {
stats Json?
PendingHumanReview PendingHumanReview?
@@index([agentGraphExecutionId, agentNodeId, executionStatus])
@@index([agentNodeId, executionStatus])
@@index([addedTime, queuedTime])
@@ -567,6 +565,7 @@ enum ReviewStatus {
}
// Pending human reviews for Human-in-the-loop blocks
// Also stores auto-approval records with special nodeExecId patterns (e.g., "auto_approve_{graph_exec_id}_{node_id}")
model PendingHumanReview {
nodeExecId String @id
userId String
@@ -585,7 +584,6 @@ model PendingHumanReview {
reviewedAt DateTime?
User User @relation(fields: [userId], references: [id], onDelete: Cascade)
NodeExecution AgentNodeExecution @relation(fields: [nodeExecId], references: [id], onDelete: Cascade)
GraphExecution AgentGraphExecution @relation(fields: [graphExecId], references: [id], onDelete: Cascade)
@@unique([nodeExecId]) // One pending review per node execution

View File

@@ -175,8 +175,6 @@ While server components and actions are cool and cutting-edge, they introduce a
- Prefer [React Query](https://tanstack.com/query/latest/docs/framework/react/overview) for server state, colocated near consumers (see [state colocation](https://kentcdodds.com/blog/state-colocation-will-make-your-react-app-faster))
- Co-locate UI state inside components/hooks; keep global state minimal
- Avoid `useMemo` and `useCallback` unless you have a measured performance issue
- Do not abuse `useEffect`; prefer state colocation and derive values directly when possible
### Styling and components
@@ -551,48 +549,9 @@ Files:
Types:
- Prefer `interface` for object shapes
- Component props should be `interface Props { ... }` (not exported)
- Component props should be `interface Props { ... }`
- Only use specific exported names (e.g., `export interface MyComponentProps`) when the interface needs to be used outside the component
- Keep type definitions inline with the component - do not create separate `types.ts` files unless types are shared across multiple files
- Use precise types; avoid `any` and unsafe casts
**Props naming examples:**
```tsx
// ✅ Good - internal props, not exported
interface Props {
title: string;
onClose: () => void;
}
export function Modal({ title, onClose }: Props) {
// ...
}
// ✅ Good - exported when needed externally
export interface ModalProps {
title: string;
onClose: () => void;
}
export function Modal({ title, onClose }: ModalProps) {
// ...
}
// ❌ Bad - unnecessarily specific name for internal use
interface ModalComponentProps {
title: string;
onClose: () => void;
}
// ❌ Bad - separate types.ts file for single component
// types.ts
export interface ModalProps { ... }
// Modal.tsx
import type { ModalProps } from './types';
```
Parameters:
- If more than one parameter is needed, pass a single `Args` object for clarity

View File

@@ -16,12 +16,6 @@ export default defineConfig({
client: "react-query", client: "react-query",
httpClient: "fetch", httpClient: "fetch",
indexFiles: false, indexFiles: false,
mock: {
type: "msw",
baseUrl: "http://localhost:3000/api/proxy",
generateEachHttpStatus: true,
delay: 0,
},
override: {
mutator: {
path: "./mutators/custom-mutator.ts",

View File

@@ -15,8 +15,6 @@
"types": "tsc --noEmit", "types": "tsc --noEmit",
"test": "NEXT_PUBLIC_PW_TEST=true next build --turbo && playwright test", "test": "NEXT_PUBLIC_PW_TEST=true next build --turbo && playwright test",
"test-ui": "NEXT_PUBLIC_PW_TEST=true next build --turbo && playwright test --ui", "test-ui": "NEXT_PUBLIC_PW_TEST=true next build --turbo && playwright test --ui",
"test:unit": "vitest run",
"test:unit:watch": "vitest",
"test:no-build": "playwright test", "test:no-build": "playwright test",
"gentests": "playwright codegen http://localhost:3000", "gentests": "playwright codegen http://localhost:3000",
"storybook": "storybook dev -p 6006", "storybook": "storybook dev -p 6006",
@@ -120,7 +118,6 @@
},
"devDependencies": {
"@chromatic-com/storybook": "4.1.2",
"happy-dom": "20.3.4",
"@opentelemetry/instrumentation": "0.209.0", "@opentelemetry/instrumentation": "0.209.0",
"@playwright/test": "1.56.1", "@playwright/test": "1.56.1",
"@storybook/addon-a11y": "9.1.5", "@storybook/addon-a11y": "9.1.5",
@@ -130,8 +127,6 @@
"@storybook/nextjs": "9.1.5", "@storybook/nextjs": "9.1.5",
"@tanstack/eslint-plugin-query": "5.91.2", "@tanstack/eslint-plugin-query": "5.91.2",
"@tanstack/react-query-devtools": "5.90.2", "@tanstack/react-query-devtools": "5.90.2",
"@testing-library/dom": "10.4.1",
"@testing-library/react": "16.3.2",
"@types/canvas-confetti": "1.9.0", "@types/canvas-confetti": "1.9.0",
"@types/lodash": "4.17.20", "@types/lodash": "4.17.20",
"@types/negotiator": "0.6.4", "@types/negotiator": "0.6.4",
@@ -140,7 +135,6 @@
"@types/react-dom": "18.3.5", "@types/react-dom": "18.3.5",
"@types/react-modal": "3.16.3", "@types/react-modal": "3.16.3",
"@types/react-window": "1.8.8", "@types/react-window": "1.8.8",
"@vitejs/plugin-react": "5.1.2",
"axe-playwright": "2.2.2", "axe-playwright": "2.2.2",
"chromatic": "13.3.3", "chromatic": "13.3.3",
"concurrently": "9.2.1", "concurrently": "9.2.1",
@@ -159,9 +153,7 @@
"require-in-the-middle": "8.0.1", "require-in-the-middle": "8.0.1",
"storybook": "9.1.5", "storybook": "9.1.5",
"tailwindcss": "3.4.17", "tailwindcss": "3.4.17",
"typescript": "5.9.3", "typescript": "5.9.3"
"vite-tsconfig-paths": "6.0.4",
"vitest": "4.0.17"
},
"msw": {
"workerDirectory": [

File diff suppressed because it is too large.

View File

@@ -1,58 +0,0 @@
"use client";
import { LoadingSpinner } from "@/components/atoms/LoadingSpinner/LoadingSpinner";
import { Text } from "@/components/atoms/Text/Text";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { useRouter } from "next/navigation";
import { useEffect, useRef } from "react";
const LOGOUT_REDIRECT_DELAY_MS = 400;
function wait(ms: number): Promise<void> {
return new Promise(function resolveAfterDelay(resolve) {
setTimeout(resolve, ms);
});
}
export default function LogoutPage() {
const { logOut } = useSupabase();
const { toast } = useToast();
const router = useRouter();
const hasStartedRef = useRef(false);
useEffect(
function handleLogoutEffect() {
if (hasStartedRef.current) return;
hasStartedRef.current = true;
async function runLogout() {
try {
await logOut();
} catch {
toast({
title: "Failed to log out. Redirecting to login.",
variant: "destructive",
});
} finally {
await wait(LOGOUT_REDIRECT_DELAY_MS);
router.replace("/login");
}
}
void runLogout();
},
[logOut, router, toast],
);
return (
<div className="flex min-h-screen items-center justify-center px-4">
<div className="flex flex-col items-center justify-center gap-4 py-8">
<LoadingSpinner size="large" />
<Text variant="body" className="text-center">
Logging you out...
</Text>
</div>
</div>
);
}

View File

@@ -9,7 +9,7 @@ export async function GET(request: Request) {
const { searchParams, origin } = new URL(request.url); const { searchParams, origin } = new URL(request.url);
const code = searchParams.get("code"); const code = searchParams.get("code");
let next = "/"; let next = "/marketplace";
if (code) { if (code) {
const supabase = await getServerSupabase(); const supabase = await getServerSupabase();

View File

@@ -86,7 +86,6 @@ export function FloatingSafeModeToggle({
const { const {
currentHITLSafeMode, currentHITLSafeMode,
showHITLToggle, showHITLToggle,
isHITLStateUndetermined,
handleHITLToggle, handleHITLToggle,
currentSensitiveActionSafeMode, currentSensitiveActionSafeMode,
showSensitiveActionToggle, showSensitiveActionToggle,
@@ -99,16 +98,9 @@ export function FloatingSafeModeToggle({
return null; return null;
} }
const showHITL = showHITLToggle && !isHITLStateUndetermined;
const showSensitive = showSensitiveActionToggle;
if (!showHITL && !showSensitive) {
return null;
}
return ( return (
<div className={cn("fixed z-50 flex flex-col gap-2", className)}> <div className={cn("fixed z-50 flex flex-col gap-2", className)}>
{showHITL && ( {showHITLToggle && (
<SafeModeButton <SafeModeButton
isEnabled={currentHITLSafeMode} isEnabled={currentHITLSafeMode}
label="Human in the loop block approval" label="Human in the loop block approval"
@@ -119,7 +111,7 @@ export function FloatingSafeModeToggle({
fullWidth={fullWidth} fullWidth={fullWidth}
/> />
)} )}
{showSensitive && ( {showSensitiveActionToggle && (
<SafeModeButton <SafeModeButton
isEnabled={currentSensitiveActionSafeMode} isEnabled={currentSensitiveActionSafeMode}
label="Sensitive actions blocks approval" label="Sensitive actions blocks approval"

View File

@@ -0,0 +1,134 @@
"use client";
import { Button } from "@/components/atoms/Button/Button";
import { Text } from "@/components/atoms/Text/Text";
import { cn } from "@/lib/utils";
import { List } from "@phosphor-icons/react";
import React, { useState } from "react";
import { ChatContainer } from "./components/ChatContainer/ChatContainer";
import { ChatErrorState } from "./components/ChatErrorState/ChatErrorState";
import { ChatLoadingState } from "./components/ChatLoadingState/ChatLoadingState";
import { SessionsDrawer } from "./components/SessionsDrawer/SessionsDrawer";
import { useChat } from "./useChat";
export interface ChatProps {
className?: string;
headerTitle?: React.ReactNode;
showHeader?: boolean;
showSessionInfo?: boolean;
showNewChatButton?: boolean;
onNewChat?: () => void;
headerActions?: React.ReactNode;
}
export function Chat({
className,
headerTitle = "AutoGPT Copilot",
showHeader = true,
showSessionInfo = true,
showNewChatButton = true,
onNewChat,
headerActions,
}: ChatProps) {
const {
messages,
isLoading,
isCreating,
error,
sessionId,
createSession,
clearSession,
loadSession,
} = useChat();
const [isSessionsDrawerOpen, setIsSessionsDrawerOpen] = useState(false);
const handleNewChat = () => {
clearSession();
onNewChat?.();
};
const handleSelectSession = async (sessionId: string) => {
try {
await loadSession(sessionId);
} catch (err) {
console.error("Failed to load session:", err);
}
};
return (
<div className={cn("flex h-full flex-col", className)}>
{/* Header */}
{showHeader && (
<header className="shrink-0 border-t border-zinc-200 bg-white p-3">
<div className="flex items-center justify-between">
<div className="flex items-center gap-3">
<button
aria-label="View sessions"
onClick={() => setIsSessionsDrawerOpen(true)}
className="flex size-8 items-center justify-center rounded hover:bg-zinc-100"
>
<List width="1.25rem" height="1.25rem" />
</button>
{typeof headerTitle === "string" ? (
<Text variant="h2" className="text-lg font-semibold">
{headerTitle}
</Text>
) : (
headerTitle
)}
</div>
<div className="flex items-center gap-3">
{showSessionInfo && sessionId && (
<>
{showNewChatButton && (
<Button
variant="outline"
size="small"
onClick={handleNewChat}
>
New Chat
</Button>
)}
</>
)}
{headerActions}
</div>
</div>
</header>
)}
{/* Main Content */}
<main className="flex min-h-0 flex-1 flex-col overflow-hidden">
{/* Loading State - show when explicitly loading/creating OR when we don't have a session yet and no error */}
{(isLoading || isCreating || (!sessionId && !error)) && (
<ChatLoadingState
message={isCreating ? "Creating session..." : "Loading..."}
/>
)}
{/* Error State */}
{error && !isLoading && (
<ChatErrorState error={error} onRetry={createSession} />
)}
{/* Session Content */}
{sessionId && !isLoading && !error && (
<ChatContainer
sessionId={sessionId}
initialMessages={messages}
className="flex-1"
/>
)}
</main>
{/* Sessions Drawer */}
<SessionsDrawer
isOpen={isSessionsDrawerOpen}
onClose={() => setIsSessionsDrawerOpen(false)}
onSelectSession={handleSelectSession}
currentSessionId={sessionId}
/>
</div>
);
}

View File

@@ -21,7 +21,7 @@ export function AuthPromptWidget({
message, message,
sessionId, sessionId,
agentInfo, agentInfo,
returnUrl = "/copilot/chat", returnUrl = "/chat",
className, className,
}: AuthPromptWidgetProps) { }: AuthPromptWidgetProps) {
const router = useRouter(); const router = useRouter();

View File

@@ -0,0 +1,88 @@
import type { SessionDetailResponse } from "@/app/api/__generated__/models/sessionDetailResponse";
import { cn } from "@/lib/utils";
import { useCallback } from "react";
import { usePageContext } from "../../usePageContext";
import { ChatInput } from "../ChatInput/ChatInput";
import { MessageList } from "../MessageList/MessageList";
import { QuickActionsWelcome } from "../QuickActionsWelcome/QuickActionsWelcome";
import { useChatContainer } from "./useChatContainer";
export interface ChatContainerProps {
sessionId: string | null;
initialMessages: SessionDetailResponse["messages"];
className?: string;
}
export function ChatContainer({
sessionId,
initialMessages,
className,
}: ChatContainerProps) {
const { messages, streamingChunks, isStreaming, sendMessage } =
useChatContainer({
sessionId,
initialMessages,
});
const { capturePageContext } = usePageContext();
// Wrap sendMessage to automatically capture page context
const sendMessageWithContext = useCallback(
async (content: string, isUserMessage: boolean = true) => {
const context = capturePageContext();
await sendMessage(content, isUserMessage, context);
},
[sendMessage, capturePageContext],
);
const quickActions = [
"Find agents for social media management",
"Show me agents for content creation",
"Help me automate my business",
"What can you help me with?",
];
return (
<div
className={cn("flex h-full min-h-0 flex-col", className)}
style={{
backgroundColor: "#ffffff",
backgroundImage:
"radial-gradient(#e5e5e5 0.5px, transparent 0.5px), radial-gradient(#e5e5e5 0.5px, #ffffff 0.5px)",
backgroundSize: "20px 20px",
backgroundPosition: "0 0, 10px 10px",
}}
>
{/* Messages or Welcome Screen */}
<div className="flex min-h-0 flex-1 flex-col overflow-hidden pb-24">
{messages.length === 0 ? (
<QuickActionsWelcome
title="Welcome to AutoGPT Copilot"
description="Start a conversation to discover and run AI agents."
actions={quickActions}
onActionClick={sendMessageWithContext}
disabled={isStreaming || !sessionId}
/>
) : (
<MessageList
messages={messages}
streamingChunks={streamingChunks}
isStreaming={isStreaming}
onSendMessage={sendMessageWithContext}
className="flex-1"
/>
)}
</div>
{/* Input - Always visible */}
<div className="fixed bottom-0 left-0 right-0 z-50 border-t border-zinc-200 bg-white p-4">
<ChatInput
onSend={sendMessageWithContext}
disabled={isStreaming || !sessionId}
placeholder={
sessionId ? "Type your message..." : "Creating session..."
}
/>
</div>
</div>
);
}
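
The `sendMessageWithContext` wrapper above composes `capturePageContext` with `sendMessage` so every outgoing message carries the current page URL and content. A minimal sketch of that pattern in isolation, assuming a `capture` function returning `{ url, content }` (the helper name `withPageContext` is illustrative, not part of the codebase):

// Hypothetical sketch: wrap a send function so each call also receives page context.
type PageContext = { url: string; content: string };

type SendFn = (
  content: string,
  isUserMessage: boolean,
  context?: PageContext,
) => Promise<void>;

function withPageContext(send: SendFn, capture: () => PageContext): SendFn {
  return async (content, isUserMessage = true) => {
    // Capture the context at send time so it reflects the page the user is on.
    const context = capture();
    await send(content, isUserMessage, context);
  };
}

// Usage (illustrative):
// const sendWithContext = withPageContext(sendMessage, capturePageContext);
// await sendWithContext("Find agents for social media management", true);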

View File

@@ -1,6 +1,6 @@
import { toast } from "sonner"; import { toast } from "sonner";
import { StreamChunk } from "../../useChatStream"; import { StreamChunk } from "../../useChatStream";
import type { HandlerDependencies } from "./handlers"; import type { HandlerDependencies } from "./useChatContainer.handlers";
import { import {
handleError, handleError,
handleLoginNeeded, handleLoginNeeded,
@@ -9,30 +9,12 @@ import {
handleTextEnded, handleTextEnded,
handleToolCallStart, handleToolCallStart,
handleToolResponse, handleToolResponse,
isRegionBlockedError, } from "./useChatContainer.handlers";
} from "./handlers";
export function createStreamEventDispatcher( export function createStreamEventDispatcher(
deps: HandlerDependencies, deps: HandlerDependencies,
): (chunk: StreamChunk) => void { ): (chunk: StreamChunk) => void {
return function dispatchStreamEvent(chunk: StreamChunk): void { return function dispatchStreamEvent(chunk: StreamChunk): void {
if (
chunk.type === "text_chunk" ||
chunk.type === "tool_call_start" ||
chunk.type === "tool_response" ||
chunk.type === "login_needed" ||
chunk.type === "need_login" ||
chunk.type === "error"
) {
if (!deps.hasResponseRef.current) {
console.info("[ChatStream] First response chunk:", {
type: chunk.type,
sessionId: deps.sessionId,
});
}
deps.hasResponseRef.current = true;
}
switch (chunk.type) { switch (chunk.type) {
case "text_chunk": case "text_chunk":
handleTextChunk(chunk, deps); handleTextChunk(chunk, deps);
@@ -56,23 +38,15 @@ export function createStreamEventDispatcher(
break; break;
case "stream_end": case "stream_end":
console.info("[ChatStream] Stream ended:", {
sessionId: deps.sessionId,
hasResponse: deps.hasResponseRef.current,
chunkCount: deps.streamingChunksRef.current.length,
});
handleStreamEnd(chunk, deps); handleStreamEnd(chunk, deps);
break; break;
case "error": case "error":
const isRegionBlocked = isRegionBlockedError(chunk);
handleError(chunk, deps); handleError(chunk, deps);
// Show toast at dispatcher level to avoid circular dependencies // Show toast at dispatcher level to avoid circular dependencies
if (!isRegionBlocked) { toast.error("Chat Error", {
toast.error("Chat Error", { description: chunk.message || chunk.content || "An error occurred",
description: chunk.message || chunk.content || "An error occurred", });
});
}
break; break;
case "usage": case "usage":
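
The dispatcher above maps each `StreamChunk.type` to a handler and raises the error toast at this level to avoid circular dependencies between handlers and UI. A simplified sketch of the same dispatch-by-chunk-type shape, with placeholder types that only approximate the real `StreamChunk`:

// Simplified sketch of the dispatch-by-chunk-type pattern used above.
type Chunk = {
  type: "text_chunk" | "stream_end" | "error";
  content?: string;
  message?: string;
};

function createDispatcher(handlers: {
  onText: (chunk: Chunk) => void;
  onEnd: (chunk: Chunk) => void;
  onError: (chunk: Chunk) => void;
}): (chunk: Chunk) => void {
  return (chunk) => {
    switch (chunk.type) {
      case "text_chunk":
        handlers.onText(chunk);
        break;
      case "stream_end":
        handlers.onEnd(chunk);
        break;
      case "error":
        // Cross-cutting concerns (e.g. toasts) can live here, outside the handlers.
        handlers.onError(chunk);
        break;
    }
  };
}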

View File

@@ -1,33 +1,6 @@
import { SessionKey, sessionStorage } from "@/services/storage/session-storage";
import type { ToolResult } from "@/types/chat"; import type { ToolResult } from "@/types/chat";
import type { ChatMessageData } from "../ChatMessage/useChatMessage"; import type { ChatMessageData } from "../ChatMessage/useChatMessage";
export function hasSentInitialPrompt(sessionId: string): boolean {
try {
const sent = JSON.parse(
sessionStorage.get(SessionKey.CHAT_SENT_INITIAL_PROMPTS) || "{}",
);
return sent[sessionId] === true;
} catch {
return false;
}
}
export function markInitialPromptSent(sessionId: string): void {
try {
const sent = JSON.parse(
sessionStorage.get(SessionKey.CHAT_SENT_INITIAL_PROMPTS) || "{}",
);
sent[sessionId] = true;
sessionStorage.set(
SessionKey.CHAT_SENT_INITIAL_PROMPTS,
JSON.stringify(sent),
);
} catch {
// Ignore storage errors
}
}
export function removePageContext(content: string): string { export function removePageContext(content: string): string {
// Remove "Page URL: ..." pattern at start of line (case insensitive, handles various formats) // Remove "Page URL: ..." pattern at start of line (case insensitive, handles various formats)
let cleaned = content.replace(/^\s*Page URL:\s*[^\n\r]*/gim, ""); let cleaned = content.replace(/^\s*Page URL:\s*[^\n\r]*/gim, "");
@@ -234,22 +207,12 @@ export function parseToolResponse(
if (responseType === "setup_requirements") { if (responseType === "setup_requirements") {
return null; return null;
} }
if (responseType === "understanding_updated") {
return {
type: "tool_response",
toolId,
toolName,
result: (parsedResult || result) as ToolResult,
success: true,
timestamp: timestamp || new Date(),
};
}
} }
return { return {
type: "tool_response", type: "tool_response",
toolId, toolId,
toolName, toolName,
result: parsedResult ? (parsedResult as ToolResult) : result, result,
success: true, success: true,
timestamp: timestamp || new Date(), timestamp: timestamp || new Date(),
}; };
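
The `removePageContext` helper in the hunk above strips injected "Page URL: ..." lines from stored user messages before display. A rough usage sketch, exercising only the first replacement step visible in this hunk (the rest of the function lies outside the diff, and the trailing `trim()` here is an assumption):

// Illustrative input/output for the visible cleanup step.
const stored =
  "Page URL: https://example.com/marketplace\nFind agents for content creation";

// The hunk shows: content.replace(/^\s*Page URL:\s*[^\n\r]*/gim, "")
const cleaned = stored.replace(/^\s*Page URL:\s*[^\n\r]*/gim, "").trim();

console.log(cleaned); // "Find agents for content creation"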

View File

@@ -7,30 +7,15 @@ import {
parseToolResponse, parseToolResponse,
} from "./helpers"; } from "./helpers";
function isToolCallMessage(
message: ChatMessageData,
): message is Extract<ChatMessageData, { type: "tool_call" }> {
return message.type === "tool_call";
}
export interface HandlerDependencies { export interface HandlerDependencies {
setHasTextChunks: Dispatch<SetStateAction<boolean>>; setHasTextChunks: Dispatch<SetStateAction<boolean>>;
setStreamingChunks: Dispatch<SetStateAction<string[]>>; setStreamingChunks: Dispatch<SetStateAction<string[]>>;
streamingChunksRef: MutableRefObject<string[]>; streamingChunksRef: MutableRefObject<string[]>;
hasResponseRef: MutableRefObject<boolean>;
setMessages: Dispatch<SetStateAction<ChatMessageData[]>>; setMessages: Dispatch<SetStateAction<ChatMessageData[]>>;
setIsStreamingInitiated: Dispatch<SetStateAction<boolean>>; setIsStreamingInitiated: Dispatch<SetStateAction<boolean>>;
setIsRegionBlockedModalOpen: Dispatch<SetStateAction<boolean>>;
sessionId: string; sessionId: string;
} }
export function isRegionBlockedError(chunk: StreamChunk): boolean {
if (chunk.code === "MODEL_NOT_AVAILABLE_REGION") return true;
const message = chunk.message || chunk.content;
if (typeof message !== "string") return false;
return message.toLowerCase().includes("not available in your region");
}
export function handleTextChunk(chunk: StreamChunk, deps: HandlerDependencies) { export function handleTextChunk(chunk: StreamChunk, deps: HandlerDependencies) {
if (!chunk.content) return; if (!chunk.content) return;
deps.setHasTextChunks(true); deps.setHasTextChunks(true);
@@ -45,17 +30,16 @@ export function handleTextEnded(
_chunk: StreamChunk, _chunk: StreamChunk,
deps: HandlerDependencies, deps: HandlerDependencies,
) { ) {
console.log("[Text Ended] Saving streamed text as assistant message");
const completedText = deps.streamingChunksRef.current.join(""); const completedText = deps.streamingChunksRef.current.join("");
if (completedText.trim()) { if (completedText.trim()) {
deps.setMessages((prev) => { const assistantMessage: ChatMessageData = {
const assistantMessage: ChatMessageData = { type: "message",
type: "message", role: "assistant",
role: "assistant", content: completedText,
content: completedText, timestamp: new Date(),
timestamp: new Date(), };
}; deps.setMessages((prev) => [...prev, assistantMessage]);
return [...prev, assistantMessage];
});
} }
deps.setStreamingChunks([]); deps.setStreamingChunks([]);
deps.streamingChunksRef.current = []; deps.streamingChunksRef.current = [];
@@ -66,45 +50,30 @@ export function handleToolCallStart(
chunk: StreamChunk, chunk: StreamChunk,
deps: HandlerDependencies, deps: HandlerDependencies,
) { ) {
const toolCallMessage: Extract<ChatMessageData, { type: "tool_call" }> = { const toolCallMessage: ChatMessageData = {
type: "tool_call", type: "tool_call",
toolId: chunk.tool_id || `tool-${Date.now()}-${chunk.idx || 0}`, toolId: chunk.tool_id || `tool-${Date.now()}-${chunk.idx || 0}`,
toolName: chunk.tool_name || "Executing", toolName: chunk.tool_name || "Executing...",
arguments: chunk.arguments || {}, arguments: chunk.arguments || {},
timestamp: new Date(), timestamp: new Date(),
}; };
deps.setMessages((prev) => [...prev, toolCallMessage]);
function updateToolCallMessages(prev: ChatMessageData[]) { console.log("[Tool Call Start]", {
const existingIndex = prev.findIndex(function findToolCallIndex(msg) { toolId: toolCallMessage.toolId,
return isToolCallMessage(msg) && msg.toolId === toolCallMessage.toolId; toolName: toolCallMessage.toolName,
}); timestamp: new Date().toISOString(),
if (existingIndex === -1) { });
return [...prev, toolCallMessage];
}
const nextMessages = [...prev];
const existing = nextMessages[existingIndex];
if (!isToolCallMessage(existing)) return prev;
const nextArguments =
toolCallMessage.arguments &&
Object.keys(toolCallMessage.arguments).length > 0
? toolCallMessage.arguments
: existing.arguments;
nextMessages[existingIndex] = {
...existing,
toolName: toolCallMessage.toolName || existing.toolName,
arguments: nextArguments,
timestamp: toolCallMessage.timestamp,
};
return nextMessages;
}
deps.setMessages(updateToolCallMessages);
} }
export function handleToolResponse( export function handleToolResponse(
chunk: StreamChunk, chunk: StreamChunk,
deps: HandlerDependencies, deps: HandlerDependencies,
) { ) {
console.log("[Tool Response] Received:", {
toolId: chunk.tool_id,
toolName: chunk.tool_name,
timestamp: new Date().toISOString(),
});
let toolName = chunk.tool_name || "unknown"; let toolName = chunk.tool_name || "unknown";
if (!chunk.tool_name || chunk.tool_name === "unknown") { if (!chunk.tool_name || chunk.tool_name === "unknown") {
deps.setMessages((prev) => { deps.setMessages((prev) => {
@@ -158,15 +127,22 @@ export function handleToolResponse(
const toolCallIndex = prev.findIndex( const toolCallIndex = prev.findIndex(
(msg) => msg.type === "tool_call" && msg.toolId === chunk.tool_id, (msg) => msg.type === "tool_call" && msg.toolId === chunk.tool_id,
); );
const hasResponse = prev.some(
(msg) => msg.type === "tool_response" && msg.toolId === chunk.tool_id,
);
if (hasResponse) return prev;
if (toolCallIndex !== -1) { if (toolCallIndex !== -1) {
const newMessages = [...prev]; const newMessages = [...prev];
newMessages.splice(toolCallIndex + 1, 0, responseMessage); newMessages[toolCallIndex] = responseMessage;
console.log(
"[Tool Response] Replaced tool_call with matching tool_id:",
chunk.tool_id,
"at index:",
toolCallIndex,
);
return newMessages; return newMessages;
} }
console.warn(
"[Tool Response] No tool_call found with tool_id:",
chunk.tool_id,
"appending instead",
);
return [...prev, responseMessage]; return [...prev, responseMessage];
}); });
} }
@@ -191,38 +167,55 @@ export function handleStreamEnd(
deps: HandlerDependencies, deps: HandlerDependencies,
) { ) {
const completedContent = deps.streamingChunksRef.current.join(""); const completedContent = deps.streamingChunksRef.current.join("");
if (!completedContent.trim() && !deps.hasResponseRef.current) { // Only save message if there are uncommitted chunks
deps.setMessages((prev) => [ // (text_ended already saved if there were tool calls)
...prev,
{
type: "message",
role: "assistant",
content: "No response received. Please try again.",
timestamp: new Date(),
},
]);
}
if (completedContent.trim()) { if (completedContent.trim()) {
console.log(
"[Stream End] Saving remaining streamed text as assistant message",
);
const assistantMessage: ChatMessageData = { const assistantMessage: ChatMessageData = {
type: "message", type: "message",
role: "assistant", role: "assistant",
content: completedContent, content: completedContent,
timestamp: new Date(), timestamp: new Date(),
}; };
deps.setMessages((prev) => [...prev, assistantMessage]); deps.setMessages((prev) => {
const updated = [...prev, assistantMessage];
console.log("[Stream End] Final state:", {
localMessages: updated.map((m) => ({
type: m.type,
...(m.type === "message" && {
role: m.role,
contentLength: m.content.length,
}),
...(m.type === "tool_call" && {
toolId: m.toolId,
toolName: m.toolName,
}),
...(m.type === "tool_response" && {
toolId: m.toolId,
toolName: m.toolName,
success: m.success,
}),
})),
streamingChunks: deps.streamingChunksRef.current,
timestamp: new Date().toISOString(),
});
return updated;
});
} else {
console.log("[Stream End] No uncommitted chunks, message already saved");
} }
deps.setStreamingChunks([]); deps.setStreamingChunks([]);
deps.streamingChunksRef.current = []; deps.streamingChunksRef.current = [];
deps.setHasTextChunks(false); deps.setHasTextChunks(false);
deps.setIsStreamingInitiated(false); deps.setIsStreamingInitiated(false);
console.log("[Stream End] Stream complete, messages in local state");
} }
export function handleError(chunk: StreamChunk, deps: HandlerDependencies) { export function handleError(chunk: StreamChunk, deps: HandlerDependencies) {
const errorMessage = chunk.message || chunk.content || "An error occurred"; const errorMessage = chunk.message || chunk.content || "An error occurred";
console.error("Stream error:", errorMessage); console.error("Stream error:", errorMessage);
if (isRegionBlockedError(chunk)) {
deps.setIsRegionBlockedModalOpen(true);
}
deps.setIsStreamingInitiated(false); deps.setIsStreamingInitiated(false);
deps.setHasTextChunks(false); deps.setHasTextChunks(false);
deps.setStreamingChunks([]); deps.setStreamingChunks([]);
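
The `handleTextEnded`/`handleStreamEnd` changes above follow one pattern: streamed text accumulates in `streamingChunksRef`, and on an end event the buffer is joined and committed as a single assistant message only if it still holds uncommitted text. A minimal sketch of that flush step, with a simplified placeholder message type rather than the real `ChatMessageData`:

// Hypothetical sketch of the chunk-flush pattern: join the buffer, commit once, reset.
type SimpleMessage = { role: "assistant" | "user"; content: string; timestamp: Date };

function flushChunks(
  buffer: string[],
  commit: (msg: SimpleMessage) => void,
): string[] {
  const text = buffer.join("");
  // Only commit when there is uncommitted text; text_ended may have flushed already.
  if (text.trim()) {
    commit({ role: "assistant", content: text, timestamp: new Date() });
  }
  return []; // reset the buffer either way
}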

View File

@@ -1,17 +1,14 @@
import type { SessionDetailResponse } from "@/app/api/__generated__/models/sessionDetailResponse"; import type { SessionDetailResponse } from "@/app/api/__generated__/models/sessionDetailResponse";
import { useCallback, useEffect, useMemo, useRef, useState } from "react"; import { useCallback, useMemo, useRef, useState } from "react";
import { toast } from "sonner"; import { toast } from "sonner";
import { useChatStream } from "../../useChatStream"; import { useChatStream } from "../../useChatStream";
import { usePageContext } from "../../usePageContext";
import type { ChatMessageData } from "../ChatMessage/useChatMessage"; import type { ChatMessageData } from "../ChatMessage/useChatMessage";
import { createStreamEventDispatcher } from "./createStreamEventDispatcher"; import { createStreamEventDispatcher } from "./createStreamEventDispatcher";
import { import {
createUserMessage, createUserMessage,
filterAuthMessages, filterAuthMessages,
hasSentInitialPrompt,
isToolCallArray, isToolCallArray,
isValidMessage, isValidMessage,
markInitialPromptSent,
parseToolResponse, parseToolResponse,
removePageContext, removePageContext,
} from "./helpers"; } from "./helpers";
@@ -19,45 +16,20 @@ import {
interface Args { interface Args {
sessionId: string | null; sessionId: string | null;
initialMessages: SessionDetailResponse["messages"]; initialMessages: SessionDetailResponse["messages"];
initialPrompt?: string;
} }
export function useChatContainer({ export function useChatContainer({ sessionId, initialMessages }: Args) {
sessionId,
initialMessages,
initialPrompt,
}: Args) {
const [messages, setMessages] = useState<ChatMessageData[]>([]); const [messages, setMessages] = useState<ChatMessageData[]>([]);
const [streamingChunks, setStreamingChunks] = useState<string[]>([]); const [streamingChunks, setStreamingChunks] = useState<string[]>([]);
const [hasTextChunks, setHasTextChunks] = useState(false); const [hasTextChunks, setHasTextChunks] = useState(false);
const [isStreamingInitiated, setIsStreamingInitiated] = useState(false); const [isStreamingInitiated, setIsStreamingInitiated] = useState(false);
const [isRegionBlockedModalOpen, setIsRegionBlockedModalOpen] =
useState(false);
const hasResponseRef = useRef(false);
const streamingChunksRef = useRef<string[]>([]); const streamingChunksRef = useRef<string[]>([]);
const previousSessionIdRef = useRef<string | null>(null); const { error, sendMessage: sendStreamMessage } = useChatStream();
const {
error,
sendMessage: sendStreamMessage,
stopStreaming,
} = useChatStream();
const isStreaming = isStreamingInitiated || hasTextChunks; const isStreaming = isStreamingInitiated || hasTextChunks;
useEffect(() => {
if (sessionId !== previousSessionIdRef.current) {
stopStreaming(previousSessionIdRef.current ?? undefined, true);
previousSessionIdRef.current = sessionId;
setMessages([]);
setStreamingChunks([]);
streamingChunksRef.current = [];
setHasTextChunks(false);
setIsStreamingInitiated(false);
hasResponseRef.current = false;
}
}, [sessionId, stopStreaming]);
const allMessages = useMemo(() => { const allMessages = useMemo(() => {
const processedInitialMessages: ChatMessageData[] = []; const processedInitialMessages: ChatMessageData[] = [];
// Map to track tool calls by their ID so we can look up tool names for tool responses
const toolCallMap = new Map<string, string>(); const toolCallMap = new Map<string, string>();
for (const msg of initialMessages) { for (const msg of initialMessages) {
@@ -73,9 +45,13 @@ export function useChatContainer({
? new Date(msg.timestamp as string) ? new Date(msg.timestamp as string)
: undefined; : undefined;
// Remove page context from user messages when loading existing sessions
if (role === "user") { if (role === "user") {
content = removePageContext(content); content = removePageContext(content);
if (!content.trim()) continue; // Skip user messages that become empty after removing page context
if (!content.trim()) {
continue;
}
processedInitialMessages.push({ processedInitialMessages.push({
type: "message", type: "message",
role: "user", role: "user",
@@ -85,15 +61,19 @@ export function useChatContainer({
continue; continue;
} }
// Handle assistant messages first (before tool messages) to build tool call map
if (role === "assistant") { if (role === "assistant") {
// Strip <thinking> tags from content
content = content content = content
.replace(/<thinking>[\s\S]*?<\/thinking>/gi, "") .replace(/<thinking>[\s\S]*?<\/thinking>/gi, "")
.trim(); .trim();
// If assistant has tool calls, create tool_call messages for each
if (toolCalls && isToolCallArray(toolCalls) && toolCalls.length > 0) { if (toolCalls && isToolCallArray(toolCalls) && toolCalls.length > 0) {
for (const toolCall of toolCalls) { for (const toolCall of toolCalls) {
const toolName = toolCall.function.name; const toolName = toolCall.function.name;
const toolId = toolCall.id; const toolId = toolCall.id;
// Store tool name for later lookup
toolCallMap.set(toolId, toolName); toolCallMap.set(toolId, toolName);
try { try {
@@ -116,6 +96,7 @@ export function useChatContainer({
}); });
} }
} }
// Only add assistant message if there's content after stripping thinking tags
if (content.trim()) { if (content.trim()) {
processedInitialMessages.push({ processedInitialMessages.push({
type: "message", type: "message",
@@ -125,6 +106,7 @@ export function useChatContainer({
}); });
} }
} else if (content.trim()) { } else if (content.trim()) {
// Assistant message without tool calls, but with content
processedInitialMessages.push({ processedInitialMessages.push({
type: "message", type: "message",
role: "assistant", role: "assistant",
@@ -135,6 +117,7 @@ export function useChatContainer({
continue; continue;
} }
// Handle tool messages - look up tool name from tool call map
if (role === "tool") { if (role === "tool") {
const toolCallId = (msg.tool_call_id as string) || ""; const toolCallId = (msg.tool_call_id as string) || "";
const toolName = toolCallMap.get(toolCallId) || "unknown"; const toolName = toolCallMap.get(toolCallId) || "unknown";
@@ -150,6 +133,7 @@ export function useChatContainer({
continue; continue;
} }
// Handle other message types (system, etc.)
if (content.trim()) { if (content.trim()) {
processedInitialMessages.push({ processedInitialMessages.push({
type: "message", type: "message",
@@ -170,10 +154,9 @@ export function useChatContainer({
context?: { url: string; content: string }, context?: { url: string; content: string },
) { ) {
if (!sessionId) { if (!sessionId) {
console.error("[useChatContainer] Cannot send message: no session ID"); console.error("Cannot send message: no session ID");
return; return;
} }
setIsRegionBlockedModalOpen(false);
if (isUserMessage) { if (isUserMessage) {
const userMessage = createUserMessage(content); const userMessage = createUserMessage(content);
setMessages((prev) => [...filterAuthMessages(prev), userMessage]); setMessages((prev) => [...filterAuthMessages(prev), userMessage]);
@@ -184,19 +167,14 @@ export function useChatContainer({
streamingChunksRef.current = []; streamingChunksRef.current = [];
setHasTextChunks(false); setHasTextChunks(false);
setIsStreamingInitiated(true); setIsStreamingInitiated(true);
hasResponseRef.current = false;
const dispatcher = createStreamEventDispatcher({ const dispatcher = createStreamEventDispatcher({
setHasTextChunks, setHasTextChunks,
setStreamingChunks, setStreamingChunks,
streamingChunksRef, streamingChunksRef,
hasResponseRef,
setMessages, setMessages,
setIsRegionBlockedModalOpen,
sessionId, sessionId,
setIsStreamingInitiated, setIsStreamingInitiated,
}); });
try { try {
await sendStreamMessage( await sendStreamMessage(
sessionId, sessionId,
@@ -206,12 +184,8 @@ export function useChatContainer({
context, context,
); );
} catch (err) { } catch (err) {
console.error("[useChatContainer] Failed to send message:", err); console.error("Failed to send message:", err);
setIsStreamingInitiated(false); setIsStreamingInitiated(false);
// Don't show error toast for AbortError (expected during cleanup)
if (err instanceof Error && err.name === "AbortError") return;
const errorMessage = const errorMessage =
err instanceof Error ? err.message : "Failed to send message"; err instanceof Error ? err.message : "Failed to send message";
toast.error("Failed to send message", { toast.error("Failed to send message", {
@@ -222,63 +196,11 @@ export function useChatContainer({
[sessionId, sendStreamMessage], [sessionId, sendStreamMessage],
); );
const handleStopStreaming = useCallback(() => {
stopStreaming();
setStreamingChunks([]);
streamingChunksRef.current = [];
setHasTextChunks(false);
setIsStreamingInitiated(false);
}, [stopStreaming]);
const { capturePageContext } = usePageContext();
// Send initial prompt if provided (for new sessions from homepage)
useEffect(
function handleInitialPrompt() {
if (!initialPrompt || !sessionId) return;
if (initialMessages.length > 0) return;
if (hasSentInitialPrompt(sessionId)) return;
markInitialPromptSent(sessionId);
const context = capturePageContext();
sendMessage(initialPrompt, true, context);
},
[
initialPrompt,
sessionId,
initialMessages.length,
sendMessage,
capturePageContext,
],
);
async function sendMessageWithContext(
content: string,
isUserMessage: boolean = true,
) {
const context = capturePageContext();
await sendMessage(content, isUserMessage, context);
}
function handleRegionModalOpenChange(open: boolean) {
setIsRegionBlockedModalOpen(open);
}
function handleRegionModalClose() {
setIsRegionBlockedModalOpen(false);
}
return { return {
messages: allMessages, messages: allMessages,
streamingChunks, streamingChunks,
isStreaming, isStreaming,
error, error,
isRegionBlockedModalOpen,
setIsRegionBlockedModalOpen,
sendMessageWithContext,
handleRegionModalOpenChange,
handleRegionModalClose,
sendMessage, sendMessage,
stopStreaming: handleStopStreaming,
}; };
} }

View File

@@ -0,0 +1,64 @@
import { Input } from "@/components/atoms/Input/Input";
import { cn } from "@/lib/utils";
import { ArrowUpIcon } from "@phosphor-icons/react";
import { useChatInput } from "./useChatInput";
export interface ChatInputProps {
onSend: (message: string) => void;
disabled?: boolean;
placeholder?: string;
className?: string;
}
export function ChatInput({
onSend,
disabled = false,
placeholder = "Type your message...",
className,
}: ChatInputProps) {
const inputId = "chat-input";
const { value, setValue, handleKeyDown, handleSend } = useChatInput({
onSend,
disabled,
maxRows: 5,
inputId,
});
return (
<div className={cn("relative flex-1", className)}>
<Input
id={inputId}
label="Chat message input"
hideLabel
type="textarea"
value={value}
onChange={(e) => setValue(e.target.value)}
onKeyDown={handleKeyDown}
placeholder={placeholder}
disabled={disabled}
rows={1}
wrapperClassName="mb-0 relative"
className="pr-12"
/>
<span id="chat-input-hint" className="sr-only">
Press Enter to send, Shift+Enter for new line
</span>
<button
onClick={handleSend}
disabled={disabled || !value.trim()}
className={cn(
"absolute right-3 top-1/2 flex h-8 w-8 -translate-y-1/2 items-center justify-center rounded-full",
"border border-zinc-800 bg-zinc-800 text-white",
"hover:border-zinc-900 hover:bg-zinc-900",
"disabled:border-zinc-200 disabled:bg-zinc-200 disabled:text-white disabled:opacity-50",
"transition-colors focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-neutral-950",
"disabled:pointer-events-none",
)}
aria-label="Send message"
>
<ArrowUpIcon className="h-3 w-3" weight="bold" />
</button>
</div>
);
}

View File

@@ -0,0 +1,60 @@
import { KeyboardEvent, useCallback, useEffect, useState } from "react";
interface UseChatInputArgs {
onSend: (message: string) => void;
disabled?: boolean;
maxRows?: number;
inputId?: string;
}
export function useChatInput({
onSend,
disabled = false,
maxRows = 5,
inputId = "chat-input",
}: UseChatInputArgs) {
const [value, setValue] = useState("");
useEffect(() => {
const textarea = document.getElementById(inputId) as HTMLTextAreaElement;
if (!textarea) return;
textarea.style.height = "auto";
const lineHeight = parseInt(
window.getComputedStyle(textarea).lineHeight,
10,
);
const maxHeight = lineHeight * maxRows;
const newHeight = Math.min(textarea.scrollHeight, maxHeight);
textarea.style.height = `${newHeight}px`;
textarea.style.overflowY =
textarea.scrollHeight > maxHeight ? "auto" : "hidden";
}, [value, maxRows, inputId]);
const handleSend = useCallback(() => {
if (disabled || !value.trim()) return;
onSend(value.trim());
setValue("");
const textarea = document.getElementById(inputId) as HTMLTextAreaElement;
if (textarea) {
textarea.style.height = "auto";
}
}, [value, onSend, disabled, inputId]);
const handleKeyDown = useCallback(
(event: KeyboardEvent<HTMLInputElement | HTMLTextAreaElement>) => {
if (event.key === "Enter" && !event.shiftKey) {
event.preventDefault();
handleSend();
}
// Shift+Enter allows default behavior (new line) - no need to handle explicitly
},
[handleSend],
);
return {
value,
setValue,
handleKeyDown,
handleSend,
};
}
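
The effect above auto-grows the textarea by measuring `scrollHeight` against `lineHeight * maxRows`, looking the element up by `inputId`. A minimal standalone usage sketch, assuming a bare `<textarea>` instead of the project's `Input` atom (the component name and id below are illustrative):

// Illustrative consumer of useChatInput with a plain textarea.
import { useChatInput } from "./useChatInput";

function BareChatInput({ onSend }: { onSend: (message: string) => void }) {
  const { value, setValue, handleKeyDown, handleSend } = useChatInput({
    onSend,
    maxRows: 5,
    inputId: "bare-chat-input", // must match the element id so the resize effect can find it
  });

  return (
    <div>
      <textarea
        id="bare-chat-input"
        value={value}
        onChange={(e) => setValue(e.target.value)}
        onKeyDown={handleKeyDown} // Enter sends, Shift+Enter inserts a newline
        rows={1}
      />
      <button onClick={handleSend} disabled={!value.trim()}>
        Send
      </button>
    </div>
  );
}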

View File

@@ -1,65 +1,48 @@
"use client"; "use client";
import { useGetV2GetUserProfile } from "@/app/api/__generated__/endpoints/store/store";
import Avatar, {
AvatarFallback,
AvatarImage,
} from "@/components/atoms/Avatar/Avatar";
import { Button } from "@/components/atoms/Button/Button"; import { Button } from "@/components/atoms/Button/Button";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase"; import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { cn } from "@/lib/utils"; import { cn } from "@/lib/utils";
import { import {
ArrowsClockwiseIcon, ArrowClockwise,
CheckCircleIcon, CheckCircleIcon,
CheckIcon, CheckIcon,
CopyIcon, CopyIcon,
RobotIcon,
} from "@phosphor-icons/react"; } from "@phosphor-icons/react";
import { useRouter } from "next/navigation"; import { useRouter } from "next/navigation";
import { useCallback, useState } from "react"; import { useCallback, useState } from "react";
import { getToolActionPhrase } from "../../helpers";
import { AgentCarouselMessage } from "../AgentCarouselMessage/AgentCarouselMessage"; import { AgentCarouselMessage } from "../AgentCarouselMessage/AgentCarouselMessage";
import { AIChatBubble } from "../AIChatBubble/AIChatBubble";
import { AuthPromptWidget } from "../AuthPromptWidget/AuthPromptWidget"; import { AuthPromptWidget } from "../AuthPromptWidget/AuthPromptWidget";
import { ChatCredentialsSetup } from "../ChatCredentialsSetup/ChatCredentialsSetup"; import { ChatCredentialsSetup } from "../ChatCredentialsSetup/ChatCredentialsSetup";
import { ExecutionStartedMessage } from "../ExecutionStartedMessage/ExecutionStartedMessage"; import { ExecutionStartedMessage } from "../ExecutionStartedMessage/ExecutionStartedMessage";
import { MarkdownContent } from "../MarkdownContent/MarkdownContent"; import { MarkdownContent } from "../MarkdownContent/MarkdownContent";
import { MessageBubble } from "../MessageBubble/MessageBubble";
import { NoResultsMessage } from "../NoResultsMessage/NoResultsMessage"; import { NoResultsMessage } from "../NoResultsMessage/NoResultsMessage";
import { ToolCallMessage } from "../ToolCallMessage/ToolCallMessage"; import { ToolCallMessage } from "../ToolCallMessage/ToolCallMessage";
import { ToolResponseMessage } from "../ToolResponseMessage/ToolResponseMessage"; import { ToolResponseMessage } from "../ToolResponseMessage/ToolResponseMessage";
import { UserChatBubble } from "../UserChatBubble/UserChatBubble";
import { useChatMessage, type ChatMessageData } from "./useChatMessage"; import { useChatMessage, type ChatMessageData } from "./useChatMessage";
function stripInternalReasoning(content: string): string {
const cleaned = content.replace(
/<internal_reasoning>[\s\S]*?<\/internal_reasoning>/gi,
"",
);
return cleaned.replace(/\n{3,}/g, "\n\n").trim();
}
function getDisplayContent(message: ChatMessageData, isUser: boolean): string {
if (message.type !== "message") return "";
if (isUser) return message.content;
return stripInternalReasoning(message.content);
}
export interface ChatMessageProps { export interface ChatMessageProps {
message: ChatMessageData; message: ChatMessageData;
messages?: ChatMessageData[];
index?: number;
isStreaming?: boolean;
className?: string; className?: string;
onDismissLogin?: () => void; onDismissLogin?: () => void;
onDismissCredentials?: () => void; onDismissCredentials?: () => void;
onSendMessage?: (content: string, isUserMessage?: boolean) => void; onSendMessage?: (content: string, isUserMessage?: boolean) => void;
agentOutput?: ChatMessageData; agentOutput?: ChatMessageData;
isFinalMessage?: boolean;
} }
export function ChatMessage({ export function ChatMessage({
message, message,
messages = [],
index = -1,
isStreaming = false,
className, className,
onDismissCredentials, onDismissCredentials,
onSendMessage, onSendMessage,
agentOutput, agentOutput,
isFinalMessage = true,
}: ChatMessageProps) { }: ChatMessageProps) {
const { user } = useSupabase(); const { user } = useSupabase();
const router = useRouter(); const router = useRouter();
@@ -71,7 +54,14 @@ export function ChatMessage({
isLoginNeeded, isLoginNeeded,
isCredentialsNeeded, isCredentialsNeeded,
} = useChatMessage(message); } = useChatMessage(message);
const displayContent = getDisplayContent(message, isUser);
const { data: profile } = useGetV2GetUserProfile({
query: {
select: (res) => (res.status === 200 ? res.data : null),
enabled: isUser && !!user,
queryKey: ["/api/store/profile", user?.id],
},
});
const handleAllCredentialsComplete = useCallback( const handleAllCredentialsComplete = useCallback(
function handleAllCredentialsComplete() { function handleAllCredentialsComplete() {
@@ -97,25 +87,17 @@ export function ChatMessage({
} }
} }
const handleCopy = useCallback( const handleCopy = useCallback(async () => {
async function handleCopy() { if (message.type !== "message") return;
if (message.type !== "message") return;
if (!displayContent) return;
try { try {
await navigator.clipboard.writeText(displayContent); await navigator.clipboard.writeText(message.content);
setCopied(true); setCopied(true);
setTimeout(() => setCopied(false), 2000); setTimeout(() => setCopied(false), 2000);
} catch (error) { } catch (error) {
console.error("Failed to copy:", error); console.error("Failed to copy:", error);
} }
}, }, [message]);
[displayContent, message],
);
function isLongResponse(content: string): boolean {
return content.split("\n").length > 5;
}
const handleTryAgain = useCallback(() => { const handleTryAgain = useCallback(() => {
if (message.type !== "message" || !onSendMessage) return; if (message.type !== "message" || !onSendMessage) return;
@@ -187,45 +169,9 @@ export function ChatMessage({
// Render tool call messages // Render tool call messages
if (isToolCall && message.type === "tool_call") { if (isToolCall && message.type === "tool_call") {
// Check if this tool call is currently streaming
// A tool call is streaming if:
// 1. isStreaming is true
// 2. This is the last tool_call message
// 3. There's no tool_response for this tool call yet
const isToolCallStreaming =
isStreaming &&
index >= 0 &&
(() => {
// Find the last tool_call index
let lastToolCallIndex = -1;
for (let i = messages.length - 1; i >= 0; i--) {
if (messages[i].type === "tool_call") {
lastToolCallIndex = i;
break;
}
}
// Check if this is the last tool_call and there's no response yet
if (index === lastToolCallIndex) {
// Check if there's a tool_response for this tool call
const hasResponse = messages
.slice(index + 1)
.some(
(msg) =>
msg.type === "tool_response" && msg.toolId === message.toolId,
);
return !hasResponse;
}
return false;
})();
return ( return (
<div className={cn("px-4 py-2", className)}> <div className={cn("px-4 py-2", className)}>
<ToolCallMessage <ToolCallMessage toolName={message.toolName} />
toolId={message.toolId}
toolName={message.toolName}
arguments={message.arguments}
isStreaming={isToolCallStreaming}
/>
</div> </div>
); );
} }
@@ -272,11 +218,27 @@ export function ChatMessage({
// Render tool response messages (but skip agent_output if it's being rendered inside assistant message) // Render tool response messages (but skip agent_output if it's being rendered inside assistant message)
if (isToolResponse && message.type === "tool_response") { if (isToolResponse && message.type === "tool_response") {
// Check if this is an agent_output that should be rendered inside assistant message
if (message.result) {
let parsedResult: Record<string, unknown> | null = null;
try {
parsedResult =
typeof message.result === "string"
? JSON.parse(message.result)
: (message.result as Record<string, unknown>);
} catch {
parsedResult = null;
}
if (parsedResult?.type === "agent_output") {
// Skip rendering - this will be rendered inside the assistant message
return null;
}
}
return ( return (
<div className={cn("px-4 py-2", className)}> <div className={cn("px-4 py-2", className)}>
<ToolResponseMessage <ToolResponseMessage
toolId={message.toolId} toolName={getToolActionPhrase(message.toolName)}
toolName={message.toolName}
result={message.result} result={message.result}
/> />
</div> </div>
@@ -294,33 +256,40 @@ export function ChatMessage({
)} )}
> >
<div className="flex w-full max-w-3xl gap-3"> <div className="flex w-full max-w-3xl gap-3">
{!isUser && (
<div className="flex-shrink-0">
<div className="flex h-7 w-7 items-center justify-center rounded-lg bg-indigo-500">
<RobotIcon className="h-4 w-4 text-indigo-50" />
</div>
</div>
)}
<div <div
className={cn( className={cn(
"flex min-w-0 flex-1 flex-col", "flex min-w-0 flex-1 flex-col",
isUser && "items-end", isUser && "items-end",
)} )}
> >
{isUser ? ( <MessageBubble variant={isUser ? "user" : "assistant"}>
<UserChatBubble> <MarkdownContent content={message.content} />
<MarkdownContent content={displayContent} /> {agentOutput &&
</UserChatBubble> agentOutput.type === "tool_response" &&
) : ( !isUser && (
<AIChatBubble>
<MarkdownContent content={displayContent} />
{agentOutput && agentOutput.type === "tool_response" && (
<div className="mt-4"> <div className="mt-4">
<ToolResponseMessage <ToolResponseMessage
toolId={agentOutput.toolId} toolName={
toolName={agentOutput.toolName || "Agent Output"} agentOutput.toolName
? getToolActionPhrase(agentOutput.toolName)
: "Agent Output"
}
result={agentOutput.result} result={agentOutput.result}
/> />
</div> </div>
)} )}
</AIChatBubble> </MessageBubble>
)}
<div <div
className={cn( className={cn(
"flex gap-0", "mt-1 flex gap-1",
isUser ? "justify-end" : "justify-start", isUser ? "justify-end" : "justify-start",
)} )}
> >
@@ -331,25 +300,37 @@ export function ChatMessage({
onClick={handleTryAgain} onClick={handleTryAgain}
aria-label="Try again" aria-label="Try again"
> >
<ArrowsClockwiseIcon className="size-4 text-zinc-600" /> <ArrowClockwise className="size-3 text-neutral-500" />
</Button>
)}
{!isUser && isFinalMessage && isLongResponse(displayContent) && (
<Button
variant="ghost"
size="icon"
onClick={handleCopy}
aria-label="Copy message"
>
{copied ? (
<CheckIcon className="size-4 text-green-600" />
) : (
<CopyIcon className="size-4 text-zinc-600" />
)}
</Button> </Button>
)} )}
<Button
variant="ghost"
size="icon"
onClick={handleCopy}
aria-label="Copy message"
>
{copied ? (
<CheckIcon className="size-3 text-green-600" />
) : (
<CopyIcon className="size-3 text-neutral-500" />
)}
</Button>
</div> </div>
</div> </div>
{isUser && (
<div className="flex-shrink-0">
<Avatar className="h-7 w-7">
<AvatarImage
src={profile?.avatar_url ?? ""}
alt={profile?.username ?? "User"}
/>
<AvatarFallback className="rounded-lg bg-neutral-200 text-neutral-600">
{profile?.username?.charAt(0)?.toUpperCase() || "U"}
</AvatarFallback>
</Avatar>
</div>
)}
</div> </div>
</div> </div>
); );

View File

@@ -13,9 +13,10 @@ export function MessageBubble({
className, className,
}: MessageBubbleProps) { }: MessageBubbleProps) {
const userTheme = { const userTheme = {
bg: "bg-purple-100", bg: "bg-slate-900",
border: "border-purple-100", border: "border-slate-800",
text: "text-slate-900", gradient: "from-slate-900/30 via-slate-800/20 to-transparent",
text: "text-slate-50",
}; };
const assistantTheme = { const assistantTheme = {
@@ -39,7 +40,9 @@ export function MessageBubble({
)} )}
> >
{/* Gradient flare background */} {/* Gradient flare background */}
<div className={cn("absolute inset-0 bg-gradient-to-br")} /> <div
className={cn("absolute inset-0 bg-gradient-to-br", theme.gradient)}
/>
<div <div
className={cn( className={cn(
"relative z-10 transition-all duration-500 ease-in-out", "relative z-10 transition-all duration-500 ease-in-out",

View File

@@ -0,0 +1,121 @@
"use client";
import { cn } from "@/lib/utils";
import { ChatMessage } from "../ChatMessage/ChatMessage";
import type { ChatMessageData } from "../ChatMessage/useChatMessage";
import { StreamingMessage } from "../StreamingMessage/StreamingMessage";
import { ThinkingMessage } from "../ThinkingMessage/ThinkingMessage";
import { useMessageList } from "./useMessageList";
export interface MessageListProps {
messages: ChatMessageData[];
streamingChunks?: string[];
isStreaming?: boolean;
className?: string;
onStreamComplete?: () => void;
onSendMessage?: (content: string) => void;
}
export function MessageList({
messages,
streamingChunks = [],
isStreaming = false,
className,
onStreamComplete,
onSendMessage,
}: MessageListProps) {
const { messagesEndRef, messagesContainerRef } = useMessageList({
messageCount: messages.length,
isStreaming,
});
return (
<div
ref={messagesContainerRef}
className={cn(
"flex-1 overflow-y-auto",
"scrollbar-thin scrollbar-track-transparent scrollbar-thumb-zinc-300",
className,
)}
>
<div className="mx-auto flex max-w-3xl flex-col py-4">
{/* Render all persisted messages */}
{messages.map((message, index) => {
// Check if current message is an agent_output tool_response
// and if previous message is an assistant message
let agentOutput: ChatMessageData | undefined;
if (message.type === "tool_response" && message.result) {
let parsedResult: Record<string, unknown> | null = null;
try {
parsedResult =
typeof message.result === "string"
? JSON.parse(message.result)
: (message.result as Record<string, unknown>);
} catch {
parsedResult = null;
}
if (parsedResult?.type === "agent_output") {
const prevMessage = messages[index - 1];
if (
prevMessage &&
prevMessage.type === "message" &&
prevMessage.role === "assistant"
) {
// This agent output will be rendered inside the previous assistant message
// Skip rendering this message separately
return null;
}
}
}
// Check if next message is an agent_output tool_response to include in current assistant message
if (message.type === "message" && message.role === "assistant") {
const nextMessage = messages[index + 1];
if (
nextMessage &&
nextMessage.type === "tool_response" &&
nextMessage.result
) {
let parsedResult: Record<string, unknown> | null = null;
try {
parsedResult =
typeof nextMessage.result === "string"
? JSON.parse(nextMessage.result)
: (nextMessage.result as Record<string, unknown>);
} catch {
parsedResult = null;
}
if (parsedResult?.type === "agent_output") {
agentOutput = nextMessage;
}
}
}
return (
<ChatMessage
key={index}
message={message}
onSendMessage={onSendMessage}
agentOutput={agentOutput}
/>
);
})}
{/* Render thinking message when streaming but no chunks yet */}
{isStreaming && streamingChunks.length === 0 && <ThinkingMessage />}
{/* Render streaming message if active */}
{isStreaming && streamingChunks.length > 0 && (
<StreamingMessage
chunks={streamingChunks}
onComplete={onStreamComplete}
/>
)}
{/* Invisible div to scroll to */}
<div ref={messagesEndRef} />
</div>
</div>
);
}
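
The map above pairs an `agent_output` tool response with the assistant message directly before it, so the output renders inside that bubble instead of as a separate row. The parse-and-check step appears twice; a sketch of how it could be factored out (the helper name is hypothetical, not part of the codebase):

// Hypothetical helper: true when a tool_response carries an agent_output payload.
import type { ChatMessageData } from "../ChatMessage/useChatMessage";

function isAgentOutputResponse(message: ChatMessageData): boolean {
  if (message.type !== "tool_response" || !message.result) return false;
  try {
    const parsed =
      typeof message.result === "string"
        ? JSON.parse(message.result)
        : (message.result as Record<string, unknown>);
    return parsed?.type === "agent_output";
  } catch {
    return false;
  }
}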

View File

@@ -81,9 +81,9 @@ export function SessionsDrawer({
</Text> </Text>
</div> </div>
) : sessions.length === 0 ? ( ) : sessions.length === 0 ? (
<div className="flex h-full items-center justify-center"> <div className="flex items-center justify-center py-8">
<Text variant="body" className="text-zinc-500"> <Text variant="body" className="text-zinc-500">
You don&apos;t have previously started chats No sessions found
</Text> </Text>
</div> </div>
) : ( ) : (

View File

@@ -1,6 +1,7 @@
import { cn } from "@/lib/utils"; import { cn } from "@/lib/utils";
import { AIChatBubble } from "../AIChatBubble/AIChatBubble"; import { RobotIcon } from "@phosphor-icons/react";
import { MarkdownContent } from "../MarkdownContent/MarkdownContent"; import { MarkdownContent } from "../MarkdownContent/MarkdownContent";
import { MessageBubble } from "../MessageBubble/MessageBubble";
import { useStreamingMessage } from "./useStreamingMessage"; import { useStreamingMessage } from "./useStreamingMessage";
export interface StreamingMessageProps { export interface StreamingMessageProps {
@@ -24,10 +25,16 @@ export function StreamingMessage({
)} )}
> >
<div className="flex w-full max-w-3xl gap-3"> <div className="flex w-full max-w-3xl gap-3">
<div className="flex-shrink-0">
<div className="flex h-7 w-7 items-center justify-center rounded-lg bg-indigo-600">
<RobotIcon className="h-4 w-4 text-indigo-50" />
</div>
</div>
<div className="flex min-w-0 flex-1 flex-col"> <div className="flex min-w-0 flex-1 flex-col">
<AIChatBubble> <MessageBubble variant="assistant">
<MarkdownContent content={displayText} /> <MarkdownContent content={displayText} />
</AIChatBubble> </MessageBubble>
</div> </div>
</div> </div>
</div> </div>

View File

@@ -1,7 +1,7 @@
import { cn } from "@/lib/utils"; import { cn } from "@/lib/utils";
import { RobotIcon } from "@phosphor-icons/react";
import { useEffect, useRef, useState } from "react"; import { useEffect, useRef, useState } from "react";
import { AIChatBubble } from "../AIChatBubble/AIChatBubble"; import { MessageBubble } from "../MessageBubble/MessageBubble";
import { ChatLoader } from "../ChatLoader/ChatLoader";
export interface ThinkingMessageProps { export interface ThinkingMessageProps {
className?: string; className?: string;
@@ -34,11 +34,22 @@ export function ThinkingMessage({ className }: ThinkingMessageProps) {
)} )}
> >
<div className="flex w-full max-w-3xl gap-3"> <div className="flex w-full max-w-3xl gap-3">
<div className="flex-shrink-0">
<div className="flex h-7 w-7 items-center justify-center rounded-lg bg-indigo-500">
<RobotIcon className="h-4 w-4 text-indigo-50" />
</div>
</div>
<div className="flex min-w-0 flex-1 flex-col"> <div className="flex min-w-0 flex-1 flex-col">
<AIChatBubble> <MessageBubble variant="assistant">
<div className="transition-all duration-500 ease-in-out"> <div className="transition-all duration-500 ease-in-out">
{showSlowLoader ? ( {showSlowLoader ? (
<ChatLoader /> <div className="flex flex-col items-center gap-3 py-2">
<div className="loader" style={{ flexShrink: 0 }} />
<p className="text-sm text-slate-700">
Taking a bit longer to think, wait a moment please
</p>
</div>
) : ( ) : (
<span <span
className="inline-block bg-gradient-to-r from-neutral-400 via-neutral-600 to-neutral-400 bg-clip-text text-transparent" className="inline-block bg-gradient-to-r from-neutral-400 via-neutral-600 to-neutral-400 bg-clip-text text-transparent"
@@ -51,7 +62,7 @@ export function ThinkingMessage({ className }: ThinkingMessageProps) {
</span> </span>
)} )}
</div> </div>
</AIChatBubble> </MessageBubble>
</div> </div>
</div> </div>
</div> </div>

View File

@@ -0,0 +1,24 @@
import { Text } from "@/components/atoms/Text/Text";
import { cn } from "@/lib/utils";
import { WrenchIcon } from "@phosphor-icons/react";
import { getToolActionPhrase } from "../../helpers";
export interface ToolCallMessageProps {
toolName: string;
className?: string;
}
export function ToolCallMessage({ toolName, className }: ToolCallMessageProps) {
return (
<div className={cn("flex items-center justify-center gap-2", className)}>
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{getToolActionPhrase(toolName)}...
</Text>
</div>
);
}

View File

@@ -0,0 +1,260 @@
import { Text } from "@/components/atoms/Text/Text";
import "@/components/contextual/OutputRenderers";
import {
globalRegistry,
OutputItem,
} from "@/components/contextual/OutputRenderers";
import { cn } from "@/lib/utils";
import type { ToolResult } from "@/types/chat";
import { WrenchIcon } from "@phosphor-icons/react";
import { getToolActionPhrase } from "../../helpers";
export interface ToolResponseMessageProps {
toolName: string;
result?: ToolResult;
success?: boolean;
className?: string;
}
export function ToolResponseMessage({
toolName,
result,
success: _success = true,
className,
}: ToolResponseMessageProps) {
if (!result) {
return (
<div className={cn("flex items-center justify-center gap-2", className)}>
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{getToolActionPhrase(toolName)}...
</Text>
</div>
);
}
let parsedResult: Record<string, unknown> | null = null;
try {
parsedResult =
typeof result === "string"
? JSON.parse(result)
: (result as Record<string, unknown>);
} catch {
parsedResult = null;
}
if (parsedResult && typeof parsedResult === "object") {
const responseType = parsedResult.type as string | undefined;
if (responseType === "agent_output") {
const execution = parsedResult.execution as
| {
outputs?: Record<string, unknown[]>;
}
| null
| undefined;
const outputs = execution?.outputs || {};
const message = parsedResult.message as string | undefined;
return (
<div className={cn("space-y-4 px-4 py-2", className)}>
<div className="flex items-center gap-2">
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{getToolActionPhrase(toolName)}
</Text>
</div>
{message && (
<div className="rounded border p-4">
<Text variant="small" className="text-neutral-600">
{message}
</Text>
</div>
)}
{Object.keys(outputs).length > 0 && (
<div className="space-y-4">
{Object.entries(outputs).map(([outputName, values]) =>
values.map((value, index) => {
const renderer = globalRegistry.getRenderer(value);
if (renderer) {
return (
<OutputItem
key={`${outputName}-${index}`}
value={value}
renderer={renderer}
label={outputName}
/>
);
}
return (
<div
key={`${outputName}-${index}`}
className="rounded border p-4"
>
<Text variant="large-medium" className="mb-2 capitalize">
{outputName}
</Text>
<pre className="overflow-auto text-sm">
{JSON.stringify(value, null, 2)}
</pre>
</div>
);
}),
)}
</div>
)}
</div>
);
}
if (responseType === "block_output" && parsedResult.outputs) {
const outputs = parsedResult.outputs as Record<string, unknown[]>;
return (
<div className={cn("space-y-4 px-4 py-2", className)}>
<div className="flex items-center gap-2">
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{getToolActionPhrase(toolName)}
</Text>
</div>
<div className="space-y-4">
{Object.entries(outputs).map(([outputName, values]) =>
values.map((value, index) => {
const renderer = globalRegistry.getRenderer(value);
if (renderer) {
return (
<OutputItem
key={`${outputName}-${index}`}
value={value}
renderer={renderer}
label={outputName}
/>
);
}
return (
<div
key={`${outputName}-${index}`}
className="rounded border p-4"
>
<Text variant="large-medium" className="mb-2 capitalize">
{outputName}
</Text>
<pre className="overflow-auto text-sm">
{JSON.stringify(value, null, 2)}
</pre>
</div>
);
}),
)}
</div>
</div>
);
}
// Handle other response types with a message field (e.g., understanding_updated)
if (parsedResult.message && typeof parsedResult.message === "string") {
// Format tool name from snake_case to Title Case
const formattedToolName = toolName
.split("_")
.map((word) => word.charAt(0).toUpperCase() + word.slice(1))
.join(" ");
// Clean up message - remove incomplete user_name references
let cleanedMessage = parsedResult.message;
// Remove "Updated understanding with: user_name" pattern if user_name is just a placeholder
cleanedMessage = cleanedMessage.replace(
/Updated understanding with:\s*user_name\.?\s*/gi,
"",
);
// Remove standalone user_name references
cleanedMessage = cleanedMessage.replace(/\buser_name\b\.?\s*/gi, "");
cleanedMessage = cleanedMessage.trim();
// Only show message if it has content after cleaning
if (!cleanedMessage) {
return (
<div
className={cn(
"flex items-center justify-center gap-2 px-4 py-2",
className,
)}
>
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{formattedToolName}
</Text>
</div>
);
}
return (
<div className={cn("space-y-2 px-4 py-2", className)}>
<div className="flex items-center justify-center gap-2">
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{formattedToolName}
</Text>
</div>
<div className="rounded border p-4">
<Text variant="small" className="text-neutral-600">
{cleanedMessage}
</Text>
</div>
</div>
);
}
}
const renderer = globalRegistry.getRenderer(result);
if (renderer) {
return (
<div className={cn("px-4 py-2", className)}>
<div className="mb-2 flex items-center gap-2">
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{getToolActionPhrase(toolName)}
</Text>
</div>
<OutputItem value={result} renderer={renderer} />
</div>
);
}
return (
<div className={cn("flex items-center justify-center gap-2", className)}>
<WrenchIcon
size={14}
weight="bold"
className="flex-shrink-0 text-neutral-500"
/>
<Text variant="small" className="text-neutral-500">
{getToolActionPhrase(toolName)}...
</Text>
</div>
);
}
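For reference, a minimal sketch of the result shapes this component branches on. The field names mirror the checks above; the concrete payload values are illustrative, not taken from the backend.

// Hypothetical agent_output payload: rendered as a message box plus one card per output value.
const agentOutputResult = {
  type: "agent_output",
  message: "The agent finished and produced two outputs.",
  execution: {
    outputs: {
      summary: ["Weekly report generated"],
      report_url: ["https://example.com/report.pdf"],
    },
  },
};
// Hypothetical block_output payload: rendered as output cards only, with no message box.
const blockOutputResult = {
  type: "block_output",
  outputs: { result: [42] },
};
// Any other object with a string `message` falls through to the
// "formatted tool name + cleaned message" branch near the end of the component.
const genericResult = {
  type: "understanding_updated",
  message: "Noted your preference for weekly summaries.",
};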

View File

@@ -0,0 +1,66 @@
/**
* Maps internal tool names to user-friendly display names with emojis.
* @deprecated Use getToolActionPhrase or getToolCompletionPhrase for status messages
*
* @param toolName - The internal tool name from the backend
* @returns A user-friendly display name with an emoji prefix
*/
export function getToolDisplayName(toolName: string): string {
const toolDisplayNames: Record<string, string> = {
find_agent: "🔍 Search Marketplace",
get_agent_details: "📋 Get Agent Details",
check_credentials: "🔑 Check Credentials",
setup_agent: "⚙️ Setup Agent",
run_agent: "▶️ Run Agent",
get_required_setup_info: "📝 Get Setup Requirements",
};
return toolDisplayNames[toolName] || toolName;
}
/**
* Maps internal tool names to human-friendly action phrases (present continuous).
* Used for tool call messages to indicate what action is currently happening.
*
* @param toolName - The internal tool name from the backend
* @returns A human-friendly action phrase in present continuous tense
*/
export function getToolActionPhrase(toolName: string): string {
const toolActionPhrases: Record<string, string> = {
find_agent: "Looking for agents in the marketplace",
agent_carousel: "Looking for agents in the marketplace",
get_agent_details: "Learning about the agent",
check_credentials: "Checking your credentials",
setup_agent: "Setting up the agent",
execution_started: "Running the agent",
run_agent: "Running the agent",
get_required_setup_info: "Getting setup requirements",
schedule_agent: "Scheduling the agent to run",
};
  // Return the mapped phrase, or fall back to the raw tool name
  return toolActionPhrases[toolName] || toolName;
}
/**
* Maps internal tool names to human-friendly completion phrases (past tense).
* Used for tool response messages to indicate what action was completed.
*
* @param toolName - The internal tool name from the backend
* @returns A human-friendly completion phrase in past tense
*/
export function getToolCompletionPhrase(toolName: string): string {
const toolCompletionPhrases: Record<string, string> = {
find_agent: "Finished searching the marketplace",
get_agent_details: "Got agent details",
check_credentials: "Checked credentials",
setup_agent: "Agent setup complete",
run_agent: "Agent execution started",
get_required_setup_info: "Got setup requirements",
};
// Return mapped phrase or generate human-friendly fallback
return (
toolCompletionPhrases[toolName] ||
`Finished ${toolName.replace(/_/g, " ").replace("...", "")}`
);
}
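A quick usage sketch of these helpers; the expected values follow from the maps above, and "custom_tool" is a made-up name to show the fallbacks.

getToolDisplayName("find_agent"); // "🔍 Search Marketplace"
getToolActionPhrase("run_agent"); // "Running the agent"
getToolActionPhrase("custom_tool"); // falls back to the raw name: "custom_tool"
getToolCompletionPhrase("run_agent"); // "Agent execution started"
getToolCompletionPhrase("custom_tool"); // "Finished custom tool"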

View File

@@ -1,20 +1,17 @@
"use client"; "use client";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase"; import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { useEffect, useRef, useState } from "react"; import { useEffect, useRef } from "react";
import { toast } from "sonner"; import { toast } from "sonner";
import { useChatSession } from "./useChatSession"; import { useChatSession } from "./useChatSession";
import { useChatStream } from "./useChatStream"; import { useChatStream } from "./useChatStream";
interface UseChatArgs { export function useChat() {
urlSessionId?: string | null; const hasCreatedSessionRef = useRef(false);
}
export function useChat({ urlSessionId }: UseChatArgs = {}) {
const hasClaimedSessionRef = useRef(false); const hasClaimedSessionRef = useRef(false);
const { user } = useSupabase(); const { user } = useSupabase();
const { sendMessage: sendStreamMessage } = useChatStream(); const { sendMessage: sendStreamMessage } = useChatStream();
const [showLoader, setShowLoader] = useState(false);
const { const {
session, session,
sessionId: sessionIdFromHook, sessionId: sessionIdFromHook,
@@ -22,16 +19,27 @@ export function useChat({ urlSessionId }: UseChatArgs = {}) {
isLoading, isLoading,
isCreating, isCreating,
error, error,
isSessionNotFound,
createSession, createSession,
claimSession, claimSession,
clearSession: clearSessionBase, clearSession: clearSessionBase,
loadSession, loadSession,
} = useChatSession({ } = useChatSession({
urlSessionId, urlSessionId: null,
autoCreate: false, autoCreate: false,
}); });
useEffect(
function autoCreateSession() {
if (!hasCreatedSessionRef.current && !isCreating && !sessionIdFromHook) {
hasCreatedSessionRef.current = true;
createSession().catch((_err) => {
hasCreatedSessionRef.current = false;
});
}
},
[isCreating, sessionIdFromHook, createSession],
);
useEffect( useEffect(
function autoClaimSession() { function autoClaimSession() {
if ( if (
@@ -67,17 +75,6 @@ export function useChat({ urlSessionId }: UseChatArgs = {}) {
], ],
); );
useEffect(() => {
if (isLoading || isCreating) {
const timer = setTimeout(() => {
setShowLoader(true);
}, 300);
return () => clearTimeout(timer);
} else {
setShowLoader(false);
}
}, [isLoading, isCreating]);
useEffect(function monitorNetworkStatus() { useEffect(function monitorNetworkStatus() {
function handleOnline() { function handleOnline() {
toast.success("Connection restored", { toast.success("Connection restored", {
@@ -102,6 +99,7 @@ export function useChat({ urlSessionId }: UseChatArgs = {}) {
function clearSession() { function clearSession() {
clearSessionBase(); clearSessionBase();
hasCreatedSessionRef.current = false;
hasClaimedSessionRef.current = false; hasClaimedSessionRef.current = false;
} }
@@ -111,11 +109,9 @@ export function useChat({ urlSessionId }: UseChatArgs = {}) {
isLoading, isLoading,
isCreating, isCreating,
error, error,
isSessionNotFound,
createSession, createSession,
clearSession, clearSession,
loadSession, loadSession,
sessionId: sessionIdFromHook, sessionId: sessionIdFromHook,
showLoader,
}; };
} }

View File

@@ -0,0 +1,271 @@
import {
getGetV2GetSessionQueryKey,
getGetV2GetSessionQueryOptions,
postV2CreateSession,
useGetV2GetSession,
usePatchV2SessionAssignUser,
usePostV2CreateSession,
} from "@/app/api/__generated__/endpoints/chat/chat";
import type { SessionDetailResponse } from "@/app/api/__generated__/models/sessionDetailResponse";
import { okData } from "@/app/api/helpers";
import { isValidUUID } from "@/lib/utils";
import { Key, storage } from "@/services/storage/local-storage";
import { useQueryClient } from "@tanstack/react-query";
import { useCallback, useEffect, useMemo, useRef, useState } from "react";
import { toast } from "sonner";
interface UseChatSessionArgs {
urlSessionId?: string | null;
autoCreate?: boolean;
}
export function useChatSession({
urlSessionId,
autoCreate = false,
}: UseChatSessionArgs = {}) {
const queryClient = useQueryClient();
const [sessionId, setSessionId] = useState<string | null>(null);
const [error, setError] = useState<Error | null>(null);
const justCreatedSessionIdRef = useRef<string | null>(null);
useEffect(() => {
if (urlSessionId) {
if (!isValidUUID(urlSessionId)) {
console.error("Invalid session ID format:", urlSessionId);
toast.error("Invalid session ID", {
description:
"The session ID in the URL is not valid. Starting a new session...",
});
setSessionId(null);
storage.clean(Key.CHAT_SESSION_ID);
return;
}
setSessionId(urlSessionId);
storage.set(Key.CHAT_SESSION_ID, urlSessionId);
} else {
const storedSessionId = storage.get(Key.CHAT_SESSION_ID);
if (storedSessionId) {
if (!isValidUUID(storedSessionId)) {
console.error("Invalid stored session ID:", storedSessionId);
storage.clean(Key.CHAT_SESSION_ID);
setSessionId(null);
} else {
setSessionId(storedSessionId);
}
} else if (autoCreate) {
setSessionId(null);
}
}
}, [urlSessionId, autoCreate]);
const {
mutateAsync: createSessionMutation,
isPending: isCreating,
error: createError,
} = usePostV2CreateSession();
const {
data: sessionData,
isLoading: isLoadingSession,
error: loadError,
refetch,
} = useGetV2GetSession(sessionId || "", {
query: {
enabled: !!sessionId,
select: okData,
staleTime: Infinity, // Never mark as stale
refetchOnMount: false, // Don't refetch on component mount
refetchOnWindowFocus: false, // Don't refetch when window regains focus
refetchOnReconnect: false, // Don't refetch when network reconnects
retry: 1,
},
});
const { mutateAsync: claimSessionMutation } = usePatchV2SessionAssignUser();
const session = useMemo(() => {
if (sessionData) return sessionData;
if (sessionId && justCreatedSessionIdRef.current === sessionId) {
return {
id: sessionId,
user_id: null,
messages: [],
created_at: new Date().toISOString(),
updated_at: new Date().toISOString(),
} as SessionDetailResponse;
}
return null;
}, [sessionData, sessionId]);
const messages = session?.messages || [];
const isLoading = isCreating || isLoadingSession;
useEffect(() => {
if (createError) {
setError(
createError instanceof Error
? createError
: new Error("Failed to create session"),
);
} else if (loadError) {
setError(
loadError instanceof Error
? loadError
: new Error("Failed to load session"),
);
} else {
setError(null);
}
}, [createError, loadError]);
const createSession = useCallback(
async function createSession() {
try {
setError(null);
const response = await postV2CreateSession({
body: JSON.stringify({}),
});
if (response.status !== 200) {
throw new Error("Failed to create session");
}
const newSessionId = response.data.id;
setSessionId(newSessionId);
storage.set(Key.CHAT_SESSION_ID, newSessionId);
justCreatedSessionIdRef.current = newSessionId;
setTimeout(() => {
if (justCreatedSessionIdRef.current === newSessionId) {
justCreatedSessionIdRef.current = null;
}
}, 10000);
return newSessionId;
} catch (err) {
const error =
err instanceof Error ? err : new Error("Failed to create session");
setError(error);
toast.error("Failed to create chat session", {
description: error.message,
});
throw error;
}
},
[createSessionMutation],
);
const loadSession = useCallback(
async function loadSession(id: string) {
try {
setError(null);
// Invalidate the query cache for this session to force a fresh fetch
await queryClient.invalidateQueries({
queryKey: getGetV2GetSessionQueryKey(id),
});
// Set sessionId after invalidation to ensure the hook refetches
setSessionId(id);
storage.set(Key.CHAT_SESSION_ID, id);
// Force fetch with fresh data (bypass cache)
const queryOptions = getGetV2GetSessionQueryOptions(id, {
query: {
staleTime: 0, // Force fresh fetch
retry: 1,
},
});
const result = await queryClient.fetchQuery(queryOptions);
if (!result || ("status" in result && result.status !== 200)) {
console.warn("Session not found on server, clearing local state");
storage.clean(Key.CHAT_SESSION_ID);
setSessionId(null);
throw new Error("Session not found");
}
} catch (err) {
const error =
err instanceof Error ? err : new Error("Failed to load session");
setError(error);
throw error;
}
},
[queryClient],
);
const refreshSession = useCallback(
async function refreshSession() {
if (!sessionId) {
console.log("[refreshSession] Skipping - no session ID");
return;
}
try {
setError(null);
await refetch();
} catch (err) {
const error =
err instanceof Error ? err : new Error("Failed to refresh session");
setError(error);
throw error;
}
},
[sessionId, refetch],
);
const claimSession = useCallback(
async function claimSession(id: string) {
try {
setError(null);
await claimSessionMutation({ sessionId: id });
if (justCreatedSessionIdRef.current === id) {
justCreatedSessionIdRef.current = null;
}
await queryClient.invalidateQueries({
queryKey: getGetV2GetSessionQueryKey(id),
});
await refetch();
toast.success("Session claimed successfully", {
description: "Your chat history has been saved to your account",
});
} catch (err: unknown) {
const error =
err instanceof Error ? err : new Error("Failed to claim session");
const is404 =
(typeof err === "object" &&
err !== null &&
"status" in err &&
err.status === 404) ||
(typeof err === "object" &&
err !== null &&
"response" in err &&
typeof err.response === "object" &&
err.response !== null &&
"status" in err.response &&
err.response.status === 404);
if (!is404) {
setError(error);
toast.error("Failed to claim session", {
description: error.message || "Unable to claim session",
});
}
throw error;
}
},
[claimSessionMutation, queryClient, refetch],
);
const clearSession = useCallback(function clearSession() {
setSessionId(null);
setError(null);
storage.clean(Key.CHAT_SESSION_ID);
justCreatedSessionIdRef.current = null;
}, []);
return {
session,
sessionId,
messages,
isLoading,
isCreating,
error,
createSession,
loadSession,
refreshSession,
claimSession,
clearSession,
};
}
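A minimal consumer sketch for this hook; the component and handler names are illustrative and assume useChatSession is imported from this file.

function ChatSessionExample() {
  const { session, messages, isCreating, createSession, clearSession } =
    useChatSession({ autoCreate: false });
  async function handleNewChat() {
    clearSession();
    // Persists the new ID to local storage and seeds an optimistic empty session
    const newSessionId = await createSession();
    console.log("Created session", newSessionId);
  }
  if (!session) {
    return <button onClick={handleNewChat}>Start chat</button>;
  }
  return <div>{isCreating ? "Creating…" : `${messages.length} messages`}</div>;
}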

View File

@@ -21,8 +21,6 @@ export interface StreamChunk {
timestamp?: string; timestamp?: string;
content?: string; content?: string;
message?: string; message?: string;
code?: string;
details?: Record<string, unknown>;
tool_id?: string; tool_id?: string;
tool_name?: string; tool_name?: string;
arguments?: ToolArguments; arguments?: ToolArguments;
@@ -140,18 +138,8 @@ function normalizeStreamChunk(
return { type: "stream_end" }; return { type: "stream_end" };
case "start": case "start":
case "text-start": case "text-start":
return null;
case "tool-input-start": case "tool-input-start":
const toolInputStart = chunk as Extract< return null;
VercelStreamChunk,
{ type: "tool-input-start" }
>;
return {
type: "tool_call_start",
tool_id: toolInputStart.toolCallId,
tool_name: toolInputStart.toolName,
arguments: {},
};
} }
} }
@@ -161,103 +149,22 @@ export function useChatStream() {
const retryCountRef = useRef<number>(0); const retryCountRef = useRef<number>(0);
const retryTimeoutRef = useRef<NodeJS.Timeout | null>(null); const retryTimeoutRef = useRef<NodeJS.Timeout | null>(null);
const abortControllerRef = useRef<AbortController | null>(null); const abortControllerRef = useRef<AbortController | null>(null);
const currentSessionIdRef = useRef<string | null>(null);
const requestStartTimeRef = useRef<number | null>(null);
const stopStreaming = useCallback( const stopStreaming = useCallback(() => {
(sessionId?: string, force: boolean = false) => { if (abortControllerRef.current) {
console.log("[useChatStream] stopStreaming called", { abortControllerRef.current.abort();
hasAbortController: !!abortControllerRef.current, abortControllerRef.current = null;
isAborted: abortControllerRef.current?.signal.aborted, }
currentSessionId: currentSessionIdRef.current, if (retryTimeoutRef.current) {
requestedSessionId: sessionId, clearTimeout(retryTimeoutRef.current);
requestStartTime: requestStartTimeRef.current, retryTimeoutRef.current = null;
timeSinceStart: requestStartTimeRef.current }
? Date.now() - requestStartTimeRef.current setIsStreaming(false);
: null, }, []);
force,
stack: new Error().stack,
});
if (
sessionId &&
currentSessionIdRef.current &&
currentSessionIdRef.current !== sessionId
) {
console.log(
"[useChatStream] Session changed, aborting previous stream",
{
oldSessionId: currentSessionIdRef.current,
newSessionId: sessionId,
},
);
}
const controller = abortControllerRef.current;
if (controller) {
const timeSinceStart = requestStartTimeRef.current
? Date.now() - requestStartTimeRef.current
: null;
if (!force && timeSinceStart !== null && timeSinceStart < 100) {
console.log(
"[useChatStream] Request just started (<100ms), skipping abort to prevent race condition",
{
timeSinceStart,
},
);
return;
}
try {
const signal = controller.signal;
if (
signal &&
typeof signal.aborted === "boolean" &&
!signal.aborted
) {
console.log("[useChatStream] Aborting stream");
controller.abort();
} else {
console.log(
"[useChatStream] Stream already aborted or signal invalid",
);
}
} catch (error) {
if (error instanceof Error && error.name === "AbortError") {
console.log(
"[useChatStream] AbortError caught (expected during cleanup)",
);
} else {
console.warn("[useChatStream] Error aborting stream:", error);
}
} finally {
abortControllerRef.current = null;
requestStartTimeRef.current = null;
}
}
if (retryTimeoutRef.current) {
clearTimeout(retryTimeoutRef.current);
retryTimeoutRef.current = null;
}
setIsStreaming(false);
},
[],
);
useEffect(() => { useEffect(() => {
console.log("[useChatStream] Component mounted");
return () => { return () => {
const sessionIdAtUnmount = currentSessionIdRef.current; stopStreaming();
console.log(
"[useChatStream] Component unmounting, calling stopStreaming",
{
sessionIdAtUnmount,
},
);
stopStreaming(undefined, false);
currentSessionIdRef.current = null;
}; };
}, [stopStreaming]); }, [stopStreaming]);
@@ -270,32 +177,12 @@ export function useChatStream() {
context?: { url: string; content: string }, context?: { url: string; content: string },
isRetry: boolean = false, isRetry: boolean = false,
) => { ) => {
console.log("[useChatStream] sendMessage called", { stopStreaming();
sessionId,
message: message.substring(0, 50),
isUserMessage,
isRetry,
stack: new Error().stack,
});
const previousSessionId = currentSessionIdRef.current;
stopStreaming(sessionId, true);
currentSessionIdRef.current = sessionId;
const abortController = new AbortController(); const abortController = new AbortController();
abortControllerRef.current = abortController; abortControllerRef.current = abortController;
requestStartTimeRef.current = Date.now();
console.log("[useChatStream] Created new AbortController", {
sessionId,
previousSessionId,
requestStartTime: requestStartTimeRef.current,
});
if (abortController.signal.aborted) { if (abortController.signal.aborted) {
console.warn(
"[useChatStream] AbortController was aborted before request started",
);
requestStartTimeRef.current = null;
return Promise.reject(new Error("Request aborted")); return Promise.reject(new Error("Request aborted"));
} }
@@ -323,34 +210,18 @@ export function useChatStream() {
signal: abortController.signal, signal: abortController.signal,
}); });
console.info("[useChatStream] Stream response", {
sessionId,
status: response.status,
ok: response.ok,
contentType: response.headers.get("content-type"),
});
if (!response.ok) { if (!response.ok) {
const errorText = await response.text(); const errorText = await response.text();
console.warn("[useChatStream] Stream response error", {
sessionId,
status: response.status,
errorText,
});
throw new Error(errorText || `HTTP ${response.status}`); throw new Error(errorText || `HTTP ${response.status}`);
} }
if (!response.body) { if (!response.body) {
console.warn("[useChatStream] Response body is null", { sessionId });
throw new Error("Response body is null"); throw new Error("Response body is null");
} }
const reader = response.body.getReader(); const reader = response.body.getReader();
const decoder = new TextDecoder(); const decoder = new TextDecoder();
let buffer = ""; let buffer = "";
let receivedChunkCount = 0;
let firstChunkAt: number | null = null;
let loggedLineCount = 0;
return new Promise<void>((resolve, reject) => { return new Promise<void>((resolve, reject) => {
let didDispatchStreamEnd = false; let didDispatchStreamEnd = false;
@@ -374,13 +245,6 @@ export function useChatStream() {
if (done) { if (done) {
cleanup(); cleanup();
console.info("[useChatStream] Stream closed", {
sessionId,
receivedChunkCount,
timeSinceStart: requestStartTimeRef.current
? Date.now() - requestStartTimeRef.current
: null,
});
dispatchStreamEnd(); dispatchStreamEnd();
retryCountRef.current = 0; retryCountRef.current = 0;
stopStreaming(); stopStreaming();
@@ -395,23 +259,8 @@ export function useChatStream() {
for (const line of lines) { for (const line of lines) {
if (line.startsWith("data: ")) { if (line.startsWith("data: ")) {
const data = line.slice(6); const data = line.slice(6);
if (loggedLineCount < 3) {
console.info("[useChatStream] Raw stream line", {
sessionId,
data:
data.length > 300 ? `${data.slice(0, 300)}...` : data,
});
loggedLineCount += 1;
}
if (data === "[DONE]") { if (data === "[DONE]") {
cleanup(); cleanup();
console.info("[useChatStream] Stream done marker", {
sessionId,
receivedChunkCount,
timeSinceStart: requestStartTimeRef.current
? Date.now() - requestStartTimeRef.current
: null,
});
dispatchStreamEnd(); dispatchStreamEnd();
retryCountRef.current = 0; retryCountRef.current = 0;
stopStreaming(); stopStreaming();
@@ -428,18 +277,6 @@ export function useChatStream() {
continue; continue;
} }
if (!firstChunkAt) {
firstChunkAt = Date.now();
console.info("[useChatStream] First stream chunk", {
sessionId,
chunkType: chunk.type,
timeSinceStart: requestStartTimeRef.current
? firstChunkAt - requestStartTimeRef.current
: null,
});
}
receivedChunkCount += 1;
// Call the chunk handler // Call the chunk handler
onChunk(chunk); onChunk(chunk);
@@ -447,13 +284,6 @@ export function useChatStream() {
if (chunk.type === "stream_end") { if (chunk.type === "stream_end") {
didDispatchStreamEnd = true; didDispatchStreamEnd = true;
cleanup(); cleanup();
console.info("[useChatStream] Stream end chunk", {
sessionId,
receivedChunkCount,
timeSinceStart: requestStartTimeRef.current
? Date.now() - requestStartTimeRef.current
: null,
});
retryCountRef.current = 0; retryCountRef.current = 0;
stopStreaming(); stopStreaming();
resolve(); resolve();
@@ -477,9 +307,6 @@ export function useChatStream() {
} catch (err) { } catch (err) {
if (err instanceof Error && err.name === "AbortError") { if (err instanceof Error && err.name === "AbortError") {
cleanup(); cleanup();
dispatchStreamEnd();
stopStreaming();
resolve();
return; return;
} }
@@ -525,10 +352,6 @@ export function useChatStream() {
readStream(); readStream();
}); });
} catch (err) { } catch (err) {
if (err instanceof Error && err.name === "AbortError") {
setIsStreaming(false);
return Promise.resolve();
}
const streamError = const streamError =
err instanceof Error ? err : new Error("Failed to start stream"); err instanceof Error ? err : new Error("Failed to start stream");
setError(streamError); setError(streamError);

View File

@@ -0,0 +1,27 @@
"use client";
import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
import { useRouter } from "next/navigation";
import { useEffect } from "react";
import { Chat } from "./components/Chat/Chat";
export default function ChatPage() {
const isChatEnabled = useGetFlag(Flag.CHAT);
const router = useRouter();
useEffect(() => {
if (isChatEnabled === false) {
router.push("/marketplace");
}
}, [isChatEnabled, router]);
if (isChatEnabled === null || isChatEnabled === false) {
return null;
}
return (
<div className="flex h-full flex-col">
<Chat className="flex-1" />
</div>
);
}

View File

@@ -1,88 +0,0 @@
"use client";
import { LoadingSpinner } from "@/components/atoms/LoadingSpinner/LoadingSpinner";
import { NAVBAR_HEIGHT_PX } from "@/lib/constants";
import type { ReactNode } from "react";
import { DesktopSidebar } from "./components/DesktopSidebar/DesktopSidebar";
import { LoadingState } from "./components/LoadingState/LoadingState";
import { MobileDrawer } from "./components/MobileDrawer/MobileDrawer";
import { MobileHeader } from "./components/MobileHeader/MobileHeader";
import { useCopilotShell } from "./useCopilotShell";
interface Props {
children: ReactNode;
}
export function CopilotShell({ children }: Props) {
const {
isMobile,
isDrawerOpen,
isLoading,
isLoggedIn,
hasActiveSession,
sessions,
currentSessionId,
handleSelectSession,
handleOpenDrawer,
handleCloseDrawer,
handleDrawerOpenChange,
handleNewChat,
hasNextPage,
isFetchingNextPage,
fetchNextPage,
isReadyToShowContent,
} = useCopilotShell();
if (!isLoggedIn) {
return (
<div className="flex h-full items-center justify-center">
<LoadingSpinner size="large" />
</div>
);
}
return (
<div
className="flex overflow-hidden bg-[#EFEFF0]"
style={{ height: `calc(100vh - ${NAVBAR_HEIGHT_PX}px)` }}
>
{!isMobile && (
<DesktopSidebar
sessions={sessions}
currentSessionId={currentSessionId}
isLoading={isLoading}
hasNextPage={hasNextPage}
isFetchingNextPage={isFetchingNextPage}
onSelectSession={handleSelectSession}
onFetchNextPage={fetchNextPage}
onNewChat={handleNewChat}
hasActiveSession={Boolean(hasActiveSession)}
/>
)}
<div className="relative flex min-h-0 flex-1 flex-col">
{isMobile && <MobileHeader onOpenDrawer={handleOpenDrawer} />}
<div className="flex min-h-0 flex-1 flex-col">
{isReadyToShowContent ? children : <LoadingState />}
</div>
</div>
{isMobile && (
<MobileDrawer
isOpen={isDrawerOpen}
sessions={sessions}
currentSessionId={currentSessionId}
isLoading={isLoading}
hasNextPage={hasNextPage}
isFetchingNextPage={isFetchingNextPage}
onSelectSession={handleSelectSession}
onFetchNextPage={fetchNextPage}
onNewChat={handleNewChat}
onClose={handleCloseDrawer}
onOpenChange={handleDrawerOpenChange}
hasActiveSession={Boolean(hasActiveSession)}
/>
)}
</div>
);
}

View File

@@ -1,70 +0,0 @@
import type { SessionSummaryResponse } from "@/app/api/__generated__/models/sessionSummaryResponse";
import { Button } from "@/components/atoms/Button/Button";
import { Text } from "@/components/atoms/Text/Text";
import { scrollbarStyles } from "@/components/styles/scrollbars";
import { cn } from "@/lib/utils";
import { Plus } from "@phosphor-icons/react";
import { SessionsList } from "../SessionsList/SessionsList";
interface Props {
sessions: SessionSummaryResponse[];
currentSessionId: string | null;
isLoading: boolean;
hasNextPage: boolean;
isFetchingNextPage: boolean;
onSelectSession: (sessionId: string) => void;
onFetchNextPage: () => void;
onNewChat: () => void;
hasActiveSession: boolean;
}
export function DesktopSidebar({
sessions,
currentSessionId,
isLoading,
hasNextPage,
isFetchingNextPage,
onSelectSession,
onFetchNextPage,
onNewChat,
hasActiveSession,
}: Props) {
return (
<aside className="flex h-full w-80 flex-col border-r border-zinc-100 bg-zinc-50">
<div className="shrink-0 px-6 py-4">
<Text variant="h3" size="body-medium">
Your chats
</Text>
</div>
<div
className={cn(
"flex min-h-0 flex-1 flex-col overflow-y-auto px-3 py-3",
scrollbarStyles,
)}
>
<SessionsList
sessions={sessions}
currentSessionId={currentSessionId}
isLoading={isLoading}
hasNextPage={hasNextPage}
isFetchingNextPage={isFetchingNextPage}
onSelectSession={onSelectSession}
onFetchNextPage={onFetchNextPage}
/>
</div>
{hasActiveSession && (
<div className="shrink-0 bg-zinc-50 p-3 shadow-[0_-4px_6px_-1px_rgba(0,0,0,0.05)]">
<Button
variant="primary"
size="small"
onClick={onNewChat}
className="w-full"
leftIcon={<Plus width="1rem" height="1rem" />}
>
New Chat
</Button>
</div>
)}
</aside>
);
}

View File

@@ -1,15 +0,0 @@
import { Text } from "@/components/atoms/Text/Text";
import { ChatLoader } from "@/components/contextual/Chat/components/ChatLoader/ChatLoader";
export function LoadingState() {
return (
<div className="flex flex-1 items-center justify-center">
<div className="flex flex-col items-center gap-4">
<ChatLoader />
<Text variant="body" className="text-zinc-500">
Loading your chats...
</Text>
</div>
</div>
);
}

View File

@@ -1,91 +0,0 @@
import type { SessionSummaryResponse } from "@/app/api/__generated__/models/sessionSummaryResponse";
import { Button } from "@/components/atoms/Button/Button";
import { scrollbarStyles } from "@/components/styles/scrollbars";
import { cn } from "@/lib/utils";
import { PlusIcon, X } from "@phosphor-icons/react";
import { Drawer } from "vaul";
import { SessionsList } from "../SessionsList/SessionsList";
interface Props {
isOpen: boolean;
sessions: SessionSummaryResponse[];
currentSessionId: string | null;
isLoading: boolean;
hasNextPage: boolean;
isFetchingNextPage: boolean;
onSelectSession: (sessionId: string) => void;
onFetchNextPage: () => void;
onNewChat: () => void;
onClose: () => void;
onOpenChange: (open: boolean) => void;
hasActiveSession: boolean;
}
export function MobileDrawer({
isOpen,
sessions,
currentSessionId,
isLoading,
hasNextPage,
isFetchingNextPage,
onSelectSession,
onFetchNextPage,
onNewChat,
onClose,
onOpenChange,
hasActiveSession,
}: Props) {
return (
<Drawer.Root open={isOpen} onOpenChange={onOpenChange} direction="left">
<Drawer.Portal>
<Drawer.Overlay className="fixed inset-0 z-[60] bg-black/10 backdrop-blur-sm" />
<Drawer.Content className="fixed left-0 top-0 z-[70] flex h-full w-80 flex-col border-r border-zinc-200 bg-zinc-50">
<div className="shrink-0 border-b border-zinc-200 p-4">
<div className="flex items-center justify-between">
<Drawer.Title className="text-lg font-semibold text-zinc-800">
Your chats
</Drawer.Title>
<Button
variant="icon"
size="icon"
aria-label="Close sessions"
onClick={onClose}
>
<X width="1.25rem" height="1.25rem" />
</Button>
</div>
</div>
<div
className={cn(
"flex min-h-0 flex-1 flex-col overflow-y-auto px-3 py-3",
scrollbarStyles,
)}
>
<SessionsList
sessions={sessions}
currentSessionId={currentSessionId}
isLoading={isLoading}
hasNextPage={hasNextPage}
isFetchingNextPage={isFetchingNextPage}
onSelectSession={onSelectSession}
onFetchNextPage={onFetchNextPage}
/>
</div>
{hasActiveSession && (
<div className="shrink-0 bg-white p-3 shadow-[0_-4px_6px_-1px_rgba(0,0,0,0.05)]">
<Button
variant="primary"
size="small"
onClick={onNewChat}
className="w-full"
leftIcon={<PlusIcon width="1rem" height="1rem" />}
>
New Chat
</Button>
</div>
)}
</Drawer.Content>
</Drawer.Portal>
</Drawer.Root>
);
}

View File

@@ -1,24 +0,0 @@
import { useState } from "react";
export function useMobileDrawer() {
const [isDrawerOpen, setIsDrawerOpen] = useState(false);
function handleOpenDrawer() {
setIsDrawerOpen(true);
}
function handleCloseDrawer() {
setIsDrawerOpen(false);
}
function handleDrawerOpenChange(open: boolean) {
setIsDrawerOpen(open);
}
return {
isDrawerOpen,
handleOpenDrawer,
handleCloseDrawer,
handleDrawerOpenChange,
};
}

View File

@@ -1,22 +0,0 @@
import { Button } from "@/components/atoms/Button/Button";
import { NAVBAR_HEIGHT_PX } from "@/lib/constants";
import { ListIcon } from "@phosphor-icons/react";
interface Props {
onOpenDrawer: () => void;
}
export function MobileHeader({ onOpenDrawer }: Props) {
return (
<Button
variant="icon"
size="icon"
aria-label="Open sessions"
onClick={onOpenDrawer}
className="fixed z-50 bg-white shadow-md"
style={{ left: "1rem", top: `${NAVBAR_HEIGHT_PX + 20}px` }}
>
<ListIcon width="1.25rem" height="1.25rem" />
</Button>
);
}

View File

@@ -1,80 +0,0 @@
import type { SessionSummaryResponse } from "@/app/api/__generated__/models/sessionSummaryResponse";
import { Skeleton } from "@/components/__legacy__/ui/skeleton";
import { Text } from "@/components/atoms/Text/Text";
import { InfiniteList } from "@/components/molecules/InfiniteList/InfiniteList";
import { cn } from "@/lib/utils";
import { getSessionTitle } from "../../helpers";
interface Props {
sessions: SessionSummaryResponse[];
currentSessionId: string | null;
isLoading: boolean;
hasNextPage: boolean;
isFetchingNextPage: boolean;
onSelectSession: (sessionId: string) => void;
onFetchNextPage: () => void;
}
export function SessionsList({
sessions,
currentSessionId,
isLoading,
hasNextPage,
isFetchingNextPage,
onSelectSession,
onFetchNextPage,
}: Props) {
if (isLoading) {
return (
<div className="space-y-1">
{Array.from({ length: 5 }).map((_, i) => (
<div key={i} className="rounded-lg px-3 py-2.5">
<Skeleton className="h-5 w-full" />
</div>
))}
</div>
);
}
if (sessions.length === 0) {
return (
<div className="flex h-full items-center justify-center">
<Text variant="body" className="text-zinc-500">
You don&apos;t have previous chats
</Text>
</div>
);
}
return (
<InfiniteList
items={sessions}
hasMore={hasNextPage}
isFetchingMore={isFetchingNextPage}
onEndReached={onFetchNextPage}
className="space-y-1"
renderItem={(session) => {
const isActive = session.id === currentSessionId;
return (
<button
onClick={() => onSelectSession(session.id)}
className={cn(
"w-full rounded-lg px-3 py-2.5 text-left transition-colors",
isActive ? "bg-zinc-100" : "hover:bg-zinc-50",
)}
>
<Text
variant="body"
className={cn(
"font-normal",
isActive ? "text-zinc-600" : "text-zinc-800",
)}
>
{getSessionTitle(session)}
</Text>
</button>
);
}}
/>
);
}

View File

@@ -1,92 +0,0 @@
import { useGetV2ListSessions } from "@/app/api/__generated__/endpoints/chat/chat";
import type { SessionSummaryResponse } from "@/app/api/__generated__/models/sessionSummaryResponse";
import { okData } from "@/app/api/helpers";
import { useEffect, useMemo, useState } from "react";
const PAGE_SIZE = 50;
export interface UseSessionsPaginationArgs {
enabled: boolean;
}
export function useSessionsPagination({ enabled }: UseSessionsPaginationArgs) {
const [offset, setOffset] = useState(0);
const [accumulatedSessions, setAccumulatedSessions] = useState<
SessionSummaryResponse[]
>([]);
const [totalCount, setTotalCount] = useState<number | null>(null);
const { data, isLoading, isFetching, isError } = useGetV2ListSessions(
{ limit: PAGE_SIZE, offset },
{
query: {
enabled: enabled && offset >= 0,
},
},
);
useEffect(() => {
const responseData = okData(data);
if (responseData) {
const newSessions = responseData.sessions;
const total = responseData.total;
setTotalCount(total);
if (offset === 0) {
setAccumulatedSessions(newSessions);
} else {
setAccumulatedSessions((prev) => [...prev, ...newSessions]);
}
} else if (!enabled) {
setAccumulatedSessions([]);
setTotalCount(null);
}
}, [data, offset, enabled]);
const hasNextPage = useMemo(() => {
if (totalCount === null) return false;
return accumulatedSessions.length < totalCount;
}, [accumulatedSessions.length, totalCount]);
const areAllSessionsLoaded = useMemo(() => {
if (totalCount === null) return false;
return (
accumulatedSessions.length >= totalCount && !isFetching && !isLoading
);
}, [accumulatedSessions.length, totalCount, isFetching, isLoading]);
useEffect(() => {
if (
hasNextPage &&
!isFetching &&
!isLoading &&
!isError &&
totalCount !== null
) {
setOffset((prev) => prev + PAGE_SIZE);
}
}, [hasNextPage, isFetching, isLoading, isError, totalCount]);
function fetchNextPage() {
if (hasNextPage && !isFetching) {
setOffset((prev) => prev + PAGE_SIZE);
}
}
function reset() {
setOffset(0);
setAccumulatedSessions([]);
setTotalCount(null);
}
return {
sessions: accumulatedSessions,
isLoading,
isFetching,
hasNextPage,
areAllSessionsLoaded,
totalCount,
fetchNextPage,
reset,
};
}
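A minimal sketch of a consumer for this hook; names are illustrative. Note that the effect above keeps advancing the offset on its own until every page is loaded, so consumers mostly read the accumulated results rather than calling fetchNextPage directly.

function SessionsSidebarExample({ open }: { open: boolean }) {
  const { sessions, isLoading, areAllSessionsLoaded } = useSessionsPagination({
    enabled: open,
  });
  if (!open) return null;
  if (isLoading || !areAllSessionsLoaded) return <span>Loading sessions…</span>;
  return (
    <ul>
      {sessions.map((session) => (
        <li key={session.id}>{session.id}</li>
      ))}
    </ul>
  );
}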

View File

@@ -1,165 +0,0 @@
import type { SessionDetailResponse } from "@/app/api/__generated__/models/sessionDetailResponse";
import type { SessionSummaryResponse } from "@/app/api/__generated__/models/sessionSummaryResponse";
import { format, formatDistanceToNow, isToday } from "date-fns";
export function convertSessionDetailToSummary(
session: SessionDetailResponse,
): SessionSummaryResponse {
return {
id: session.id,
created_at: session.created_at,
updated_at: session.updated_at,
title: undefined,
};
}
export function filterVisibleSessions(
sessions: SessionSummaryResponse[],
): SessionSummaryResponse[] {
return sessions.filter(
(session) => session.updated_at !== session.created_at,
);
}
export function getSessionTitle(session: SessionSummaryResponse): string {
if (session.title) return session.title;
const isNewSession = session.updated_at === session.created_at;
if (isNewSession) {
const createdDate = new Date(session.created_at);
if (isToday(createdDate)) {
return "Today";
}
return format(createdDate, "MMM d, yyyy");
}
return "Untitled Chat";
}
export function getSessionUpdatedLabel(
session: SessionSummaryResponse,
): string {
if (!session.updated_at) return "";
return formatDistanceToNow(new Date(session.updated_at), { addSuffix: true });
}
export function mergeCurrentSessionIntoList(
accumulatedSessions: SessionSummaryResponse[],
currentSessionId: string | null,
currentSessionData: SessionDetailResponse | null | undefined,
): SessionSummaryResponse[] {
const filteredSessions: SessionSummaryResponse[] = [];
if (accumulatedSessions.length > 0) {
const visibleSessions = filterVisibleSessions(accumulatedSessions);
if (currentSessionId) {
const currentInAll = accumulatedSessions.find(
(s) => s.id === currentSessionId,
);
if (currentInAll) {
const isInVisible = visibleSessions.some(
(s) => s.id === currentSessionId,
);
if (!isInVisible) {
filteredSessions.push(currentInAll);
}
}
}
filteredSessions.push(...visibleSessions);
}
if (currentSessionId && currentSessionData) {
const isCurrentInList = filteredSessions.some(
(s) => s.id === currentSessionId,
);
if (!isCurrentInList) {
const summarySession = convertSessionDetailToSummary(currentSessionData);
filteredSessions.unshift(summarySession);
}
}
return filteredSessions;
}
export function getCurrentSessionId(
searchParams: URLSearchParams,
): string | null {
return searchParams.get("sessionId");
}
export function shouldAutoSelectSession(
areAllSessionsLoaded: boolean,
hasAutoSelectedSession: boolean,
paramSessionId: string | null,
visibleSessions: SessionSummaryResponse[],
accumulatedSessions: SessionSummaryResponse[],
isLoading: boolean,
totalCount: number | null,
): {
shouldSelect: boolean;
sessionIdToSelect: string | null;
shouldCreate: boolean;
} {
if (!areAllSessionsLoaded || hasAutoSelectedSession) {
return {
shouldSelect: false,
sessionIdToSelect: null,
shouldCreate: false,
};
}
if (paramSessionId) {
return {
shouldSelect: false,
sessionIdToSelect: null,
shouldCreate: false,
};
}
if (visibleSessions.length > 0) {
return {
shouldSelect: true,
sessionIdToSelect: visibleSessions[0].id,
shouldCreate: false,
};
}
if (accumulatedSessions.length === 0 && !isLoading && totalCount === 0) {
return { shouldSelect: false, sessionIdToSelect: null, shouldCreate: true };
}
if (totalCount === 0) {
return {
shouldSelect: false,
sessionIdToSelect: null,
shouldCreate: false,
};
}
return { shouldSelect: false, sessionIdToSelect: null, shouldCreate: false };
}
export function checkReadyToShowContent(
areAllSessionsLoaded: boolean,
paramSessionId: string | null,
accumulatedSessions: SessionSummaryResponse[],
isCurrentSessionLoading: boolean,
currentSessionData: SessionDetailResponse | null | undefined,
hasAutoSelectedSession: boolean,
): boolean {
if (!areAllSessionsLoaded) return false;
if (paramSessionId) {
const sessionFound = accumulatedSessions.some(
(s) => s.id === paramSessionId,
);
return (
sessionFound ||
(!isCurrentSessionLoading &&
currentSessionData !== undefined &&
currentSessionData !== null)
);
}
return hasAutoSelectedSession;
}
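A short sketch of how the title and visibility helpers behave; the sample session is made up but mirrors the SessionSummaryResponse shape.

const now = new Date().toISOString();
const untouchedToday = { id: "a", created_at: now, updated_at: now, title: undefined };
getSessionTitle(untouchedToday); // "Today" — untouched session created today
getSessionTitle({ ...untouchedToday, title: "Research agent" }); // "Research agent"
filterVisibleSessions([untouchedToday]); // [] — sessions with updated_at === created_at are hidden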

View File

@@ -1,170 +0,0 @@
"use client";
import {
getGetV2ListSessionsQueryKey,
useGetV2GetSession,
} from "@/app/api/__generated__/endpoints/chat/chat";
import { okData } from "@/app/api/helpers";
import { useBreakpoint } from "@/lib/hooks/useBreakpoint";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { useQueryClient } from "@tanstack/react-query";
import { usePathname, useRouter, useSearchParams } from "next/navigation";
import { useEffect, useRef, useState } from "react";
import { useMobileDrawer } from "./components/MobileDrawer/useMobileDrawer";
import { useSessionsPagination } from "./components/SessionsList/useSessionsPagination";
import {
checkReadyToShowContent,
filterVisibleSessions,
getCurrentSessionId,
mergeCurrentSessionIntoList,
} from "./helpers";
export function useCopilotShell() {
const router = useRouter();
const pathname = usePathname();
const searchParams = useSearchParams();
const queryClient = useQueryClient();
const breakpoint = useBreakpoint();
const { isLoggedIn } = useSupabase();
const isMobile =
breakpoint === "base" || breakpoint === "sm" || breakpoint === "md";
const isOnHomepage = pathname === "/copilot";
const paramSessionId = searchParams.get("sessionId");
const {
isDrawerOpen,
handleOpenDrawer,
handleCloseDrawer,
handleDrawerOpenChange,
} = useMobileDrawer();
const paginationEnabled = !isMobile || isDrawerOpen || !!paramSessionId;
const {
sessions: accumulatedSessions,
isLoading: isSessionsLoading,
isFetching: isSessionsFetching,
hasNextPage,
areAllSessionsLoaded,
fetchNextPage,
reset: resetPagination,
} = useSessionsPagination({
enabled: paginationEnabled,
});
const currentSessionId = getCurrentSessionId(searchParams);
const { data: currentSessionData, isLoading: isCurrentSessionLoading } =
useGetV2GetSession(currentSessionId || "", {
query: {
enabled: !!currentSessionId,
select: okData,
},
});
const [hasAutoSelectedSession, setHasAutoSelectedSession] = useState(false);
const hasAutoSelectedRef = useRef(false);
// Mark as auto-selected when sessionId is in URL
useEffect(() => {
if (paramSessionId && !hasAutoSelectedRef.current) {
hasAutoSelectedRef.current = true;
setHasAutoSelectedSession(true);
}
}, [paramSessionId]);
// On homepage without sessionId, mark as ready immediately
useEffect(() => {
if (isOnHomepage && !paramSessionId && !hasAutoSelectedRef.current) {
hasAutoSelectedRef.current = true;
setHasAutoSelectedSession(true);
}
}, [isOnHomepage, paramSessionId]);
// Invalidate sessions list when navigating to homepage (to show newly created sessions)
useEffect(() => {
if (isOnHomepage && !paramSessionId) {
queryClient.invalidateQueries({
queryKey: getGetV2ListSessionsQueryKey(),
});
}
}, [isOnHomepage, paramSessionId, queryClient]);
// Reset pagination when query becomes disabled
const prevPaginationEnabledRef = useRef(paginationEnabled);
useEffect(() => {
if (prevPaginationEnabledRef.current && !paginationEnabled) {
resetPagination();
resetAutoSelect();
}
prevPaginationEnabledRef.current = paginationEnabled;
}, [paginationEnabled, resetPagination]);
const sessions = mergeCurrentSessionIntoList(
accumulatedSessions,
currentSessionId,
currentSessionData,
);
const visibleSessions = filterVisibleSessions(sessions);
const sidebarSelectedSessionId =
isOnHomepage && !paramSessionId ? null : currentSessionId;
const isReadyToShowContent = isOnHomepage
? true
: checkReadyToShowContent(
areAllSessionsLoaded,
paramSessionId,
accumulatedSessions,
isCurrentSessionLoading,
currentSessionData,
hasAutoSelectedSession,
);
function handleSelectSession(sessionId: string) {
// Navigate using replaceState to avoid full page reload
window.history.replaceState(null, "", `/copilot?sessionId=${sessionId}`);
// Force a re-render by updating the URL through router
router.replace(`/copilot?sessionId=${sessionId}`);
if (isMobile) handleCloseDrawer();
}
function handleNewChat() {
resetAutoSelect();
resetPagination();
// Invalidate and refetch sessions list to ensure newly created sessions appear
queryClient.invalidateQueries({
queryKey: getGetV2ListSessionsQueryKey(),
});
window.history.replaceState(null, "", "/copilot");
router.replace("/copilot");
if (isMobile) handleCloseDrawer();
}
function resetAutoSelect() {
hasAutoSelectedRef.current = false;
setHasAutoSelectedSession(false);
}
return {
isMobile,
isDrawerOpen,
isLoggedIn,
hasActiveSession:
Boolean(currentSessionId) && (!isOnHomepage || Boolean(paramSessionId)),
isLoading: isSessionsLoading || !areAllSessionsLoaded,
sessions: visibleSessions,
currentSessionId: sidebarSelectedSessionId,
handleSelectSession,
handleOpenDrawer,
handleCloseDrawer,
handleDrawerOpenChange,
handleNewChat,
hasNextPage,
isFetchingNextPage: isSessionsFetching,
fetchNextPage,
isReadyToShowContent,
};
}

View File

@@ -1,33 +0,0 @@
import type { User } from "@supabase/supabase-js";
export function getGreetingName(user?: User | null): string {
if (!user) return "there";
const metadata = user.user_metadata as Record<string, unknown> | undefined;
const fullName = metadata?.full_name;
const name = metadata?.name;
if (typeof fullName === "string" && fullName.trim()) {
return fullName.split(" ")[0];
}
if (typeof name === "string" && name.trim()) {
return name.split(" ")[0];
}
if (user.email) {
return user.email.split("@")[0];
}
return "there";
}
export function buildCopilotChatUrl(prompt: string): string {
const trimmed = prompt.trim();
if (!trimmed) return "/copilot/chat";
const encoded = encodeURIComponent(trimmed);
return `/copilot/chat?prompt=${encoded}`;
}
export function getQuickActions(): string[] {
return [
"Show me what I can automate",
"Design a custom workflow",
"Help me with content creation",
];
}
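A quick sketch of the greeting and URL helpers; the user objects are stubs cast to Supabase's User type.

getGreetingName(null); // "there"
getGreetingName({ user_metadata: { full_name: "Jane Doe" } } as unknown as User); // "Jane"
getGreetingName({ email: "jane@example.com", user_metadata: {} } as unknown as User); // "jane"
buildCopilotChatUrl("   "); // "/copilot/chat"
buildCopilotChatUrl("summarize my inbox"); // "/copilot/chat?prompt=summarize%20my%20inbox"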

View File

@@ -1,6 +0,0 @@
import type { ReactNode } from "react";
import { CopilotShell } from "./components/CopilotShell/CopilotShell";
export default function CopilotLayout({ children }: { children: ReactNode }) {
return <CopilotShell>{children}</CopilotShell>;
}

View File

@@ -1,228 +0,0 @@
"use client";
import { postV2CreateSession } from "@/app/api/__generated__/endpoints/chat/chat";
import { Skeleton } from "@/components/__legacy__/ui/skeleton";
import { Button } from "@/components/atoms/Button/Button";
import { LoadingSpinner } from "@/components/atoms/LoadingSpinner/LoadingSpinner";
import { Text } from "@/components/atoms/Text/Text";
import { Chat } from "@/components/contextual/Chat/Chat";
import { ChatInput } from "@/components/contextual/Chat/components/ChatInput/ChatInput";
import { getHomepageRoute } from "@/lib/constants";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import {
Flag,
type FlagValues,
useGetFlag,
} from "@/services/feature-flags/use-get-flag";
import { useFlags } from "launchdarkly-react-client-sdk";
import { useRouter, useSearchParams } from "next/navigation";
import { useEffect, useMemo, useRef, useState } from "react";
import { getGreetingName, getQuickActions } from "./helpers";
type PageState =
| { type: "welcome" }
| { type: "creating"; prompt: string }
| { type: "chat"; sessionId: string; initialPrompt?: string };
export default function CopilotPage() {
const router = useRouter();
const searchParams = useSearchParams();
const { user, isLoggedIn, isUserLoading } = useSupabase();
const isChatEnabled = useGetFlag(Flag.CHAT);
const flags = useFlags<FlagValues>();
const homepageRoute = getHomepageRoute(isChatEnabled);
const envEnabled = process.env.NEXT_PUBLIC_LAUNCHDARKLY_ENABLED === "true";
const clientId = process.env.NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID;
const isLaunchDarklyConfigured = envEnabled && Boolean(clientId);
const isFlagReady =
!isLaunchDarklyConfigured || flags[Flag.CHAT] !== undefined;
const [pageState, setPageState] = useState<PageState>({ type: "welcome" });
const initialPromptRef = useRef<Map<string, string>>(new Map());
const urlSessionId = searchParams.get("sessionId");
// Sync with URL sessionId (preserve initialPrompt from ref)
useEffect(
function syncSessionFromUrl() {
if (urlSessionId) {
// If we're already in chat state with this sessionId, don't overwrite
if (pageState.type === "chat" && pageState.sessionId === urlSessionId) {
return;
}
// Get initialPrompt from ref or current state
const storedInitialPrompt = initialPromptRef.current.get(urlSessionId);
const currentInitialPrompt =
storedInitialPrompt ||
(pageState.type === "creating"
? pageState.prompt
: pageState.type === "chat"
? pageState.initialPrompt
: undefined);
if (currentInitialPrompt) {
initialPromptRef.current.set(urlSessionId, currentInitialPrompt);
}
setPageState({
type: "chat",
sessionId: urlSessionId,
initialPrompt: currentInitialPrompt,
});
} else if (pageState.type === "chat") {
setPageState({ type: "welcome" });
}
},
[urlSessionId],
);
useEffect(
function ensureAccess() {
if (!isFlagReady) return;
if (isChatEnabled === false) {
router.replace(homepageRoute);
}
},
[homepageRoute, isChatEnabled, isFlagReady, router],
);
const greetingName = useMemo(
function getName() {
return getGreetingName(user);
},
[user],
);
const quickActions = useMemo(function getActions() {
return getQuickActions();
}, []);
async function startChatWithPrompt(prompt: string) {
if (!prompt?.trim()) return;
if (pageState.type === "creating") return;
const trimmedPrompt = prompt.trim();
setPageState({ type: "creating", prompt: trimmedPrompt });
try {
// Create session
const sessionResponse = await postV2CreateSession({
body: JSON.stringify({}),
});
if (sessionResponse.status !== 200 || !sessionResponse.data?.id) {
throw new Error("Failed to create session");
}
const sessionId = sessionResponse.data.id;
// Store initialPrompt in ref so it persists across re-renders
initialPromptRef.current.set(sessionId, trimmedPrompt);
// Update URL and show Chat with initial prompt
// Chat will handle sending the message and streaming
window.history.replaceState(null, "", `/copilot?sessionId=${sessionId}`);
setPageState({ type: "chat", sessionId, initialPrompt: trimmedPrompt });
} catch (error) {
console.error("[CopilotPage] Failed to start chat:", error);
setPageState({ type: "welcome" });
}
}
function handleQuickAction(action: string) {
startChatWithPrompt(action);
}
function handleSessionNotFound() {
router.replace("/copilot");
}
if (!isFlagReady || isChatEnabled === false || !isLoggedIn) {
return null;
}
// Show Chat when we have an active session
if (pageState.type === "chat") {
return (
<div className="flex h-full flex-col">
<Chat
key={pageState.sessionId ?? "welcome"}
className="flex-1"
urlSessionId={pageState.sessionId}
initialPrompt={pageState.initialPrompt}
onSessionNotFound={handleSessionNotFound}
/>
</div>
);
}
// Show loading state while creating session and sending first message
if (pageState.type === "creating") {
return (
<div className="flex h-full flex-1 flex-col items-center justify-center bg-[#f8f8f9] px-6 py-10">
<LoadingSpinner size="large" />
<Text variant="body" className="mt-4 text-zinc-500">
Starting your chat...
</Text>
</div>
);
}
// Show Welcome screen
const isLoading = isUserLoading;
return (
<div className="flex h-full flex-1 items-center justify-center overflow-y-auto bg-[#f8f8f9] px-6 py-10">
<div className="w-full text-center">
{isLoading ? (
<div className="mx-auto max-w-2xl">
<Skeleton className="mx-auto mb-3 h-8 w-64" />
<Skeleton className="mx-auto mb-8 h-6 w-80" />
<div className="mb-8">
<Skeleton className="mx-auto h-14 w-full rounded-lg" />
</div>
<div className="flex flex-wrap items-center justify-center gap-3">
{Array.from({ length: 4 }).map((_, i) => (
<Skeleton key={i} className="h-9 w-48 rounded-md" />
))}
</div>
</div>
) : (
<>
<div className="mx-auto max-w-2xl">
<Text
variant="h3"
className="mb-3 !text-[1.375rem] text-zinc-700"
>
Hey, <span className="text-violet-600">{greetingName}</span>
</Text>
<Text variant="h3" className="mb-8 !font-normal">
What do you want to automate?
</Text>
<div className="mb-6">
<ChatInput
onSend={startChatWithPrompt}
placeholder='You can search or just ask - e.g. "create a blog post outline"'
/>
</div>
</div>
<div className="flex flex-nowrap items-center justify-center gap-3 overflow-x-auto [-ms-overflow-style:none] [scrollbar-width:none] [&::-webkit-scrollbar]:hidden">
{quickActions.map((action) => (
<Button
key={action}
type="button"
variant="outline"
size="small"
onClick={() => handleQuickAction(action)}
className="h-auto shrink-0 border-zinc-600 !px-4 !py-2 text-[1rem] text-zinc-600"
>
{action}
</Button>
))}
</div>
</>
)}
</div>
</div>
);
}

View File

@@ -1,8 +1,6 @@
"use client"; "use client";
import { ErrorCard } from "@/components/molecules/ErrorCard/ErrorCard"; import { ErrorCard } from "@/components/molecules/ErrorCard/ErrorCard";
import { getHomepageRoute } from "@/lib/constants";
import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
import { useSearchParams } from "next/navigation"; import { useSearchParams } from "next/navigation";
import { Suspense } from "react"; import { Suspense } from "react";
import { getErrorDetails } from "./helpers"; import { getErrorDetails } from "./helpers";
@@ -11,8 +9,6 @@ function ErrorPageContent() {
const searchParams = useSearchParams(); const searchParams = useSearchParams();
const errorMessage = searchParams.get("message"); const errorMessage = searchParams.get("message");
const errorDetails = getErrorDetails(errorMessage); const errorDetails = getErrorDetails(errorMessage);
const isChatEnabled = useGetFlag(Flag.CHAT);
const homepageRoute = getHomepageRoute(isChatEnabled);
function handleRetry() { function handleRetry() {
// Auth-related errors should redirect to login // Auth-related errors should redirect to login
@@ -29,8 +25,8 @@ function ErrorPageContent() {
window.location.reload(); window.location.reload();
}, 2000); }, 2000);
} else { } else {
// For server/network errors, go to home // For server/network errors, go to marketplace
window.location.href = homepageRoute; window.location.href = "/marketplace";
} }
} }

View File

@@ -14,6 +14,10 @@ import {
import { Dialog } from "@/components/molecules/Dialog/Dialog"; import { Dialog } from "@/components/molecules/Dialog/Dialog";
import { useEffect, useRef, useState } from "react"; import { useEffect, useRef, useState } from "react";
import { ScheduleAgentModal } from "../ScheduleAgentModal/ScheduleAgentModal"; import { ScheduleAgentModal } from "../ScheduleAgentModal/ScheduleAgentModal";
import {
AIAgentSafetyPopup,
useAIAgentSafetyPopup,
} from "./components/AIAgentSafetyPopup/AIAgentSafetyPopup";
import { ModalHeader } from "./components/ModalHeader/ModalHeader"; import { ModalHeader } from "./components/ModalHeader/ModalHeader";
import { ModalRunSection } from "./components/ModalRunSection/ModalRunSection"; import { ModalRunSection } from "./components/ModalRunSection/ModalRunSection";
import { RunActions } from "./components/RunActions/RunActions"; import { RunActions } from "./components/RunActions/RunActions";
@@ -83,8 +87,17 @@ export function RunAgentModal({
const [isScheduleModalOpen, setIsScheduleModalOpen] = useState(false); const [isScheduleModalOpen, setIsScheduleModalOpen] = useState(false);
const [hasOverflow, setHasOverflow] = useState(false); const [hasOverflow, setHasOverflow] = useState(false);
const [isSafetyPopupOpen, setIsSafetyPopupOpen] = useState(false);
const [pendingRunAction, setPendingRunAction] = useState<(() => void) | null>(
null,
);
const contentRef = useRef<HTMLDivElement>(null); const contentRef = useRef<HTMLDivElement>(null);
const { shouldShowPopup, dismissPopup } = useAIAgentSafetyPopup(
agent.has_sensitive_action,
agent.has_human_in_the_loop,
);
const hasAnySetupFields = const hasAnySetupFields =
Object.keys(agentInputFields || {}).length > 0 || Object.keys(agentInputFields || {}).length > 0 ||
Object.keys(agentCredentialsInputFields || {}).length > 0; Object.keys(agentCredentialsInputFields || {}).length > 0;
@@ -165,6 +178,24 @@ export function RunAgentModal({
onScheduleCreated?.(schedule); onScheduleCreated?.(schedule);
} }
function handleRunWithSafetyCheck() {
if (shouldShowPopup) {
setPendingRunAction(() => handleRun);
setIsSafetyPopupOpen(true);
} else {
handleRun();
}
}
function handleSafetyPopupAcknowledge() {
setIsSafetyPopupOpen(false);
dismissPopup();
if (pendingRunAction) {
pendingRunAction();
setPendingRunAction(null);
}
}
return ( return (
<> <>
<Dialog <Dialog
@@ -180,7 +211,7 @@ export function RunAgentModal({
{/* Content */} {/* Content */}
{hasAnySetupFields ? ( {hasAnySetupFields ? (
<div className="mt-4 pb-10"> <div className="mt-10 pb-32">
<RunAgentModalContextProvider <RunAgentModalContextProvider
value={{ value={{
agent, agent,
@@ -248,7 +279,7 @@ export function RunAgentModal({
)} )}
<RunActions <RunActions
defaultRunType={defaultRunType} defaultRunType={defaultRunType}
onRun={handleRun} onRun={handleRunWithSafetyCheck}
isExecuting={isExecuting} isExecuting={isExecuting}
isSettingUpTrigger={isSettingUpTrigger} isSettingUpTrigger={isSettingUpTrigger}
isRunReady={allRequiredInputsAreSet} isRunReady={allRequiredInputsAreSet}
@@ -266,6 +297,11 @@ export function RunAgentModal({
</div> </div>
</Dialog.Content> </Dialog.Content>
</Dialog> </Dialog>
<AIAgentSafetyPopup
isOpen={isSafetyPopupOpen}
onAcknowledge={handleSafetyPopupAcknowledge}
/>
</> </>
); );
} }

View File

@@ -0,0 +1,95 @@
"use client";
import { Button } from "@/components/atoms/Button/Button";
import { Text } from "@/components/atoms/Text/Text";
import { Dialog } from "@/components/molecules/Dialog/Dialog";
import { Key, storage } from "@/services/storage/local-storage";
import { ShieldCheckIcon } from "@phosphor-icons/react";
import { useCallback, useEffect, useState } from "react";
interface Props {
onAcknowledge: () => void;
isOpen: boolean;
}
export function AIAgentSafetyPopup({ onAcknowledge, isOpen }: Props) {
function handleAcknowledge() {
// Mark popup as shown so it won't appear again
storage.set(Key.AI_AGENT_SAFETY_POPUP_SHOWN, "true");
onAcknowledge();
}
if (!isOpen) return null;
return (
<Dialog
controlled={{ isOpen, set: () => {} }}
styling={{ maxWidth: "480px" }}
>
<Dialog.Content>
<div className="flex flex-col items-center p-6 text-center">
<div className="mb-6 flex h-16 w-16 items-center justify-center rounded-full bg-blue-50">
<ShieldCheckIcon
weight="fill"
size={32}
className="text-blue-600"
/>
</div>
<Text variant="h3" className="mb-4">
Safety Checks Enabled
</Text>
<Text variant="body" className="mb-2 text-zinc-700">
AI-generated agents may take actions that affect your data or
external systems.
</Text>
<Text variant="body" className="mb-8 text-zinc-700">
AutoGPT includes safety checks so you&apos;ll always have the
opportunity to review and approve sensitive actions before they
happen.
</Text>
<Button
variant="primary"
size="large"
className="w-full"
onClick={handleAcknowledge}
>
Got it
</Button>
</div>
</Dialog.Content>
</Dialog>
);
}
export function useAIAgentSafetyPopup(
hasSensitiveAction: boolean,
hasHumanInTheLoop: boolean,
) {
const [shouldShowPopup, setShouldShowPopup] = useState(false);
const [hasChecked, setHasChecked] = useState(false);
useEffect(() => {
// Only check once after mount (to avoid SSR issues)
if (hasChecked) return;
const hasSeenPopup =
storage.get(Key.AI_AGENT_SAFETY_POPUP_SHOWN) === "true";
const isRelevantAgent = hasSensitiveAction || hasHumanInTheLoop;
setShouldShowPopup(!hasSeenPopup && isRelevantAgent);
setHasChecked(true);
}, [hasSensitiveAction, hasHumanInTheLoop, hasChecked]);
const dismissPopup = useCallback(() => {
setShouldShowPopup(false);
}, []);
return {
shouldShowPopup,
dismissPopup,
};
}
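For reference, a minimal usage sketch of this component/hook pair, mirroring the RunAgentModal wiring above. The RunButton component, its onRun prop, and the import path are hypothetical and only illustrate the gating pattern; the agent flags match the fields the modal passes in.

import { useState } from "react";
import {
  AIAgentSafetyPopup,
  useAIAgentSafetyPopup,
} from "./AIAgentSafetyPopup"; // hypothetical relative path

type AgentFlags = {
  has_sensitive_action: boolean;
  has_human_in_the_loop: boolean;
};

function RunButton({ agent, onRun }: { agent: AgentFlags; onRun: () => void }) {
  const [isPopupOpen, setIsPopupOpen] = useState(false);
  // shouldShowPopup is true only for sensitive/HITL agents the user has not yet acknowledged
  const { shouldShowPopup, dismissPopup } = useAIAgentSafetyPopup(
    agent.has_sensitive_action,
    agent.has_human_in_the_loop,
  );

  function handleClick() {
    // Gate the first run behind the one-time safety notice
    if (shouldShowPopup) setIsPopupOpen(true);
    else onRun();
  }

  function handleAcknowledge() {
    // Acknowledging persists AI_AGENT_SAFETY_POPUP_SHOWN and continues the run
    setIsPopupOpen(false);
    dismissPopup();
    onRun();
  }

  return (
    <>
      <button onClick={handleClick}>Run</button>
      <AIAgentSafetyPopup isOpen={isPopupOpen} onAcknowledge={handleAcknowledge} />
    </>
  );
}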

View File

@@ -29,7 +29,7 @@ export function ModalHeader({ agent }: ModalHeaderProps) {
       <ShowMoreText
         previewLimit={400}
         variant="small"
-        className="mb-2 mt-4 !text-zinc-700"
+        className="mt-4 !text-zinc-700"
       >
         {agent.description}
       </ShowMoreText>
@@ -40,8 +40,6 @@ export function ModalHeader({ agent }: ModalHeaderProps) {
           <Text variant="lead-semibold" className="text-blue-600">
             Tip
           </Text>
-          <div className="h-px w-full bg-blue-100" />
           <Text variant="body">
             For best results, run this agent{" "}
             {humanizeCronExpression(
@@ -52,7 +50,7 @@ export function ModalHeader({ agent }: ModalHeaderProps) {
       ) : null}
       {agent.instructions ? (
-        <div className="mt-4 flex flex-col gap-4 rounded-medium border border-purple-100 bg-[#f1ebfe80] p-4">
+        <div className="flex flex-col gap-4 rounded-medium border border-purple-100 bg-[#F1EBFE/5] p-4">
           <Text variant="lead-semibold" className="text-purple-600">
             Instructions
           </Text>

View File

@@ -69,7 +69,6 @@ export function SafeModeToggle({ graph, className }: Props) {
   const {
     currentHITLSafeMode,
     showHITLToggle,
-    isHITLStateUndetermined,
     handleHITLToggle,
     currentSensitiveActionSafeMode,
     showSensitiveActionToggle,
@@ -78,20 +77,13 @@ export function SafeModeToggle({ graph, className }: Props) {
     shouldShowToggle,
   } = useAgentSafeMode(graph);
-  if (!shouldShowToggle || isHITLStateUndetermined) {
-    return null;
-  }
-  const showHITL = showHITLToggle && !isHITLStateUndetermined;
-  const showSensitive = showSensitiveActionToggle;
-  if (!showHITL && !showSensitive) {
+  if (!shouldShowToggle) {
     return null;
   }
   return (
     <div className={cn("flex gap-1", className)}>
-      {showHITL && (
+      {showHITLToggle && (
         <SafeModeIconButton
           isEnabled={currentHITLSafeMode}
           label="Human-in-the-loop"
@@ -101,7 +93,7 @@ export function SafeModeToggle({ graph, className }: Props) {
           isPending={isPending}
         />
       )}
-      {showSensitive && (
+      {showSensitiveActionToggle && (
         <SafeModeIconButton
           isEnabled={currentSensitiveActionSafeMode}
           label="Sensitive actions"

View File

@@ -8,8 +8,6 @@ import { useGetV2GetUserProfile } from "@/app/api/__generated__/endpoints/store/
 import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";
 import { okData } from "@/app/api/helpers";
 import { useToast } from "@/components/molecules/Toast/use-toast";
-import { isLogoutInProgress } from "@/lib/autogpt-server-api/helpers";
-import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
 import { updateFavoriteInQueries } from "./helpers";
 interface Props {
@@ -25,14 +23,10 @@ export function useLibraryAgentCard({ agent }: Props) {
   const { toast } = useToast();
   const queryClient = getQueryClient();
   const { mutateAsync: updateLibraryAgent } = usePatchV2UpdateLibraryAgent();
-  const { user, isLoggedIn } = useSupabase();
-  const logoutInProgress = isLogoutInProgress();
   const { data: profile } = useGetV2GetUserProfile({
     query: {
       select: okData,
-      enabled: isLoggedIn && !!user && !logoutInProgress,
-      queryKey: ["/api/store/profile", user?.id],
     },
   });

View File

@@ -1,8 +1,6 @@
 import { useToast } from "@/components/molecules/Toast/use-toast";
-import { getHomepageRoute } from "@/lib/constants";
 import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
 import { environment } from "@/services/environment";
-import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
 import { loginFormSchema, LoginProvider } from "@/types/auth";
 import { zodResolver } from "@hookform/resolvers/zod";
 import { useRouter, useSearchParams } from "next/navigation";
@@ -22,17 +20,15 @@ export function useLoginPage() {
   const [isGoogleLoading, setIsGoogleLoading] = useState(false);
   const [showNotAllowedModal, setShowNotAllowedModal] = useState(false);
   const isCloudEnv = environment.isCloud();
-  const isChatEnabled = useGetFlag(Flag.CHAT);
-  const homepageRoute = getHomepageRoute(isChatEnabled);
   // Get redirect destination from 'next' query parameter
   const nextUrl = searchParams.get("next");
   useEffect(() => {
     if (isLoggedIn && !isLoggingIn) {
-      router.push(nextUrl || homepageRoute);
+      router.push(nextUrl || "/marketplace");
     }
-  }, [homepageRoute, isLoggedIn, isLoggingIn, nextUrl, router]);
+  }, [isLoggedIn, isLoggingIn, nextUrl, router]);
   const form = useForm<z.infer<typeof loginFormSchema>>({
     resolver: zodResolver(loginFormSchema),
@@ -102,7 +98,7 @@ export function useLoginPage() {
       } else if (result.onboarding) {
         router.replace("/onboarding");
       } else {
-        router.replace(homepageRoute);
+        router.replace("/marketplace");
       }
     } catch (error) {
       toast({

View File

@@ -5,8 +5,6 @@ import {
   CarouselContent,
   CarouselItem,
 } from "@/components/__legacy__/ui/carousel";
-import { FadeIn } from "@/components/molecules/FadeIn/FadeIn";
-import { StaggeredList } from "@/components/molecules/StaggeredList/StaggeredList";
 import { useAgentsSection } from "./useAgentsSection";
 import { StoreAgent } from "@/app/api/__generated__/models/storeAgent";
 import { StoreCard } from "../StoreCard/StoreCard";
@@ -43,14 +41,12 @@ export const AgentsSection = ({
   return (
     <div className="flex flex-col items-center justify-center">
       <div className="w-full max-w-[1360px]">
-        <FadeIn direction="left" duration={0.5}>
-          <h2
-            style={{ marginBottom: margin }}
-            className="font-poppins text-lg font-semibold text-neutral-800 dark:text-neutral-200"
-          >
-            {sectionTitle}
-          </h2>
-        </FadeIn>
+        <h2
+          style={{ marginBottom: margin }}
+          className="font-poppins text-lg font-semibold text-[#282828] dark:text-neutral-200"
+        >
+          {sectionTitle}
+        </h2>
         {!displayedAgents || displayedAgents.length === 0 ? (
           <div className="text-center text-gray-500 dark:text-gray-400">
             No agents found
@@ -58,38 +54,32 @@ export const AgentsSection = ({
         ) : (
           <>
             {/* Mobile Carousel View */}
-            <FadeIn direction="up" className="md:hidden">
-              <Carousel
-                opts={{
-                  loop: true,
-                }}
-              >
-                <CarouselContent>
-                  {displayedAgents.map((agent, index) => (
-                    <CarouselItem key={index} className="min-w-64 max-w-71">
-                      <StoreCard
-                        agentName={agent.agent_name}
-                        agentImage={agent.agent_image}
-                        description={agent.description}
-                        runs={agent.runs}
-                        rating={agent.rating}
-                        avatarSrc={agent.creator_avatar}
-                        creatorName={agent.creator}
-                        hideAvatar={hideAvatars}
-                        onClick={() => handleCardClick(agent.creator, agent.slug)}
-                      />
-                    </CarouselItem>
-                  ))}
-                </CarouselContent>
-              </Carousel>
-            </FadeIn>
-            {/* Desktop Grid View with Staggered Animation */}
-            <StaggeredList
-              direction="up"
-              staggerDelay={0.08}
-              className="hidden grid-cols-1 place-items-center gap-6 md:grid md:grid-cols-2 lg:grid-cols-3 2xl:grid-cols-4"
-            >
+            <Carousel
+              className="md:hidden"
+              opts={{
+                loop: true,
+              }}
+            >
+              <CarouselContent>
+                {displayedAgents.map((agent, index) => (
+                  <CarouselItem key={index} className="min-w-64 max-w-71">
+                    <StoreCard
+                      agentName={agent.agent_name}
+                      agentImage={agent.agent_image}
+                      description={agent.description}
+                      runs={agent.runs}
+                      rating={agent.rating}
+                      avatarSrc={agent.creator_avatar}
+                      creatorName={agent.creator}
+                      hideAvatar={hideAvatars}
+                      onClick={() => handleCardClick(agent.creator, agent.slug)}
+                    />
+                  </CarouselItem>
+                ))}
+              </CarouselContent>
+            </Carousel>
+            <div className="hidden grid-cols-1 place-items-center gap-6 md:grid md:grid-cols-2 lg:grid-cols-3 2xl:grid-cols-4">
              {displayedAgents.map((agent, index) => (
                <StoreCard
                  key={index}
@@ -104,7 +94,7 @@ export const AgentsSection = ({
                  onClick={() => handleCardClick(agent.creator, agent.slug)}
                />
              ))}
-            </StaggeredList>
+            </div>
           </>
         )}
       </div>

View File

@@ -38,7 +38,7 @@ export function BecomeACreator({
       <PublishAgentModal
         trigger={
-          <button className="inline-flex h-[48px] cursor-pointer items-center justify-center rounded-[38px] bg-neutral-800 px-8 py-3 transition-colors hover:bg-neutral-700 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-neutral-950 focus-visible:ring-offset-2 dark:bg-neutral-700 dark:hover:bg-neutral-600 dark:focus-visible:ring-neutral-50 md:h-[56px] md:px-10 md:py-4 lg:h-[68px] lg:px-12 lg:py-5">
+          <button className="inline-flex h-[48px] cursor-pointer items-center justify-center rounded-[38px] bg-neutral-800 px-8 py-3 transition-colors hover:bg-neutral-700 dark:bg-neutral-700 dark:hover:bg-neutral-600 md:h-[56px] md:px-10 md:py-4 lg:h-[68px] lg:px-12 lg:py-5">
             <span className="whitespace-nowrap font-poppins text-base font-medium leading-normal text-neutral-50 md:text-lg md:leading-relaxed lg:text-xl lg:leading-7">
               {buttonText}
             </span>

View File

@@ -20,18 +20,9 @@ export const CreatorCard = ({
 }: CreatorCardProps) => {
   return (
     <div
-      className={`h-[264px] w-full px-[18px] pb-5 pt-6 ${backgroundColor(index)} inline-flex cursor-pointer flex-col items-start justify-start gap-3.5 rounded-[26px] transition-[filter] duration-200 hover:brightness-95 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-neutral-950 focus-visible:ring-offset-2 dark:focus-visible:ring-neutral-50`}
+      className={`h-[264px] w-full px-[18px] pb-5 pt-6 ${backgroundColor(index)} inline-flex cursor-pointer flex-col items-start justify-start gap-3.5 rounded-[26px] transition-all duration-200 hover:brightness-95`}
       onClick={onClick}
-      onKeyDown={(e) => {
-        if (e.key === "Enter" || e.key === " ") {
-          e.preventDefault();
-          onClick();
-        }
-      }}
       data-testid="creator-card"
-      role="button"
-      tabIndex={0}
-      aria-label={`View ${creatorName}'s profile - ${agentsUploaded} agents`}
     >
       <div className="relative h-[64px] w-[64px]">
         <div className="absolute inset-0 overflow-hidden rounded-full">

View File

@@ -1,7 +1,5 @@
 "use client";
-import { FadeIn } from "@/components/molecules/FadeIn/FadeIn";
-import { StaggeredList } from "@/components/molecules/StaggeredList/StaggeredList";
 import { CreatorCard } from "../CreatorCard/CreatorCard";
 import { useFeaturedCreators } from "./useFeaturedCreators";
 import { Creator } from "@/app/api/__generated__/models/creator";
@@ -21,17 +19,11 @@ export const FeaturedCreators = ({
   return (
     <div className="flex w-full flex-col items-center justify-center">
       <div className="w-full max-w-[1360px]">
-        <FadeIn direction="left" duration={0.5}>
-          <h2 className="mb-9 font-poppins text-lg font-semibold text-neutral-800 dark:text-neutral-200">
-            {title}
-          </h2>
-        </FadeIn>
-        <StaggeredList
-          direction="up"
-          staggerDelay={0.1}
-          className="grid grid-cols-1 gap-6 md:grid-cols-2 lg:grid-cols-4"
-        >
+        <h2 className="mb-9 font-poppins text-lg font-semibold text-neutral-800 dark:text-neutral-200">
+          {title}
+        </h2>
+        <div className="grid grid-cols-1 gap-6 md:grid-cols-2 lg:grid-cols-4">
           {displayedCreators.map((creator, index) => (
             <CreatorCard
               key={index}
@@ -43,7 +35,7 @@ export const FeaturedCreators = ({
               index={index}
             />
           ))}
-        </StaggeredList>
+        </div>
       </div>
     </div>
   );

View File

@@ -8,7 +8,6 @@ import {
   CarouselNext,
   CarouselIndicator,
 } from "@/components/__legacy__/ui/carousel";
-import { FadeIn } from "@/components/molecules/FadeIn/FadeIn";
 import Link from "next/link";
 import { useFeaturedSection } from "./useFeaturedSection";
 import { StoreAgent } from "@/app/api/__generated__/models/storeAgent";
@@ -26,44 +25,40 @@ export const FeaturedSection = ({ featuredAgents }: FeaturedSectionProps) => {
   return (
     <section className="w-full">
-      <FadeIn direction="left" duration={0.5}>
-        <h2 className="mb-8 font-poppins text-2xl font-semibold leading-7 text-neutral-800 dark:text-neutral-200">
-          Featured agents
-        </h2>
-      </FadeIn>
-      <FadeIn direction="up" duration={0.6} delay={0.1}>
-        <Carousel
-          opts={{
-            align: "center",
-            containScroll: "trimSnaps",
-          }}
-        >
-          <CarouselContent>
-            {featuredAgents.map((agent, index) => (
-              <CarouselItem
-                key={index}
-                className="h-[480px] md:basis-1/2 lg:basis-1/3"
-              >
-                <Link
-                  href={`/marketplace/agent/${encodeURIComponent(agent.creator)}/${encodeURIComponent(agent.slug)}`}
-                  className="block h-full"
-                >
-                  <FeaturedAgentCard
-                    agent={agent}
-                    backgroundColor={getBackgroundColor(index)}
-                  />
-                </Link>
-              </CarouselItem>
-            ))}
-          </CarouselContent>
-          <div className="relative mt-4">
-            <CarouselIndicator />
-            <CarouselPrevious afterClick={handlePrevSlide} />
-            <CarouselNext afterClick={handleNextSlide} />
-          </div>
-        </Carousel>
-      </FadeIn>
+      <h2 className="mb-8 font-poppins text-2xl font-semibold leading-7 text-neutral-800 dark:text-neutral-200">
+        Featured agents
+      </h2>
+      <Carousel
+        opts={{
+          align: "center",
+          containScroll: "trimSnaps",
+        }}
+      >
+        <CarouselContent>
+          {featuredAgents.map((agent, index) => (
+            <CarouselItem
+              key={index}
+              className="h-[480px] md:basis-1/2 lg:basis-1/3"
+            >
+              <Link
+                href={`/marketplace/agent/${encodeURIComponent(agent.creator)}/${encodeURIComponent(agent.slug)}`}
+                className="block h-full"
+              >
+                <FeaturedAgentCard
+                  agent={agent}
+                  backgroundColor={getBackgroundColor(index)}
+                />
+              </Link>
+            </CarouselItem>
+          ))}
+        </CarouselContent>
+        <div className="relative mt-4">
+          <CarouselIndicator />
+          <CarouselPrevious afterClick={handlePrevSlide} />
+          <CarouselNext afterClick={handleNextSlide} />
+        </div>
+      </Carousel>
     </section>
   );
 };

View File

@@ -1,6 +1,6 @@
 "use client";
-import { FilterChip } from "@/components/atoms/FilterChip/FilterChip";
+import { Badge } from "@/components/__legacy__/ui/badge";
 import { useFilterChips } from "./useFilterChips";
 interface FilterChipsProps {
@@ -9,6 +9,8 @@ interface FilterChipsProps {
   multiSelect?: boolean;
 }
+// Some flaws in its logic
+// FRONTEND-TODO : This needs to be fixed
 export const FilterChips = ({
   badges,
   onFilterChange,
@@ -20,20 +22,18 @@ export const FilterChips = ({
   });
   return (
-    <div
-      className="flex h-auto min-h-8 flex-wrap items-center justify-center gap-3 lg:min-h-14 lg:justify-start lg:gap-5"
-      role="group"
-      aria-label="Filter options"
-    >
+    <div className="flex h-auto min-h-8 flex-wrap items-center justify-center gap-3 lg:min-h-14 lg:justify-start lg:gap-5">
       {badges.map((badge) => (
-        <FilterChip
+        <Badge
           key={badge}
-          label={badge}
-          selected={selectedFilters.includes(badge)}
+          variant={selectedFilters.includes(badge) ? "secondary" : "outline"}
+          className="mb-2 flex cursor-pointer items-center justify-center gap-2 rounded-full border border-black/50 px-3 py-1 dark:border-white/50 lg:mb-3 lg:gap-2.5 lg:px-6 lg:py-2"
           onClick={() => handleBadgeClick(badge)}
-          size="lg"
-          className="mb-2 lg:mb-3"
-        />
+        >
+          <div className="text-sm font-light tracking-tight text-[#474747] dark:text-[#e0e0e0] lg:text-xl lg:font-medium lg:leading-9">
+            {badge}
+          </div>
+        </Badge>
       ))}
     </div>
   );

View File

@@ -1,6 +1,5 @@
 "use client";
-import { FadeIn } from "@/components/molecules/FadeIn/FadeIn";
 import { FilterChips } from "../FilterChips/FilterChips";
 import { SearchBar } from "../SearchBar/SearchBar";
 import { useHeroSection } from "./useHeroSection";
@@ -10,36 +9,30 @@ export const HeroSection = () => {
   return (
     <div className="mb-2 mt-8 flex flex-col items-center justify-center px-4 sm:mb-4 sm:mt-12 sm:px-6 md:mb-6 md:mt-16 lg:my-24 lg:px-8 xl:my-16">
       <div className="w-full max-w-3xl lg:max-w-4xl xl:max-w-5xl">
-        <FadeIn direction="down" duration={0.6} delay={0}>
-          <div className="mb-4 text-center md:mb-8">
-            <h1 className="text-center">
-              <span className="font-poppins text-[48px] font-semibold leading-[54px] text-neutral-950 dark:text-neutral-50">
-                Explore AI agents built for{" "}
-              </span>
-              <span className="font-poppins text-[48px] font-semibold leading-[54px] text-violet-600">
-                you
-              </span>
-              <br />
-              <span className="font-poppins text-[48px] font-semibold leading-[54px] text-neutral-950 dark:text-neutral-50">
-                by the{" "}
-              </span>
-              <span className="font-poppins text-[48px] font-semibold leading-[54px] text-blue-500">
-                community
-              </span>
-            </h1>
-          </div>
-        </FadeIn>
-        <FadeIn direction="up" duration={0.6} delay={0.15}>
-          <h3 className="mb:text-2xl mb-6 text-center font-sans text-xl font-normal leading-loose text-neutral-700 dark:text-neutral-300 md:mb-12">
-            Bringing you AI agents designed by thinkers from around the world
-          </h3>
-        </FadeIn>
-        <FadeIn direction="up" duration={0.5} delay={0.3}>
-          <div className="mb-4 flex justify-center sm:mb-5">
-            <SearchBar height="h-[74px]" />
-          </div>
-        </FadeIn>
-        <FadeIn direction="up" duration={0.5} delay={0.4}>
+        <div className="mb-4 text-center md:mb-8">
+          <h1 className="text-center">
+            <span className="font-poppins text-[48px] font-semibold leading-[54px] text-neutral-950 dark:text-neutral-50">
+              Explore AI agents built for{" "}
+            </span>
+            <span className="font-poppins text-[48px] font-semibold leading-[54px] text-violet-600">
+              you
+            </span>
+            <br />
+            <span className="font-poppins text-[48px] font-semibold leading-[54px] text-neutral-950 dark:text-neutral-50">
+              by the{" "}
+            </span>
+            <span className="font-poppins text-[48px] font-semibold leading-[54px] text-blue-500">
+              community
+            </span>
+          </h1>
+        </div>
+        <h3 className="mb:text-2xl mb-6 text-center font-sans text-xl font-normal leading-loose text-neutral-700 dark:text-neutral-300 md:mb-12">
+          Bringing you AI agents designed by thinkers from around the world
+        </h3>
+        <div className="mb-4 flex justify-center sm:mb-5">
+          <SearchBar height="h-[74px]" />
+        </div>
+        <div>
           <div className="flex justify-center">
             <FilterChips
               badges={searchTerms}
@@ -47,7 +40,7 @@ export const HeroSection = () => {
               multiSelect={false}
             />
           </div>
-        </FadeIn>
+        </div>
       </div>
     </div>
); );

View File

@@ -1,6 +1,5 @@
 "use client";
-import { Separator } from "@/components/atoms/Separator/Separator";
-import { FadeIn } from "@/components/molecules/FadeIn/FadeIn";
+import { Separator } from "@/components/__legacy__/ui/separator";
 import { FeaturedSection } from "../FeaturedSection/FeaturedSection";
 import { BecomeACreator } from "../BecomeACreator/BecomeACreator";
 import { HeroSection } from "../HeroSection/HeroSection";
@@ -55,13 +54,11 @@ export const MainMarkeplacePage = () => {
         <FeaturedCreators featuredCreators={featuredCreators.creators} />
       )}
       <Separator className="mb-[25px] mt-[60px]" />
-      <FadeIn direction="up" duration={0.6}>
-        <BecomeACreator
-          title="Become a Creator"
-          description="Join our ever-growing community of hackers and tinkerers"
-          buttonText="Become a Creator"
-        />
-      </FadeIn>
+      <BecomeACreator
+        title="Become a Creator"
+        description="Join our ever-growing community of hackers and tinkerers"
+        buttonText="Become a Creator"
+      />
     </main>
   </div>
 );

View File

@@ -16,9 +16,9 @@ interface SearchBarProps {
 export const SearchBar = ({
   placeholder = 'Search for tasks like "optimise SEO"',
   backgroundColor = "bg-neutral-100 dark:bg-neutral-800",
-  iconColor = "text-neutral-500 dark:text-neutral-400",
-  textColor = "text-neutral-500 dark:text-neutral-200",
-  placeholderColor = "text-neutral-500 dark:text-neutral-400",
+  iconColor = "text-[#646464] dark:text-neutral-400",
+  textColor = "text-[#707070] dark:text-neutral-200",
+  placeholderColor = "text-[#707070] dark:text-neutral-400",
   width = "w-9/10 lg:w-[56.25rem]",
   height = "h-[60px]",
 }: SearchBarProps) => {
@@ -32,13 +32,10 @@ export const SearchBar = ({
     >
       <MagnifyingGlassIcon className={`h-5 w-5 md:h-7 md:w-7 ${iconColor}`} />
       <input
-        type="search"
-        name="search"
-        autoComplete="off"
+        type="text"
         value={searchQuery}
         onChange={(e) => setSearchQuery(e.target.value)}
         placeholder={placeholder}
-        aria-label="Search for AI agents"
         className={`flex-grow border-none bg-transparent ${textColor} font-sans text-lg font-normal leading-[2.25rem] tracking-tight md:text-xl placeholder:${placeholderColor} focus:outline-none`}
         data-testid="store-search-input"
       />

View File

@@ -1,25 +1,10 @@
 import Image from "next/image";
-import { Star } from "@phosphor-icons/react";
+import { StarRatingIcons } from "@/components/__legacy__/ui/icons";
 import Avatar, {
   AvatarFallback,
   AvatarImage,
 } from "@/components/atoms/Avatar/Avatar";
-function StarRating({ rating }: { rating: number }) {
-  const stars = [];
-  const clampedRating = Math.max(0, Math.min(5, rating));
-  for (let i = 1; i <= 5; i++) {
-    stars.push(
-      <Star
-        key={i}
-        weight={i <= clampedRating ? "fill" : "regular"}
-        className="h-4 w-4 text-neutral-900 dark:text-yellow-500"
-      />,
-    );
-  }
-  return <>{stars}</>;
-}
 interface StoreCardProps {
   agentName: string;
   agentImage: string;
@@ -49,7 +34,7 @@ export const StoreCard: React.FC<StoreCardProps> = ({
   return (
     <div
-      className="flex h-[27rem] w-full max-w-md cursor-pointer flex-col items-start rounded-3xl bg-background transition-shadow duration-300 hover:shadow-lg focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-neutral-950 focus-visible:ring-offset-2 dark:hover:shadow-gray-700 dark:focus-visible:ring-neutral-50"
+      className="flex h-[27rem] w-full max-w-md cursor-pointer flex-col items-start rounded-3xl bg-background transition-all duration-300 hover:shadow-lg dark:hover:shadow-gray-700"
       onClick={handleClick}
       data-testid="store-card"
       role="button"
@@ -91,7 +76,7 @@ export const StoreCard: React.FC<StoreCardProps> = ({
       <div className="mt-3 flex w-full flex-1 flex-col px-4">
         {/* Second Section: Agent Name and Creator Name */}
         <div className="flex w-full flex-col">
-          <h3 className="line-clamp-2 font-poppins text-2xl font-semibold text-neutral-800 dark:text-neutral-100">
+          <h3 className="line-clamp-2 font-poppins text-2xl font-semibold text-[#272727] dark:text-neutral-100">
             {agentName}
           </h3>
           {!hideAvatar && creatorName && (
@@ -122,11 +107,11 @@ export const StoreCard: React.FC<StoreCardProps> = ({
             {rating.toFixed(1)}
           </span>
           <div
-            className="inline-flex items-center gap-0.5"
+            className="inline-flex items-center"
             role="img"
             aria-label={`Rating: ${rating.toFixed(1)} out of 5 stars`}
           >
-            <StarRating rating={rating} />
+            {StarRatingIcons(rating)}
           </div>
         </div>
       </div>

View File

@@ -3,14 +3,12 @@
 import { useGetV2GetUserProfile } from "@/app/api/__generated__/endpoints/store/store";
 import { ProfileInfoForm } from "@/components/__legacy__/ProfileInfoForm";
 import { ErrorCard } from "@/components/molecules/ErrorCard/ErrorCard";
-import { isLogoutInProgress } from "@/lib/autogpt-server-api/helpers";
 import { ProfileDetails } from "@/lib/autogpt-server-api/types";
 import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
 import { ProfileLoading } from "./ProfileLoading";
 export default function UserProfilePage() {
   const { user } = useSupabase();
-  const logoutInProgress = isLogoutInProgress();
   const {
     data: profile,
@@ -20,7 +18,7 @@ export default function UserProfilePage() {
     refetch,
   } = useGetV2GetUserProfile<ProfileDetails | null>({
     query: {
-      enabled: !!user && !logoutInProgress,
+      enabled: !!user,
       select: (res) => {
         if (res.status === 200) {
           return {

View File

@@ -1,6 +1,5 @@
 "use server";
-import { getHomepageRoute } from "@/lib/constants";
 import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
 import { signupFormSchema } from "@/types/auth";
 import * as Sentry from "@sentry/nextjs";
@@ -12,7 +11,6 @@ export async function signup(
   password: string,
   confirmPassword: string,
   agreeToTerms: boolean,
-  isChatEnabled: boolean,
 ) {
   try {
     const parsed = signupFormSchema.safeParse({
@@ -60,9 +58,7 @@ export async function signup(
     }
     const isOnboardingEnabled = await shouldShowOnboarding();
-    const next = isOnboardingEnabled
-      ? "/onboarding"
-      : getHomepageRoute(isChatEnabled);
+    const next = isOnboardingEnabled ? "/onboarding" : "/";
     return { success: true, next };
   } catch (err) {

View File

@@ -1,8 +1,6 @@
 import { useToast } from "@/components/molecules/Toast/use-toast";
-import { getHomepageRoute } from "@/lib/constants";
 import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
 import { environment } from "@/services/environment";
-import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
 import { LoginProvider, signupFormSchema } from "@/types/auth";
 import { zodResolver } from "@hookform/resolvers/zod";
 import { useRouter, useSearchParams } from "next/navigation";
@@ -22,17 +20,15 @@ export function useSignupPage() {
   const [isGoogleLoading, setIsGoogleLoading] = useState(false);
   const [showNotAllowedModal, setShowNotAllowedModal] = useState(false);
   const isCloudEnv = environment.isCloud();
-  const isChatEnabled = useGetFlag(Flag.CHAT);
-  const homepageRoute = getHomepageRoute(isChatEnabled);
   // Get redirect destination from 'next' query parameter
   const nextUrl = searchParams.get("next");
   useEffect(() => {
     if (isLoggedIn && !isSigningUp) {
-      router.push(nextUrl || homepageRoute);
+      router.push(nextUrl || "/marketplace");
     }
-  }, [homepageRoute, isLoggedIn, isSigningUp, nextUrl, router]);
+  }, [isLoggedIn, isSigningUp, nextUrl, router]);
   const form = useForm<z.infer<typeof signupFormSchema>>({
     resolver: zodResolver(signupFormSchema),
@@ -108,7 +104,6 @@ export function useSignupPage() {
       data.password,
       data.confirmPassword,
       data.agreeToTerms,
-      isChatEnabled === true,
     );
     setIsLoading(false);
@@ -134,7 +129,7 @@ export function useSignupPage() {
     }
     // Prefer the URL's next parameter, then result.next (for onboarding), then default
-    const redirectTo = nextUrl || result.next || homepageRoute;
+    const redirectTo = nextUrl || result.next || "/";
     router.replace(redirectTo);
   } catch (error) {
     setIsLoading(false);

View File

@@ -4,12 +4,12 @@ import {
   getServerAuthToken,
 } from "@/lib/autogpt-server-api/helpers";
-import { transformDates } from "./date-transformer";
-import { environment } from "@/services/environment";
 import {
   IMPERSONATION_HEADER_NAME,
   IMPERSONATION_STORAGE_KEY,
 } from "@/lib/constants";
+import { environment } from "@/services/environment";
+import { transformDates } from "./date-transformer";
 const FRONTEND_BASE_URL =
   process.env.NEXT_PUBLIC_FRONTEND_BASE_URL || "http://localhost:3000";

View File

@@ -1022,7 +1022,7 @@
     "get": {
       "tags": ["v2", "chat", "chat"],
       "summary": "Get Session",
-      "description": "Retrieve the details of a specific chat session.\n\nLooks up a chat session by ID for the given user (if authenticated) and returns all session data including messages.\n\nArgs:\n session_id: The unique identifier for the desired chat session.\n user_id: The optional authenticated user ID, or None for anonymous access.\n\nReturns:\n SessionDetailResponse: Details for the requested session, or None if not found.",
+      "description": "Retrieve the details of a specific chat session.\n\nLooks up a chat session by ID for the given user (if authenticated) and returns all session data including messages.\n\nArgs:\n session_id: The unique identifier for the desired chat session.\n user_id: The optional authenticated user ID, or None for anonymous access.\n\nReturns:\n SessionDetailResponse: Details for the requested session; raises NotFoundError if not found.",
       "operationId": "getV2GetSession",
       "security": [{ "HTTPBearerJWT": [] }],
       "parameters": [
@@ -9425,6 +9425,12 @@
           "type": "array",
           "title": "Reviews",
           "description": "All reviews with their approval status, data, and messages"
+        },
+        "auto_approve_future_actions": {
+          "type": "boolean",
+          "title": "Auto Approve Future Actions",
+          "description": "If true, future reviews from the same blocks (nodes) being approved will be automatically approved for the remainder of this execution. This only affects the current execution run.",
+          "default": false
         }
       },
       "type": "object",

View File

@@ -141,6 +141,52 @@
   }
 }
+@keyframes shimmer {
+  0% {
+    background-position: -200% 0;
+  }
+  100% {
+    background-position: 200% 0;
+  }
+}
+
+@keyframes l3 {
+  25% {
+    background-position:
+      0 0,
+      100% 100%,
+      100% calc(100% - 5px);
+  }
+  50% {
+    background-position:
+      0 100%,
+      100% 100%,
+      0 calc(100% - 5px);
+  }
+  75% {
+    background-position:
+      0 100%,
+      100% 0,
+      100% 5px;
+  }
+}
+
+.loader {
+  width: 80px;
+  height: 70px;
+  border: 5px solid rgb(241 245 249);
+  padding: 0 8px;
+  box-sizing: border-box;
+  background:
+    linear-gradient(rgb(15 23 42) 0 0) 0 0/8px 20px,
+    linear-gradient(rgb(15 23 42) 0 0) 100% 0/8px 20px,
+    radial-gradient(farthest-side, rgb(15 23 42) 90%, #0000) 0 5px/8px 8px
+      content-box,
+    transparent;
+  background-repeat: no-repeat;
+  animation: l3 2s infinite linear;
+}
 input[type="number"]::-webkit-outer-spin-button,
 input[type="number"]::-webkit-inner-spin-button {
   -webkit-appearance: none;

View File

@@ -1,27 +1,5 @@
-"use client";
-import { getHomepageRoute } from "@/lib/constants";
-import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
-import { useRouter } from "next/navigation";
-import { useEffect } from "react";
+import { redirect } from "next/navigation";
 export default function Page() {
-  const isChatEnabled = useGetFlag(Flag.CHAT);
-  const router = useRouter();
-  const homepageRoute = getHomepageRoute(isChatEnabled);
-  const envEnabled = process.env.NEXT_PUBLIC_LAUNCHDARKLY_ENABLED === "true";
-  const clientId = process.env.NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID;
-  const isLaunchDarklyConfigured = envEnabled && Boolean(clientId);
-  const isFlagReady =
-    !isLaunchDarklyConfigured || typeof isChatEnabled === "boolean";
-  useEffect(
-    function redirectToHomepage() {
-      if (!isFlagReady) return;
-      router.replace(homepageRoute);
-    },
-    [homepageRoute, isFlagReady, router],
-  );
-  return null;
+  redirect("/marketplace");
 }

View File

@@ -0,0 +1,81 @@
// import { render, screen } from "@testing-library/react";
// import { describe, expect, it } from "vitest";
// import { Badge } from "./Badge";
// describe("Badge Component", () => {
// it("renders badge with content", () => {
// render(<Badge variant="success">Success</Badge>);
// expect(screen.getByText("Success")).toBeInTheDocument();
// });
// it("applies correct variant styles", () => {
// const { rerender } = render(<Badge variant="success">Success</Badge>);
// let badge = screen.getByText("Success");
// expect(badge).toHaveClass("bg-green-100", "text-green-800");
// rerender(<Badge variant="error">Error</Badge>);
// badge = screen.getByText("Error");
// expect(badge).toHaveClass("bg-red-100", "text-red-800");
// rerender(<Badge variant="info">Info</Badge>);
// badge = screen.getByText("Info");
// expect(badge).toHaveClass("bg-slate-100", "text-slate-800");
// });
// it("applies custom className", () => {
// render(
// <Badge variant="success" className="custom-class">
// Success
// </Badge>,
// );
// const badge = screen.getByText("Success");
// expect(badge).toHaveClass("custom-class");
// });
// it("renders as span element", () => {
// render(<Badge variant="success">Success</Badge>);
// const badge = screen.getByText("Success");
// expect(badge.tagName).toBe("SPAN");
// });
// it("renders children correctly", () => {
// render(
// <Badge variant="success">
// <span>Custom</span> Content
// </Badge>,
// );
// expect(screen.getByText("Custom")).toBeInTheDocument();
// expect(screen.getByText("Content")).toBeInTheDocument();
// });
// it("supports all badge variants", () => {
// const variants = ["success", "error", "info"] as const;
// variants.forEach((variant) => {
// const { unmount } = render(
// <Badge variant={variant} data-testid={`badge-${variant}`}>
// {variant}
// </Badge>,
// );
// expect(screen.getByTestId(`badge-${variant}`)).toBeInTheDocument();
// unmount();
// });
// });
// it("handles long text content", () => {
// render(
// <Badge variant="info">
// Very long text that should be handled properly by the component
// </Badge>,
// );
// const badge = screen.getByText(/Very long text/);
// expect(badge).toBeInTheDocument();
// expect(badge).toHaveClass("overflow-hidden", "text-ellipsis");
// });
// });

View File

@@ -1,151 +0,0 @@
import type { Meta, StoryObj } from "@storybook/react";
import { useState } from "react";
import { FilterChip } from "./FilterChip";
const meta: Meta<typeof FilterChip> = {
title: "Atoms/FilterChip",
component: FilterChip,
tags: ["autodocs"],
parameters: {
layout: "centered",
},
argTypes: {
size: {
control: "select",
options: ["sm", "md", "lg"],
},
},
};
export default meta;
type Story = StoryObj<typeof FilterChip>;
export const Default: Story = {
args: {
label: "Marketing",
},
};
export const Selected: Story = {
args: {
label: "Marketing",
selected: true,
},
};
export const Dismissible: Story = {
args: {
label: "Marketing",
selected: true,
dismissible: true,
},
};
export const Sizes: Story = {
render: () => (
<div className="flex items-center gap-4">
<FilterChip label="Small" size="sm" />
<FilterChip label="Medium" size="md" />
<FilterChip label="Large" size="lg" />
</div>
),
};
export const Disabled: Story = {
args: {
label: "Disabled",
disabled: true,
},
};
function FilterChipGroupDemo() {
const filters = [
"Marketing",
"Sales",
"Development",
"Design",
"Research",
"Analytics",
];
const [selected, setSelected] = useState<string[]>(["Marketing"]);
function handleToggle(filter: string) {
setSelected((prev) =>
prev.includes(filter)
? prev.filter((f) => f !== filter)
: [...prev, filter],
);
}
return (
<div className="flex flex-wrap gap-3">
{filters.map((filter) => (
<FilterChip
key={filter}
label={filter}
selected={selected.includes(filter)}
onClick={() => handleToggle(filter)}
/>
))}
</div>
);
}
export const FilterGroup: Story = {
render: () => <FilterChipGroupDemo />,
};
function SingleSelectDemo() {
const filters = ["All", "Featured", "Popular", "New"];
const [selected, setSelected] = useState("All");
return (
<div className="flex flex-wrap gap-3">
{filters.map((filter) => (
<FilterChip
key={filter}
label={filter}
selected={selected === filter}
onClick={() => setSelected(filter)}
/>
))}
</div>
);
}
export const SingleSelect: Story = {
render: () => <SingleSelectDemo />,
};
function DismissibleDemo() {
const [filters, setFilters] = useState([
"Marketing",
"Sales",
"Development",
]);
function handleDismiss(filter: string) {
setFilters((prev) => prev.filter((f) => f !== filter));
}
return (
<div className="flex flex-wrap gap-3">
{filters.map((filter) => (
<FilterChip
key={filter}
label={filter}
selected
dismissible
onDismiss={() => handleDismiss(filter)}
/>
))}
{filters.length === 0 && (
<span className="text-neutral-500">No filters selected</span>
)}
</div>
);
}
export const DismissibleGroup: Story = {
render: () => <DismissibleDemo />,
};

Some files were not shown because too many files have changed in this diff Show More