Handoff termination and show how to use it for asking user input (#4128)

* Handoff termination and show how to use it for asking user input

* lint
This commit is contained in:
Eric Zhu
2024-11-10 21:38:52 -08:00
committed by GitHub
parent 9f175089c5
commit 4786f189bc
4 changed files with 204 additions and 6 deletions


@@ -49,7 +49,7 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
@@ -404,8 +404,6 @@
}
],
"source": [
"from autogen_agentchat.task import Console\n",
"\n",
"# Use `asyncio.run(Console(reflection_team.run_stream(task=\"Write a short poem about fall season.\")))` when running in a script.\n",
"await Console(\n",
" reflection_team.run_stream(task=\"Write a short poem about fall season.\")\n",
@@ -593,6 +591,141 @@
"# Use the `asyncio.run(Console(reflection_team.run_stream()))` when running in a script.\n",
"await Console(reflection_team.run_stream())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Pause for User Input\n",
"\n",
"Oftentimes, a team needs additional input from the application (i.e., the user)\n",
"to continue processing the task.\n",
"You can use the {py:class}`~autogen_agentchat.task.HandoffTermination` termination condition\n",
"to stop the team when an agent sends a {py:class}`~autogen_agentchat.messages.HandoffMessage` message.\n",
"\n",
"Let's create a team with a single {py:class}`~autogen_agentchat.agents.AssistantAgent`\n",
"configured with a handoff setting.\n",
"\n",
"```{note}\n",
"The model used with {py:class}`~autogen_agentchat.agents.AssistantAgent` must support tool calling\n",
"to use the handoff feature.\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [],
"source": [
"from autogen_agentchat.agents import AssistantAgent, Handoff\n",
"from autogen_agentchat.task import HandoffTermination, TextMentionTermination\n",
"from autogen_agentchat.teams import RoundRobinGroupChat\n",
"from autogen_ext.models import OpenAIChatCompletionClient\n",
"\n",
"# Create an OpenAI model client.\n",
"model_client = OpenAIChatCompletionClient(\n",
" model=\"gpt-4o-2024-08-06\",\n",
" # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY env variable set.\n",
")\n",
"\n",
"# Create a lazy assistant agent that always hands off to the user.\n",
"lazy_agent = AssistantAgent(\n",
" \"lazy_assistant\",\n",
" model_client=model_client,\n",
" handoffs=[Handoff(target=\"user\", message=\"Transfer to user.\")],\n",
" system_message=\"Always transfer to user when you don't know the answer. Respond 'TERMINATE' when task is complete.\",\n",
")\n",
"\n",
"# Define a termination condition that checks for a handoff message targeting the user or the text \"TERMINATE\".\n",
"handoff_termination = HandoffTermination(target=\"user\")\n",
"text_termination = TextMentionTermination(\"TERMINATE\")\n",
"termination = handoff_termination | text_termination\n",
"\n",
"# Create a single-agent team.\n",
"lazy_agent_team = RoundRobinGroupChat([lazy_agent], termination_condition=termination)"
]
},
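The `handoff_termination | text_termination` line above combines two termination conditions with OR semantics. As a minimal, hypothetical sketch (not the actual `autogen_agentchat` classes), the behavior can be illustrated with a small class whose `__or__` returns a condition that fires as soon as either sub-condition reports a stop reason:

```python
# Hypothetical sketch of OR-combined termination conditions; the message
# dicts and stop-reason strings here are illustrative, not the real API.
class TerminationCondition:
    def __init__(self, check):
        # check: message dict -> stop reason string, or None to keep going.
        self._check = check

    def __or__(self, other):
        # The combined condition stops when either sub-condition stops.
        return TerminationCondition(lambda msg: self._check(msg) or other._check(msg))

    def __call__(self, msg):
        return self._check(msg)


handoff_termination = TerminationCondition(
    lambda m: "Handoff to user detected." if m.get("target") == "user" else None
)
text_termination = TerminationCondition(
    lambda m: "Text 'TERMINATE' mentioned" if "TERMINATE" in m.get("content", "") else None
)
termination = handoff_termination | text_termination

print(termination({"target": "user", "content": ""}))  # Handoff to user detected.
print(termination({"content": "TERMINATE"}))           # Text 'TERMINATE' mentioned
print(termination({"content": "still working"}))       # None
```

With this shape, the team loop only has to consult one combined condition after each message, and the first sub-condition to fire supplies the finish reason.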
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's run the team with a task that requires additional input from the user,\n",
"because the agent lacks the tools needed to complete it on its own."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"---------- user ----------\n",
"What is the weather in New York?\n",
"---------- lazy_assistant ----------\n",
"[FunctionCall(id='call_YHm4KPjFIWZE95YrJWlJwcv4', arguments='{}', name='transfer_to_user')]\n",
"[Prompt tokens: 68, Completion tokens: 11]\n",
"---------- lazy_assistant ----------\n",
"[FunctionExecutionResult(content='Transfer to user.', call_id='call_YHm4KPjFIWZE95YrJWlJwcv4')]\n",
"---------- lazy_assistant ----------\n",
"Transfer to user.\n",
"---------- Summary ----------\n",
"Number of messages: 4\n",
"Finish reason: Handoff to user from lazy_assistant detected.\n",
"Total prompt tokens: 68\n",
"Total completion tokens: 11\n",
"Duration: 0.73 seconds\n"
]
}
],
"source": [
"from autogen_agentchat.task import Console\n",
"\n",
"# Use `asyncio.run(Console(lazy_agent_team.run_stream(task=\"What is the weather in New York?\")))` when running in a script.\n",
"await Console(lazy_agent_team.run_stream(task=\"What is the weather in New York?\"))"
]
},
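The overall pattern the two runs above demonstrate is: run the team, and if it stopped on a handoff, collect the user's reply and run again with that reply as the new task. A minimal, hypothetical sketch of that control flow (where `run_team` is a stand-in that fakes the stop reasons of `lazy_agent_team.run_stream(task=...)`):

```python
# Hypothetical pause-and-resume loop; `run_team` fakes the team's stop
# reasons rather than calling the real autogen_agentchat API.
def run_team(task):
    # Fake behavior: the initial question triggers a handoff to the user;
    # the follow-up information lets the agent finish with TERMINATE.
    if "weather" in task:
        return "Handoff to user from lazy_assistant detected."
    return "Text 'TERMINATE' mentioned"


stop_reason = run_team("What is the weather in New York?")
if stop_reason.startswith("Handoff"):
    # In a real application this would be input() or a UI prompt.
    user_reply = "It is raining in New York."
    stop_reason = run_team(user_reply)

print(stop_reason)  # Text 'TERMINATE' mentioned
```

Because the termination condition pauses rather than destroys the team, the second run continues from the conversation state left by the first.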
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can see the team stopped because a handoff message was detected.\n",
"Let's continue the team by providing the information the agent needs."
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"---------- user ----------\n",
"It is raining in New York.\n",
"---------- lazy_assistant ----------\n",
"I hope you stay dry! Is there anything else you would like to know or do?\n",
"[Prompt tokens: 108, Completion tokens: 19]\n",
"---------- lazy_assistant ----------\n",
"TERMINATE\n",
"[Prompt tokens: 134, Completion tokens: 4]\n",
"---------- Summary ----------\n",
"Number of messages: 3\n",
"Finish reason: Text 'TERMINATE' mentioned\n",
"Total prompt tokens: 242\n",
"Total completion tokens: 23\n",
"Duration: 6.77 seconds\n"
]
}
],
"source": [
"# Use `asyncio.run(Console(lazy_agent_team.run_stream(task=\"It is raining in New York.\")))` when running in a script.\n",
"await Console(lazy_agent_team.run_stream(task=\"It is raining in New York.\"))"
]
}
],
"metadata": {