Commit

Typo
aymeric-roucher committed Sep 9, 2024
1 parent 23be72c commit 9b66af0
Showing 1 changed file with 35 additions and 35 deletions.
70 changes: 35 additions & 35 deletions notebooks/en/multiagent_web_assistant.ipynb
@@ -14,20 +14,21 @@
"It will be a simple hierarchy, using a `ManagedAgent` object to wrap the managed web search agent:\n",
"\n",
"```\n",
" +----------------+\n",
" | Manager agent |\n",
" +----------------+\n",
" |\n",
" ________|_________________\n",
" | |\n",
" Code interpreter +----------------------+\n",
" tool | Managed agent |\n",
" | +------------------+ |\n",
" | | Web Search agent | |\n",
" | +------------------+ |\n",
" | | |\n",
" | Web Search tool |\n",
" +----------------------+\n",
" +----------------+\n",
" | Manager agent |\n",
" +----------------+\n",
" |\n",
" _______________|______________\n",
" | |\n",
" Code interpreter +--------------------------------+\n",
" tool | Managed agent |\n",
" | +------------------+ |\n",
" | | Web Search agent | |\n",
" | +------------------+ |\n",
" | | | |\n",
" | Web Search tool | |\n",
" | Visit webpage tool |\n",
" +--------------------------------+\n",
"```\n",
"Let's set up this system. \n",
"\n",
@@ -82,24 +83,16 @@
"For web browsing, we can already use our pre-existing [`DuckDuckGoSearchTool`](https://github.com/huggingface/transformers/blob/main/src/transformers/agents/search.py) tool to provide a Google search equivalent.\n",
"\n",
"But then we will also need to be able to peak into page found by the `DuckDuckGoSearchTool`.\n",
"To do so, we could import the library's built-in `VisitWebpageTool`, but we will build it again to see how it's done.\n",
"\n",
"So for this, let's create a new tool using `markdownify`."
"So let's create our `VisitWebpageTool` tool from scratch using `markdownify`."
]
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.\n",
"None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.\n"
]
}
],
"outputs": [],
"source": [
"from transformers import Tool\n",
"import requests\n",
@@ -108,9 +101,9 @@
"import re\n",
"\n",
"\n",
"class VisitPageTool(Tool):\n",
"class VisitWebpageTool(Tool):\n",
" name = \"visit_webpage\"\n",
" description = \"Visits a wbepage at the given url and returns its content as a markdown string.\"\n",
" description = \"Visits a webpage at the given url and returns its content as a markdown string.\"\n",
" inputs = {\n",
" \"url\": {\n",
" \"type\": \"text\",\n",
Expand Down Expand Up @@ -476,7 +469,7 @@
}
],
"source": [
"visit_page_tool = VisitPageTool()\n",
"visit_page_tool = VisitWebpageTool()\n",
"\n",
"print(visit_page_tool(\"https://en.wikipedia.org/wiki/Hugging_Face\"))"
]
@@ -487,11 +480,11 @@
"source": [
"## Build our multi-agent system 🤖🤝🤖\n",
"\n",
"First, we create the web agent, with our two web browsing tools : `search` and `visit_page`.\n",
"Now that we have all the tools `search` and `visit_webpage`, we create use them to create the web agent.\n",
"\n",
"Which configuration to choose for this one?\n",
"- We make it a `ReactJsonAgent`, since web browsing is a single-timeline task that does not require parallel tool calls, so JSON tool calling works well for that.\n",
"- Also, since sometimes web search requires exploring many pages before finding the correct answer, we prefer to increase the number of `max_iterations`"
"Which configuration to choose for this agent?\n",
"- Web browsing is a single-timeline task that does not require parallel tool calls, so JSON tool calling works well for that. We thus choose a `ReactJsonAgent`.\n",
"- Also, since sometimes web search requires exploring many pages before finding the correct answer, we prefer to increase the number of `max_iterations` to 10."
]
},
{
@@ -511,12 +504,19 @@
"llm_engine = HfApiEngine(model)\n",
"\n",
"web_agent = ReactJsonAgent(\n",
" tools=[DuckDuckGoSearchTool(), VisitPageTool()],\n",
" tools=[DuckDuckGoSearchTool(), VisitWebpageTool()],\n",
" llm_engine=llm_engine,\n",
" max_iterations=10,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We then wrap this agent into a `ManagedAgent` that will make it callable by its manager agent."
]
},
{
"cell_type": "code",
"execution_count": 6,
@@ -706,7 +706,7 @@
"\n",
"🤔💭 One could even think of doing more complex, tree-like hierarchies, with one CEO agent handling multiple middle managers, each with several reports.\n",
"\n",
"We could even add more intermediate layers, and each one adds a bit more friction to ensure the tasks never get done... Ehm wait, no, let's stick with our simple structure."
"We could even add more intermediate layers of management, each with multiple daily meetings, lots of agile stuff with scrum masters, and each new component adds enough friction to ensure the tasks never get done... Ehm wait, no, let's stick with our simple structure."
]
}
],