Tool Use & Agents

Tool use enables developers to build agentic applications that connect to external tools, reason over their outputs, and perform actions.

The Chat endpoint comes with built-in tool use capabilities such as function calling, multi-step reasoning, and citation generation.

This quickstart guide shows you how to build a basic tool use workflow with the Chat endpoint.

Step 1: Setup

First, install the Cohere Python SDK with the following command.

$ pip install -U cohere

Next, import the library and create a client.

PYTHON
import cohere

co = cohere.Client(
    "COHERE_API_KEY"
)  # Get your free API key here: https://dashboard.cohere.com/api-keys
Step 2: Tool Definition

First, we need to set up the tools. A tool can be any function or service that can receive and send objects.

We also need to define the tool schemas in a format that can be passed to the Chat endpoint. The schema must contain the following fields: name, description, and parameter_definitions.

PYTHON
def get_weather(location):
    # Implement your tool calling logic here
    return {"temperature": "20C"}


functions_map = {"get_weather": get_weather}

tools = [
    {
        "name": "get_weather",
        "description": "Gets the weather of a given location",
        "parameter_definitions": {
            "location": {
                "description": "The location to get weather, example: San Francisco, CA",
                "type": "str",
                "required": True,
            }
        },
    },
]
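As an aside, the schema can also be used on the application side to sanity-check generated calls before executing them. Below is a minimal sketch — `validate_call` is a hypothetical helper, not part of the Cohere SDK — that checks a call's parameters against `parameter_definitions`:

```python
tools = [
    {
        "name": "get_weather",
        "description": "Gets the weather of a given location",
        "parameter_definitions": {
            "location": {
                "description": "The location to get weather, example: San Francisco, CA",
                "type": "str",
                "required": True,
            }
        },
    },
]


def validate_call(name, parameters, tools):
    # Hypothetical helper (not part of the SDK): look up the tool's schema
    # and reject calls with missing required or unknown parameters.
    schema = next(t for t in tools if t["name"] == name)
    defs = schema["parameter_definitions"]
    missing = [p for p, d in defs.items() if d.get("required") and p not in parameters]
    unknown = [p for p in parameters if p not in defs]
    return not missing and not unknown


print(validate_call("get_weather", {"location": "Toronto"}, tools))  # True
print(validate_call("get_weather", {}, tools))  # False (missing required param)
```

This kind of guard is optional, but it turns a malformed model-generated call into an explicit error rather than a `TypeError` at execution time.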
Step 3: Tool Calling

Next, pass the tool schema to the Chat endpoint together with the user message.

The LLM will then generate tool calls (if any) and return them in the tool_calls object.

PYTHON
message = "What's the weather in Toronto?"

response = co.chat(
    model="command-r-plus-08-2024", message=message, tools=tools
)

print(response.tool_calls)

[ToolCall(name='get_weather', parameters={'location': 'Toronto'})]
Step 4: Tool Execution

Next, we execute the tools the model called, using the parameters generated in the tool calling step.

PYTHON
tool_content = []
if response.tool_calls:
    for tc in response.tool_calls:
        tool_call = {"name": tc.name, "parameters": tc.parameters}
        tool_result = functions_map[tc.name](**tc.parameters)
        tool_content.append(
            {"call": tool_call, "outputs": [tool_result]}
        )
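If the model emits several tool calls, a slightly more defensive variant of this loop can guard against unknown tool names and exceptions raised by the tools themselves. The sketch below uses plain dicts in place of the SDK's ToolCall objects, and `run_tools` is a hypothetical helper, not part of the SDK:

```python
def run_tools(tool_calls, functions_map):
    # Execute each call, recording an error output instead of raising,
    # so one failing tool doesn't abort the whole turn.
    tool_content = []
    for tc in tool_calls:
        fn = functions_map.get(tc["name"])
        if fn is None:
            outputs = [{"error": "unknown tool: " + tc["name"]}]
        else:
            try:
                outputs = [fn(**tc["parameters"])]
            except Exception as e:
                outputs = [{"error": str(e)}]
        tool_content.append({"call": tc, "outputs": outputs})
    return tool_content


def get_weather(location):
    return {"temperature": "20C"}


results = run_tools(
    [{"name": "get_weather", "parameters": {"location": "Toronto"}}],
    {"get_weather": get_weather},
)
print(results)  # one entry per call, each with its "call" and "outputs"
```

Returning errors as tool outputs lets the model see what went wrong and recover in its final response, rather than the application crashing mid-turn.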
Step 5: Response Generation

The results are passed back to the LLM, which generates the final response.

PYTHON
response = co.chat(
    model="command-r-plus-08-2024",
    message="",
    tools=tools,
    tool_results=tool_content,
    chat_history=response.chat_history,
)

print(response.text)

It is 20C in Toronto.
Step 6: Citation Generation

The response object contains a citations field, which contains specific text spans from the documents on which the response is grounded.

PYTHON
if response.citations:
    for citation in response.citations:
        print(citation, "\n")

start=6 end=9 text='20C' document_ids=['get_weather:0:2:0']
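The start and end fields are character offsets into the response text, so they can be used to render inline references. Here is a minimal sketch, using a namedtuple as a stand-in for the SDK's citation objects shown above:

```python
from collections import namedtuple

# Stand-in for the citation objects printed above; start/end are
# character offsets into the response text.
Citation = namedtuple("Citation", ["start", "end", "text", "document_ids"])


def annotate(text, citations):
    # Insert [doc_ids] markers after each cited span, working right to
    # left so earlier offsets remain valid as the text grows.
    for c in sorted(citations, key=lambda c: c.start, reverse=True):
        marker = "[" + ", ".join(c.document_ids) + "]"
        text = text[:c.end] + marker + text[c.end:]
    return text


text = "It is 20C in Toronto."
citations = [Citation(6, 9, "20C", ["get_weather:0:2:0"])]
print(annotate(text, citations))
# It is 20C[get_weather:0:2:0] in Toronto.
```

This makes the grounding visible to end users: each cited span is followed by the IDs of the tool outputs that support it.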
