Building a Chatbot with Cohere


As its name implies, the Chat endpoint enables developers to build chatbots that can handle conversations. At the core of a conversation is a multi-turn dialog between the user and the chatbot. This requires the chatbot to keep a record (or “memory”) of all previous turns in order to maintain the state of the conversation.

In this tutorial, you’ll learn about:

  • Sending messages to the model
  • Crafting a system message
  • Maintaining conversation state

You’ll learn these by building an onboarding assistant for new hires.

Setup

To get started, first we need to install the cohere library and create a Cohere client.

PYTHON
# pip install cohere

import cohere

# Get your free API key: https://dashboard.cohere.com/api-keys
co = cohere.ClientV2(api_key="COHERE_API_KEY")

Sending messages to the model

We will use the Cohere Chat API to send messages and generate responses from the model. The required inputs to the Chat endpoint are model (the model name) and messages (a list of messages in chronological order). In the example below, we send a single message to the model command-a-03-2025:

PYTHON
response = co.chat(
    model="command-a-03-2025",
    messages=[
        {
            "role": "user",
            "content": "I'm joining a new startup called Co1t today. Could you help me write a short introduction message to my teammates.",
        },
    ],
)

print(response.message)

Notice that in addition to the message “content”, there is also a “role” field. Messages with the role “user” represent prompts from the user interacting with the chatbot. Responses from the model will always have a message with the role “assistant”. Below is the response message from the API:

{
    role='assistant',
    content=[
        {
            type='text',
            text='Absolutely! Here’s a warm and professional introduction message you can use to connect with your new teammates at Co1t:\n\n---\n\n**Subject:** Excited to Join the Co1t Team! \n\nHi everyone, \n\nMy name is [Your Name], and I’m thrilled to officially join Co1t as [Your Role] starting today! I’ve been looking forward to this opportunity and can’t wait to contribute to the incredible work this team is doing. \n\nA little about me: [Share a brief personal or professional detail, e.g., "I’ve spent the last few years working in [industry/field], and I’m passionate about [specific skill or interest]." or "Outside of work, I love [hobby or interest] and am always up for a good [book/podcast/movie] recommendation!"] \n\nI’m excited to get to know each of you, learn from your experiences, and collaborate on driving Co1t’s mission forward. Please feel free to reach out—I’d love to chat and hear more about your roles and what you’re working on. \n\nLooking forward to an amazing journey together! \n\nBest regards, \n[Your Name] \n[Your Role] \nCo1t \n\n---\n\nFeel free to customize it further to match your style and the culture of Co1t. Good luck on your first day! 🚀'
        }
    ],
}
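
If you only need the generated text rather than the full message object, you can read it from the first item in the message’s content list. This is the pattern used in the rest of this tutorial:

PYTHON
# The message role is "assistant" for model responses
print(response.message.role)

# The generated text lives in the first content item
print(response.message.content[0].text)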

Crafting a system message

When building a chatbot, it may be useful to constrain its behavior. For example, you may want to prevent the assistant from responding to certain prompts, or force it to respond in a desired tone. To achieve this, you can include a message with the role “system” in the messages array. Instructions in system messages always take precedence over instructions in user messages, so as a developer you have control over the chatbot behavior.

For example, if we want the chatbot to adopt a formal style, the system message can be used to encourage the generation of more business-like and professional responses. We can also instruct the chatbot to refuse requests that are unrelated to onboarding. When writing a system message, the recommended approach is to use two H2 Markdown headers, “Task and Context” and “Style Guide”, in that exact order.

In the example below, the system instruction provides context for the assistant’s task (task and context) and encourages the generation of rhymes as much as possible (style guide).

PYTHON
# Create a custom system instruction that guides all of the assistant's responses
system_instruction = """## Task and Context
You assist new employees of Co1t with their first week of onboarding at Co1t, a startup founded by Mr. Colt.
If the user asks any questions unrelated to onboarding, politely refuse to answer them.

## Style Guide
Try to speak in rhymes as much as possible. Be professional."""

# Send messages to the model
response = co.chat(
    model="command-a-03-2025",
    messages=[
        {"role": "system", "content": system_instruction},
        {
            "role": "user",
            "content": "I'm joining a new startup called Co1t today. Could you help me write a short introduction message to my teammates.",
        },
    ],
)

print(response.message.content[0].text)
Sure, here's a rhyme to break the ice,
A warm welcome to the team, so nice,
Hi, I'm [Your Name], a new face,
Ready to join the Co1t space,
A journey begins, a path unknown,
But together we'll make our mark, a foundation stone,
Excited to learn and contribute my part,
Let's create, innovate, and leave a lasting art,
Looking forward to our adventures yet untold,
With teamwork and passion, let's achieve our goals!
Cheers to a great start!
Your enthusiastic new mate.

Maintaining conversation state

Conversations with your chatbot will often span more than one turn. To avoid losing the context of previous turns, the entire chat history needs to be passed in the messages array on every call to the Chat API. In the example below, we keep appending “user” and “assistant” messages to the messages array to build up the chat history over multiple turns:

PYTHON
messages = [
    {"role": "system", "content": system_instruction},
]

# User turn 1
messages.append(
    {
        "role": "user",
        "content": "I'm joining a new startup called Co1t today. Could you help me write a short introduction message to my teammates.",
    },
)
response = co.chat(
    model="command-a-03-2025",
    messages=messages,
)

# Assistant turn 1
messages.append(
    response.message
)  # add the assistant message to the messages array to include it in the chat history for the next turn

# User turn 2
messages.append({"role": "user", "content": "Who founded co1t?"})

response = co.chat(
    model="command-a-03-2025",
    messages=messages,
)

# Assistant turn 2
messages.append(response.message)

print(response.message.content[0].text)
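
If you find yourself repeating this pattern, you can wrap it in a small helper. The sketch below is illustrative (the chat_turn function is not part of the Cohere SDK): it appends the user message to the history, calls the Chat endpoint with the full history, stores the assistant message back into the history, and returns the generated text.

PYTHON
# A minimal helper for running one conversational turn (illustrative, not part of the SDK)
def chat_turn(co, messages, user_message, model="command-a-03-2025"):
    # Add the user's message to the chat history
    messages.append({"role": "user", "content": user_message})
    # Generate a response using the full chat history
    response = co.chat(model=model, messages=messages)
    # Store the assistant message so the next turn keeps the full context
    messages.append(response.message)
    return response.message.content[0].text


# Example usage
history = [{"role": "system", "content": system_instruction}]
print(chat_turn(co, history, "Could you help me write a short introduction message to my teammates?"))
print(chat_turn(co, history, "Who founded Co1t?"))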

You will use the same method for running a multi-turn conversation when you learn about other use cases such as RAG (Part 6) and tool use (Part 7).

But to fully leverage these other capabilities, you will need another type of language model that generates text representations, or embeddings.

In Part 4, you will learn how text embeddings can power an important use case for RAG, which is semantic search.