Streaming and Tool Calls

CoAgents support streaming your messages and tool calls to the frontend.

If you'd like to stream messages from your CrewAI agents, you can use our CopilotKit SDK, which provides a collection of functions and utilities for interacting with the agent's state and behavior. This lets you control how messages and tool calls are emitted and streamed to the frontend.

Message Streaming

If you call the LiteLLM completion function directly from your CrewAI agent, messages are not streamed by default. To enable streaming, wrap the completion call with the copilotkit_stream function. This streams both messages and tool calls to the frontend.

from litellm import completion
from copilotkit.crewai import copilotkit_stream

# Wrapping the streamed completion call with copilotkit_stream forwards
# messages and tool calls to the frontend as they are generated.
response = copilotkit_stream(
    completion(
        model="openai/gpt-4o",
        messages=[
            {"role": "system", "content": my_prompt},
            *self.state["messages"]
        ],
        stream=True
    )
)
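
In a CrewAI agent, this call usually sits inside a flow step, with self.state carrying the conversation so far. The sketch below shows one way that can look; the ChatFlow class, the chat step, the system prompt, and the assumption that self.state["messages"] is already populated are illustrative, not part of the CopilotKit SDK.

from crewai.flow.flow import Flow, start
from litellm import completion
from copilotkit.crewai import copilotkit_stream


class ChatFlow(Flow):
    # Unstructured flow state: self.state behaves like a dict and is assumed
    # to already contain a "messages" list built from the incoming request.

    @start()
    def chat(self):
        # Stream the model's reply (and any tool calls) to the frontend.
        response = copilotkit_stream(
            completion(
                model="openai/gpt-4o",
                messages=[
                    {"role": "system", "content": "You are a helpful assistant."},
                    *self.state["messages"],
                ],
                stream=True,
            )
        )
        # Persist the assistant's reply for the next turn (assumes the
        # aggregated response follows the OpenAI/LiteLLM message shape).
        self.state["messages"].append(response.choices[0]["message"])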

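Because copilotkit_stream forwards tool calls as well as messages, you can pass OpenAI-style tool definitions to the same completion call and have the resulting tool calls streamed to the frontend. The get_weather tool below is a hypothetical example used only for illustration.

# A hypothetical tool definition in the OpenAI function-calling format.
GET_WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "Name of the city"}
            },
            "required": ["city"],
        },
    },
}

response = copilotkit_stream(
    completion(
        model="openai/gpt-4o",
        messages=[
            {"role": "system", "content": my_prompt},
            *self.state["messages"]
        ],
        tools=[GET_WEATHER_TOOL],  # tool calls are streamed to the frontend too
        stream=True
    )
)
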
For more information on how tool calls are used on the frontend, check out our frontend actions documentation.
