DeerFlow 2.0 AI Workflow Automation | Optimized for Complex Tasks

ByteDance has introduced an updated version of DeerFlow.

DeerFlow 2.0 is a ground-up redesign that differs significantly from the first version: where the original was a framework for deep research, 2.0 is a full runtime for agents.

The core of the system is a combination of LangGraph and LangChain.

The main agent receives a task, breaks it into subtasks, and dynamically spawns sub-agents. Each runs in an isolated environment with no access to the other agents' data or to the main process.

These sub-agents run in parallel where possible and return structured results. Ultimately, the main agent compiles the final output based on the received information.
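The orchestration pattern described above can be sketched in plain Python. This is an illustrative skeleton, not DeerFlow's actual API: the function names and result shape are assumptions, and a thread pool stands in for real isolated sub-agent environments.

```python
from concurrent.futures import ThreadPoolExecutor

def run_subagent(subtask: str) -> dict:
    # In DeerFlow each sub-agent runs in isolation; here we just
    # return a structured placeholder result for the subtask.
    return {"subtask": subtask, "status": "done", "output": f"result for {subtask}"}

def orchestrate(task: str, subtasks: list[str]) -> dict:
    # Run sub-agents in parallel where possible, then let the main
    # agent compile the final output from their structured results.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(run_subagent, subtasks))
    return {"task": task, "results": results}

report = orchestrate(
    "write a market report",
    ["gather data", "analyze trends", "draft summary"],
)
```

The key property is that sub-agents only communicate upward through structured results, never with each other.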

The entire session runs inside a dedicated Docker container with a full filesystem, where both the main agent and sub-agents operate.

The agent can read and write files, execute bash commands, and process images. Sessions never bleed into one another: everything remains separated and organized.
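The per-session isolation can be illustrated with a minimal sketch. The helper below is hypothetical (DeerFlow achieves this with a dedicated Docker container per session, not this function), but it shows the core idea: each session gets its own workspace directory, so file operations from one session cannot touch another's data.

```python
import os
import tempfile

def session_workspace(root: str, session_id: str) -> str:
    # Hypothetical helper: one directory per session, created on demand.
    path = os.path.join(root, session_id)
    os.makedirs(path, exist_ok=True)
    return path

with tempfile.TemporaryDirectory() as root:
    a = session_workspace(root, "session-a")
    b = session_workspace(root, "session-b")
    # Files written under `a` are invisible to whatever runs under `b`.
```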

Skills and tools

Agent capabilities are defined through so-called Skills. The basic package includes functions for conducting research, generating reports, creating presentations, web pages, images, and videos. Skills are connected only as needed — when a specific task requires them. This helps reduce the load on the context window and simplifies working with models that are sensitive to token consumption.
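The on-demand loading described above can be sketched as a registry of skill factories: a skill is only instantiated, and only enters the context window, when a task actually requests it. The registry and factory names below are assumptions for illustration, not DeerFlow's real schema.

```python
SKILL_REGISTRY = {}

def register_skill(name):
    # Register a factory without instantiating the skill yet.
    def decorator(factory):
        SKILL_REGISTRY[name] = factory
        return factory
    return decorator

@register_skill("report")
def make_report_skill():
    return {"name": "report", "description": "generate a structured report"}

@register_skill("presentation")
def make_presentation_skill():
    return {"name": "presentation", "description": "create slide decks"}

def load_skills(required: list[str]) -> list[dict]:
    # Only the skills a task asks for are built and added to context,
    # keeping token consumption low for context-sensitive models.
    return [SKILL_REGISTRY[n]() for n in required if n in SKILL_REGISTRY]
```

A research-only task would call `load_skills(["report"])` and never pay the token cost of the presentation, image, or video skills.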

Tools operate on the same principle: a set of standard options (web search, data upload, file handling, bash), as well as support for MCP servers and any Python functions. All of this can be easily extended or replaced to suit user requirements.
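Since the article says any Python function can serve as a tool, here is a minimal sketch of what that wrapping typically looks like: a function's name and docstring become the tool's metadata that a model can reason over. The `as_tool` helper is hypothetical, not DeerFlow's actual interface.

```python
def as_tool(fn):
    # Wrap a plain Python function into a tool descriptor the agent
    # can list, describe to the model, and invoke.
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "call": fn,
    }

def word_count(text: str) -> int:
    """Count whitespace-separated words in a text."""
    return len(text.split())

tool = as_tool(word_count)
```

MCP servers follow the same contract at a distance: they expose named, described, callable operations that the agent discovers and invokes.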

Memory and context management

DeerFlow remembers users across sessions: it builds a profile covering writing style, technical stack, and recurring scenarios. All data is stored locally.

During long sessions, the system manages context size itself: as subtasks complete, their results are saved to disk. This helps prevent context-window overload.
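The offloading strategy can be sketched as follows. The class and file layout are illustrative assumptions: the point is that a completed subtask's full result goes to disk while only a short pointer line stays in the in-memory context the model sees.

```python
import json
import os
import tempfile

class ContextManager:
    """Hypothetical sketch: keep the live context small by spilling
    completed subtask results to files on disk."""

    def __init__(self, workdir: str):
        self.workdir = workdir
        self.context = []  # the compact view the model actually sees

    def complete_subtask(self, name: str, result: dict):
        # Persist the full result, keep only a one-line summary in context.
        path = os.path.join(self.workdir, f"{name}.json")
        with open(path, "w") as f:
            json.dump(result, f)
        self.context.append(f"[{name}] done, full result at {path}")

with tempfile.TemporaryDirectory() as d:
    cm = ContextManager(d)
    cm.complete_subtask("research", {"sources": 12, "notes": "..."})
```

The agent can always re-read a saved file if a later subtask needs the full detail, so nothing is lost by compacting.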

Integrations with Telegram, Slack, and Feishu are supported. Through Claude Code, users can directly interact with a running DeerFlow — send tasks, manage workflows, or switch execution modes via a dedicated skill.

Models and deployment

DeerFlow can work with any models via the OpenAI API or local solutions like Ollama. It is recommended to use models with long-context support (over 100,000 tokens), strong reasoning, multimodality, and reliable tool calling.
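Because DeerFlow targets any OpenAI-compatible endpoint, switching between a hosted model and a local Ollama instance is mostly a matter of configuration. The sketch below uses an illustrative config shape, not DeerFlow's actual settings schema; the one factual anchor is that Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default.

```python
def model_config(use_local: bool) -> dict:
    # Hypothetical config keys for illustration only.
    if use_local:
        return {
            "base_url": "http://localhost:11434/v1",  # Ollama's default endpoint
            "api_key": "ollama",   # placeholder; Ollama ignores the key
            "model": "llama3.1",   # choose a long-context, tool-capable model
        }
    return {
        "base_url": "https://api.openai.com/v1",
        "api_key": "sk-...",       # real key comes from the environment
        "model": "gpt-4o",
    }
```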

DeerFlow can also be integrated as a Python library without needing to run separate HTTP services:

```python
from src.client import DeerFlowClient

client = DeerFlowClient()
response = client.chat("Analyze this document", thread_id="my-thread")
```

The project is licensed under the MIT License.

The project is available in a public GitHub repository.

This system is ideal for automating complex AI workflows across various fields.

