Today, we’re releasing Flows AI - a lightweight and minimal library for building agent workflows on top of Vercel AI SDK.
The Inspiration Behind Flows AI
The world of AI libraries often begins in Python. Yet despite JavaScript's widespread adoption across cloud platforms, its ecosystem still lacks a well-established, comparable solution for constructing agentic workflows. While libraries like Vercel AI SDK provide a great foundation, many solutions for orchestrating AI agents feel either overly complicated or underwhelming.
We ran down this rabbit hole last year with Fabrice - our first attempt at figuring out agentic systems at scale. The goal for Fabrice was simple: take a workflow description in plain English and, using the available agents, plan and execute the work autonomously.
This abstraction showed two weak points when taken to production: non-deterministic behavior was hard to control in more complex workflows, and it did not leverage existing tools, forcing you to design your system around it.
Flows AI is designed to fill this gap, providing a minimalistic and functional mechanism to orchestrate AI agents. It is compatible with any LLM provider and SDK, and comes with everything you need to build your first workflow - simple or complex.
Getting Started
It is available on npm today:
npm install flows-ai --save
Here is everything you need to know.
Core Concepts: Flow and Agent
At its core, Flows AI treats agents as simple async functions that take an input and return an output. This design makes no assumptions about what happens inside each agent, providing maximum flexibility in your choice of tools and frameworks.
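Conceptually, that contract boils down to something like this (a sketch of the idea rather than the library's exact type definition):

```ts
// An agent: an async function from input to output.
// In practice, flows-ai may pass additional context alongside the input.
type Agent = (input: string) => Promise<string>
```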
An agent, together with its input, forms a flow - the fundamental unit of orchestration. A flow defines the operation to execute (via the agent) and the data it processes (via the input).
The shape of the input is specific to the agent. Most of the time, it is going to be a prompt that includes instructions for a given agent.
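For instance, a minimal flow asking a weather agent about a single city could look like this (the agent name and prompt are just for illustration):

```ts
const weatherFlow = {
  agent: 'weatherAgent',
  input: 'What is the weather in London right now?',
}
```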
In this particular example, we could implement the weather agent as follows:
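One way to do it with Vercel AI SDK directly (a sketch - the model, the getWeather tool, and the exact agent signature are illustrative assumptions):

```ts
import { generateText, tool } from 'ai'
import { openai } from '@ai-sdk/openai'
import { z } from 'zod'

// Illustrative stand-in for a real weather API call.
async function fetchWeather(city: string) {
  return { city, tempC: 18, conditions: 'partly cloudy' }
}

// A hand-rolled agent: an async function that prompts the model,
// lets it call a weather tool, and returns plain text.
const weatherAgent = async ({ input }: { input: string }) => {
  const response = await generateText({
    model: openai('gpt-4o-mini'),
    prompt: input,
    tools: {
      getWeather: tool({
        description: 'Get the current weather for a city',
        parameters: z.object({ city: z.string() }),
        execute: async ({ city }) => fetchWeather(city),
      }),
    },
    maxSteps: 10,
  })
  return response.text
}
```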
Since we find ourselves prompting LLMs with Vercel AI SDK most of the time, we created a convenience helper to make it a bit easier:
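With it, the same agent shrinks to the following (the model and system prompt are placeholders):

```ts
import { agent } from 'flows-ai'
import { openai } from '@ai-sdk/openai'

// Same idea as the hand-rolled version above, minus the boilerplate.
const weatherAgent = agent({
  model: openai('gpt-4o-mini'),
  system: 'You are a weather agent. Use your tools to look up the forecast.',
})
```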
It takes all the same properties as Vercel AI SDK, with the only difference being maxSteps set to 10 as a sane default, so it acts more like an agent.
Executing Your Flow
All it takes is a simple function call to execute your flow:
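Roughly like this, reusing the weatherFlow and weatherAgent defined above:

```ts
import { execute } from 'flows-ai'

const result = await execute(weatherFlow, {
  agents: {
    weatherAgent,
  },
})

console.log(result)
```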
Since the flow definition is a fully serializable object, it references agents by name (as strings) rather than containing them as functions. Before your flow gets executed, we first hydrate it, which is why you must pass a key-value map of the available agents.
Executing the function takes a few more options, such as onFlowStart, which can be especially useful while debugging. Here's more about the options.
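For example, logging every sub-flow as it starts (the exact callback payload is an assumption - see the options reference for details):

```ts
await execute(weatherFlow, {
  agents: { weatherAgent },
  // Called whenever a (sub-)flow starts - handy for tracing execution.
  onFlowStart: (flow) => {
    console.log('Starting flow:', flow)
  },
})
```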
Controlling Flow With Built-In Agents
We now know how to define and run a simple flow with an agent of our choice. If that were enough, however, we could stick to Vercel AI SDK and call it a day.
Let’s now have a look at how we can run multiple agents in parallel.
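A sketch, assuming the built-in flow helpers are imported from flows-ai/flows (the cities are just an example):

```ts
import { parallel } from 'flows-ai/flows'

const weatherComparisonFlow = parallel([
  { agent: 'weatherAgent', input: 'What is the weather in London?' },
  { agent: 'weatherAgent', input: 'What is the weather in Wrocław?' },
  { agent: 'weatherAgent', input: 'What is the weather in Tokyo?' },
])
```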
In this example, we're using one of the built-in control flow agents, which we modeled after Anthropic's article on building effective agents. In this case, all sub-flows run in parallel.
As mentioned at the beginning, flows can also be composed to form more advanced flows.
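For example, we can wrap the parallel step from above in a sequence (a sketch - travelAgent is an illustrative second agent you would register alongside weatherAgent):

```ts
import { sequence } from 'flows-ai/flows'

const bestCityFlow = sequence([
  // Step 1: gather the weather reports in parallel (defined above).
  weatherComparisonFlow,
  // Step 2: hand the combined results to another agent to pick a winner.
  {
    agent: 'travelAgent',
    input: 'Based on the weather reports, pick the best city for a weekend trip.',
  },
])
```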
In this case, we pick the best city based on the weather checks we previously ran in parallel.
You can learn more about this and other flows in the documentation.
Future
The AI space is moving quickly, and new libraries appear every day exploring new ways of building systems that involve AI agents. This one is no different.
First and foremost, we would like to get your feedback on whether our core principles and design decisions resonate well with the way you think about agentic systems.
Then, we would like to hear whether the built-in features cover a wide enough spectrum of the most common use cases when building systems like this.
Overall, it's a very exciting time right now, and if you're reading this - congrats, you're on the bleeding edge, exploring how to get AI agents to scale! Let's enjoy the journey as we get there.