This is our manifesto: this is why we build Pipelex.

First published on our blog on June 3, 2025.


The Knowledge Pipeline Manifesto

Agents are brilliant, but they're hopeless at repeatability

AI agents are getting impressive at novel tasks: research, planning, coding from scratch. But here's what we're missing: lots of knowledge work isn't novel. It's the same patterns, repeated thousands of times.

Ask an agent to review 10,000 expense reports. Watch it improvise a new approach for each one. Different reasoning paths, different output formats, different edge case handling. Run it twice on the same report? Different results. The intelligence is there, but the consistency isn't. And someone’s paying for those improv tokens.

Agents are explorers. But if you know the path, you don't need an expedition. You need a bullet train.

Enter the knowledge pipeline

The solution should be obvious: delegate the repetitive tasks to deterministic AI workflows. But here's the deeper problem: AI workflows today are handcrafted, not engineered. We prompt, tweak, test, and pray.

Without proper tools for capturing repeatable workflows, every company recreates the same solutions from scratch. The expense categorization that took your team weeks to perfect? The next company starts at zero. We're in the pre-SQL era of AI: brilliant capabilities trapped in proprietary silos, no way to reuse what actually works.

We need a way to capture proven AI workflows and make them as reliable and reusable as any other software component. We call this a knowledge pipeline.

A knowledge pipeline is built from modular components called pipes. Each pipe is a knowledge transformer with a simple contract: knowledge in, knowledge out. Unlike data pipelines that move bytes or ML pipelines that train models, knowledge pipes transform meaning.

This isn't about dumbing down AI. Each pipe leverages full AI intelligence to handle variation in content while guaranteeing its output structure. You get deterministic structure with adaptive intelligence: consistency without rigidity.
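
To make that contract concrete, here is a minimal sketch in plain Python. Every name in it (CategorizeExpense, ExpenseReport, ExpenseCategory) is an illustrative stand-in rather than Pipelex's actual API, and a deterministic rule replaces the LLM call so the example runs as written:

```python
import re
from dataclasses import dataclass

@dataclass
class ExpenseReport:
    raw_text: str      # knowledge in: messy, variable content

@dataclass
class ExpenseCategory:
    label: str         # knowledge out: guaranteed structure
    amount: float

class CategorizeExpense:
    """A pipe: knowledge in, knowledge out, the same shape on every run."""

    def run(self, report: ExpenseReport) -> ExpenseCategory:
        # In a real pipe, an LLM absorbs the variation in content here;
        # a keyword rule keeps this sketch runnable without one.
        label = "travel" if "flight" in report.raw_text.lower() else "other"
        match = re.search(r"(\d+(?:\.\d{2})?)", report.raw_text)
        amount = float(match.group(1)) if match else 0.0
        return ExpenseCategory(label=label, amount=amount)

print(CategorizeExpense().run(ExpenseReport(raw_text="Flight to Berlin, $420")))
# -> ExpenseCategory(label='travel', amount=420.0)
```

Whatever the report says, the caller always gets an ExpenseCategory back; that fixed shape is what makes the next pipe safe to connect.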

The architecture of understanding

Knowledge pipes compose like LEGO blocks for repeatable AI workflows (see the sketch after this list):

  • Connect them sequentially for step-by-step transformations
  • Run them in parallel to process multiple perspectives
  • Feed multiple outputs into a single pipe for synthesis
  • Call sub-pipes conditionally based on categorization
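
Assuming a pipe is just a function from knowledge to knowledge, those four patterns reduce to a few lines of plain Python. This is a hedged sketch of the idea, not Pipelex's actual composition syntax:

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(*pipes):
    """Step-by-step: each pipe's output feeds the next."""
    def run(knowledge):
        for pipe in pipes:
            knowledge = pipe(knowledge)
        return knowledge
    return run

def parallel(*pipes):
    """Multiple perspectives: every pipe sees the same input."""
    def run(knowledge):
        with ThreadPoolExecutor() as pool:
            return list(pool.map(lambda pipe: pipe(knowledge), pipes))
    return run

def synthesize(pipes, combiner):
    """Fan-in: several outputs feed a single synthesis pipe."""
    def run(knowledge):
        return combiner(parallel(*pipes)(knowledge))
    return run

def branch(categorize, routes):
    """Conditional: a categorization step picks the sub-pipe."""
    def run(knowledge):
        return routes[categorize(knowledge)](knowledge)
    return run

# Example wiring: clean up, categorize, then route to a specialist sub-pipe.
workflow = sequence(
    str.strip,
    branch(lambda text: "invoice" if "invoice" in text.lower() else "memo",
           {"invoice": str.upper, "memo": str.lower}),
)
print(workflow("  Invoice #42  "))  # -> INVOICE #42
```

In practice each pipe carries typed knowledge like the ExpenseCategory above rather than raw strings, but the shape of the composition stays the same.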

Why does this matter? Because real knowledge work isn't linear.

A pipe analyzing contracts might need context from step 2, results from steps 5-7, and external compliance data. The architecture mirrors how knowledge actually flows: not as a conveyor belt, but as a network of understanding.
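
As a toy illustration of that network shape, imagine pipes reading from and writing to a shared working memory keyed by name. The function names and keys below are invented for this sketch, with stubs standing in for real AI-backed steps:

```python
# Stub pipes stand in for real AI-backed steps.
def extract_context(text): return f"context({text})"
def extract_clauses(text): return [f"clause({text})"]
def fetch_compliance(): return ["rule-1", "rule-2"]

def assess_risk(context, clauses, compliance):
    # The synthesis pipe draws on several earlier outputs at once.
    return {"context": context, "clauses": clauses, "rules": compliance}

memory = {"contract": "ACME master services agreement"}
memory["context"] = extract_context(memory["contract"])    # an early step
memory["clauses"] = extract_clauses(memory["contract"])    # later steps
memory["risk"] = assess_risk(                              # final synthesis...
    memory["context"],                                     # ...pulls the early context,
    memory["clauses"],                                     # ...the middle results,
    fetch_compliance(),                                    # ...and external data
)
print(memory["risk"])
```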

The method becomes the artifact

When every pipe guarantees its output structure, you can compose them fearlessly. When each pipe uses AI intelligently, the pipeline adapts without breaking. When you capture your method as actual code, you can version it, test it, and scale it like software.
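
"Test it" can be taken literally: reusing the hypothetical CategorizeExpense pipe sketched earlier, an ordinary unit test pins down the guaranteed structure even while the content varies:

```python
# A structural regression test (pytest-style), using the illustrative
# stand-ins from the earlier sketch rather than Pipelex's API.
def test_categorize_expense_structure():
    result = CategorizeExpense().run(ExpenseReport(raw_text="Taxi to airport, $18.50"))
    assert isinstance(result, ExpenseCategory)   # the shape never changes...
    assert result.label in {"travel", "other"}   # ...even when the content does
```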

With knowledge pipelines, your process becomes something you can hold, examine, and refine. Swap out language models to find the sweet spot between cost and quality. Adjust prompts until they sing. Reconfigure the flow when you discover a shortcut. Each iteration teaches you something new because you're working with stable, measurable outcomes.
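
One hypothetical way to expose those knobs is to pull model and prompt choices into configuration, so tuning never touches the pipeline's structure; every key and model name below is a placeholder:

```python
# Illustrative configuration only: swap the model or prompt version behind
# one pipe, then re-run the same pipeline to compare cost against quality.
pipeline_config = {
    "extract_insights": {"model": "small-fast-model",    "prompt": "prompts/extract_v4.txt"},
    "synthesize":       {"model": "large-capable-model", "prompt": "prompts/synthesize_v2.txt"},
}
```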

Remember when ChatGPT launched and suddenly everyone was sharing prompts? That same collaborative instinct is waiting to be unleashed on knowledge pipelines. A pipeline for extracting customer insights gets forked for user research. A financial audit workflow becomes the skeleton for compliance reviews. The building blocks translate across industries.

These pipeline artifacts are meant to be shared, forked, and composed. Period.

The path forward

We've seen this movie before. Docker transformed deployment by giving us a simple way to package and share what used to live in scattered shell scripts and tribal knowledge. Suddenly, complex configurations became artifacts you could version, test, and run anywhere. Better yet, Docker made it possible to build on each other's work rather than starting from scratch each time.

Today's AI workflows are ready for the same breakthrough.

This is why we built Pipelex as an open-source standard for repeatable AI workflows. Not just another tool, but a foundation for the community to build on.

The manifesto is simple: When you discover a method that works, capture it as a knowledge pipeline. When you build a pipeline that roars, share it with the world!

Join us in building the knowledge infrastructure of tomorrow. One pipe at a time.