Cloud Warehouse/Workshop Model: everything is a pipeline, the perfect way to achieve simplicity and unity in the software ecosystem

  • Everything is a pipeline: the perfect way to achieve simplicity and unity in the software ecosystem.

    • Pipeline combination: cascading simple pipelines forms a workshop pipeline.

    • Workshop combination:

      • Parallel independent workshop pipelines
        form warehouse/workshop model pipelines (data factory)
        through warehouse collaboration (scheduling).
      • A workshop pipeline can be used as a packaged, integrated pipeline (an integrated chip)
        to provide services independently to the outside world, like a microservice or a service industry.
    • Warehouse: it can provide services as an independent entity, like the warehousing industry or a database server.

    • Warehouse/Workshop Model combination:

      • Various independent Warehouse/Workshop Model pipelines (data factories) can each be used as a
        packaged, integrated pipeline (data factory, integrated chip) and then combined into a larger
        Warehouse/Workshop Model pipeline (data factory).
      • This is the method by which different developers and different software products interconnect,
        collaborate, and integrate, and it is also the basis for standardizing software development
        into large-scale industrial production.
      • This corresponds to a corporate group or an entire industrial ecosystem.
    • Finally, pipelines are like cells: they combine into a pipeline software ecosystem
      that meets the requirements of a modern industrial ecosystem.
      This is the perfect combination of simplicity and unity.

    • Contrast with Everything is an object

      • An object is a furry ball, and there is a chaotic
        P2P (peer-to-peer) network between objects.
        It is a complex, unorganized system.
      • A pipeline is a one-way ray, and it is the simplest
        form for data standardization
        and combination. It is a simple, reliable, orderly,
        observable, and verifiable system.
  • Because software is a factory that produces data, modern industrial methods are well suited to software systems.

  • The Warehouse (database, pool) / Workshop (pipeline) Model is a simple and practical model,
    and the large industrial assembly line is the mainstream production technology in the world.

  • The best task-planning tool is the Gantt chart, and the best implementation method is the warehouse/workshop model as implemented by factories.

  • My programming aesthetic standards are “simplicity, unity, order, symmetry and definiteness”; they are derived from the basic principles of science. Newton, Einstein, Heisenberg, Aristotle, and other major scientists held this view. The aesthetics of non-art subjects are often complicated and mysterious, making them difficult to understand and learn. The pure-function pipeline data flow provides a simple, clear, scientific, and operable demonstration.
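The combination ideas above can be sketched in Clojure; every function and key name below is a hypothetical illustration, not part of the model itself.

```clojure
;; Simple pipelines: pure functions.
(defn clean  [data] (remove nil? data))
(defn enrich [data] (map #(* 2 %) data))
(defn pack   [data] (vec data))

;; Pipeline combination: cascading simple pipelines forms a workshop pipeline.
(def workshop-a (comp pack enrich clean))

;; Workshop combination: independent workshops collaborate through a warehouse.
(def warehouse (atom {:raw [1 nil 2 3] :finished nil}))

(defn schedule!
  "Warehouse collaboration (scheduling): move raw material through a workshop."
  []
  (swap! warehouse
         (fn [w] (assoc w :finished (workshop-a (:raw w))))))

(schedule!)
(:finished @warehouse) ;; => [2 4 6]
```

The packaged `workshop-a` can itself be treated as a single pipeline and composed into a larger one, which is the integrated-chip idea above.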

I’ve read over your GitHub page a few times, but without a code example, especially for a non-trivial problem, I have a hard time fully visualizing what you’re describing concretely. It would be nice if you could add such a code example, and maybe even a sample project in that style.

It never ceases to amaze me how people find ways to add anti-OOP rhetoric to everything. The system specified is a natural fit for OOP. As such, it’s quite commonly used. In the diagram, each box is an object and each arrow is an interface. The furry ball you’re describing is an architectural design decision that could just as easily be used in FP if you’re not careful.

Two examples. One: multimedia processing. If you look at the diagrams for ffmpeg’s filter graphs, they’re remarkably similar to the diagram here. Each warehouse would be a finished frame, and each worker is a different filter.
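That filter-graph analogy can be sketched as a pure-function cascade; the frame shape and the filter behaviors here are invented for illustration:

```clojure
;; Each filter is a pure function on a frame (represented as a plain map).
(defn scale [frame] (assoc frame :w 1280 :h 720))
(defn crop  [frame] (update frame :h - 40))

;; The filter graph is a cascade of filters;
;; each finished frame plays the role of a warehouse.
(defn filter-graph [frame]
  (->> frame
       scale
       crop))

(filter-graph {:w 1920 :h 1080}) ;; => {:w 1280, :h 680}
```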

Two: game engines. The warehouse in the left diagram would be the state after each tick. The pipe functions are things like moving the objects, calculating collisions, deciding the results of those collisions, etc. The main warehouse in the right one would be the generated frames, with external workshops for sound, the game logic, etc.
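A minimal sketch of that game-loop reading, assuming a toy state with hypothetical keys:

```clojure
;; The warehouse: the game state after each tick.
(def initial-state {:x 0 :vx 1 :collisions 0})

;; Pipe functions: pure transformations of the state.
(defn move-objects      [s] (update s :x + (:vx s)))
(defn detect-collisions [s] (if (> (:x s) 2) (update s :collisions inc) s))

;; One tick is a cascade of pure pipe functions.
(defn tick [state]
  (->> state
       move-objects
       detect-collisions))

;; Each intermediate state is a warehouse snapshot you can inspect or replay.
(nth (iterate tick initial-state) 4) ;; => {:x 4, :vx 1, :collisions 2}
```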

In both cases, pure functions are beneficial. John Carmack, for example, has been an outspoken proponent of pure functions.


OO and pipeline data flow are two different worldviews, and the two are equivalent. Pipeline data flow tends to strictly separate code and data, and it supports parallel programming better. The warehouse holds the global state, so when there is a problem, a snapshot of the warehouse makes it easy to analyze.
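The snapshot point can be shown concretely: since the warehouse is one piece of global state, capturing it before every change is a small watch function. The order/error keys here are hypothetical:

```clojure
;; The warehouse holds all global state in one place.
(def warehouse (atom {:orders [] :errors []}))

;; Record a snapshot of the whole warehouse before each change,
;; so a problem can be replayed and analyzed after the fact.
(def snapshots (atom []))
(add-watch warehouse :snapshot
           (fn [_key _ref old-state _new-state]
             (swap! snapshots conj old-state)))

(swap! warehouse update :orders conj {:id 1})
(swap! warehouse update :errors conj {:order 1 :msg "bad sku"})

(count @snapshots) ;; => 2
```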


I have very little time to write, so I add only a small piece of text to this blog each time. I will write a small demo project when I have time in the future.

An application case:

The Apple M1 chip adopts the Warehouse/Workshop Model

  • Warehouse: unified memory
  • Workshop: CPU, GPU and other cores
  • Product: information

There’s also a new unified memory architecture that lets the CPU, GPU, and other cores exchange information with one another, and with unified memory, the CPU and GPU can access memory simultaneously rather than copying data between one area and another. Accessing the same pool of memory without the need for copying speeds up information exchange for faster overall performance.

Reference: Developer Delves Into Reasons Why Apple’s M1 Chip is So Fast

Update: add framework code for the W/W Model

;A workshop is a pipeline (a pure function).
;It runs after the scheduler allocates the initial data (its parameters),
;and its output data (return value) is also received and processed by the scheduler.
(defn workshop [init_data]
  (->> init_data
       ;... the workshop's processing stages (pure functions) go here ...
       ))
(def warehouse (atom {}))

(defn scheduler [key reference old-state new-state]
  ;1. According to the new state (such as orders, etc.),
  ;   schedule workshops to complete tasks.
  ;2. Side effects:
  ;   2.1. exchange data with other warehouses as needed
  ;        (distributed systems, other databases, disk, etc.),
  ;   2.2. persist data, etc.
  )

(add-watch warehouse :scheduler scheduler)
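A hypothetical, self-contained use of this framework shape: the workshop totals one order, and the scheduler runs it whenever the warehouse changes. All names and keys here are invented for the example.

```clojure
;; Workshop: a pure function that totals one order's line items.
(defn total-workshop [order]
  (->> (:items order)
       (map :price)
       (reduce +)))

(def warehouse (atom {:orders []}))

;; Scheduler: reacts to new warehouse state and dispatches workshops.
(defn scheduler [_key _ref _old-state new-state]
  (doseq [order (:orders new-state)]
    (println "order total:" (total-workshop order))))

(add-watch warehouse :scheduler scheduler)

;; Adding an order to the warehouse triggers the scheduler,
;; which prints "order total: 7".
(swap! warehouse update :orders conj
       {:id 1 :items [{:price 3} {:price 4}]})
```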