Re: Transmissions : a pipeliney thing

Interesting!  It definitely has similarities with the RDF Pipeline 
Framework that I was building several years ago.  Here is a paper that 
describes the ideas:

http://dbooth.org/2013/dils/pipeline/Booth_pipeline.pdf

The initial prototype was written in Perl, then ported to JavaScript 
using the NoFlo library for the GUI.  In retrospect, NoFlo turned out 
to be the wrong choice, so that port never worked out very well.  I 
intended to port it again (to Python), but haven't gotten to it.  And 
unfortunately, with USA's fast-approaching fascism it now feels like a 
lower priority use of my time, so it's doubtful that I'll ever finish 
it.   Nonetheless, there are some ideas in it that you might find 
interesting, such as:

  - The nodes of the pipeline (which process and store data) could be 
written in any programming language for which a host environment has 
been implemented -- kind of like a container.  So it allows for very 
heterogeneous pipelines.

  - When adjacent nodes happen to live in the same host environment, the 
communication is optimized to skip HTTP and 
serialization/deserialization.  Otherwise the default is HTTP, with 
serialization/deserialization done automatically by the framework.

  - All execution and data transmission is dependency driven, with the 
dependencies figured out by the framework, so nodes don't have to worry 
about it.  Nodes only need to concern themselves with their 
domain-specific functions, and do not require any framework-specific 
knowledge.

  - Pipelines can have back-propagation of request information, to avoid 
generating data that nothing needs.
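
To make the first two bullets concrete, here is a toy sketch in 
JavaScript.  It is NOT the actual framework API -- the node shape and 
the sendLocal/sendRemote helpers are made-up names for illustration:

```javascript
// Hypothetical sketch: a node is just a domain-specific updater;
// the host environment decides how to move data between nodes.

// A node knows nothing about the framework: it maps inputs to an output.
const lowercase = {
  name: "lowercase",
  update: (inputs) => inputs[0].toLowerCase(),
};

// Adjacent nodes in the SAME host environment: pass the value directly,
// skipping HTTP and serialization/deserialization entirely.
function sendLocal(value, node) {
  return node.update([value]);
}

// A node in a DIFFERENT host: the framework serializes the value and
// would POST it over HTTP.  Sketched here without a real server.
function sendRemote(value, nodeUrl) {
  const body = JSON.stringify(value); // serialization done by the framework
  // e.g. fetch(nodeUrl, { method: "POST", body }) -- deserialized on arrival
  return { url: nodeUrl, body };
}

console.log(sendLocal("HELLO", lowercase)); // "hello"
```

The node itself is identical in both cases; only the framework's 
transport changes.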
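
The last two bullets -- dependency-driven execution with 
back-propagation of requests -- can be sketched like this (again, the 
graph and the request() helper are invented for illustration, not the 
framework's actual API):

```javascript
// Hypothetical sketch of dependency-driven, request-driven evaluation.
// Each node only declares its inputs and a domain-specific update
// function; the framework walks the dependency graph, so nodes never
// coordinate with each other directly.

const graph = {
  raw:    { inputs: [],         update: () => "a,b,c" },
  parsed: { inputs: ["raw"],    update: ([csv]) => csv.split(",") },
  count:  { inputs: ["parsed"], update: ([items]) => items.length },
  report: { inputs: ["parsed"], update: ([items]) => items.join(" | ") },
};

// A request for one node back-propagates: only its transitive inputs
// are computed, so "report" never runs when we only ask for "count".
function request(name, cache = new Map()) {
  if (cache.has(name)) return cache.get(name);
  const node = graph[name];
  const inputs = node.inputs.map((dep) => request(dep, cache));
  const value = node.update(inputs);
  cache.set(name, value);
  return value;
}

console.log(request("count")); // 3
```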

GitHub repos:
https://github.com/rdf-pipeline

Best,
David Booth

On 8/16/25 14:49, Danny Ayers wrote:
> So many of the things I wanted to do with code were of a pipeline shape 
> that I ended up writing a thing.
> 
> The workflow "transmission" is defined in a Turtle syntax list. Written 
> in node.js. It is kinda a DSL, and I have played fast and loose with 
> namespaces. A Transmission is a series of Processors.
> A message - a JSON object - gets passed between them. (Which sometimes 
> contains an RDF graph, of course).
> 
> Event driven. So for my ridiculously over-engineered static site builder 
> for my current blog, a filewalker triggers events for each markdown file 
> it sees and passes them along, spawning a chain of processors for each. 
> (I did start a graph-shaped GUI to wire things up, but this event-driven 
> behaviour made it kinda useless. They just spawn on their own.)
> 
> An observation I had along the way was that GPTs are good at small 
> things, hopeless with big ones. A Processor in this system just takes a 
> JSON object, does process(message) and passes it along.
> 
> I'm using it for stuff; I doubt anyone else would want to. But I think 
> the use of Turtle as a DSL is kinda interesting.
> 
> It has been a lot of fun. The processor interface is so trivial that 
> it's easy to try things. Somewhere in there I have my own Lisp-like 
> language, a reasoning chain based on EYE, etc.
> 
> Cheers,
> Danny.
> 
> https://github.com/danja/transmissions
> 
> 
> 
> -- 
> ----
> 
> https://danny.ayers.name <http://hyperdata.it/danja>
> 

Received on Sunday, 17 August 2025 01:28:24 UTC