Background: Part of our product offering involves receiving “feed” files (typically CSV) of product inventory, parsing them with various manipulations, and then loading the resulting data into MySQL and Elasticsearch. We then have another process, which we refer to as “syndication”, where we query those two databases and send specific subsets of the data out to other services for marketing purposes (e.g. listing items on Facebook ads). Right now this is all handled by a pair of pseudo-ETL products that were developed in house over a decade ago, by people who are long since gone.
Fast forward to today. I stumbled upon n8n while perusing Reddit and thought… this is WAY cleaner than what we’re using! I managed to get a crude proof of concept up and running in a couple of hours. I’m stuck on one bit, though.
On the “syndication” side, each destination has a pretty rigid template for how the data needs to get to them, but each of our customers (often) requires their own manipulations on the data. So I really need one workflow per customer job, which can be built from a template. However, I don’t want to set up hundreds of triggers to fire off each of those workflows. Ideally I’d have a master workflow (or maybe a master per destination) that could loop through the other workflows and run each one in turn. I know there’s the workflow node, but I don’t want to create a separate workflow node for every “sub-workflow”. I’d like to be able to say “grab all workflows named facebook* and run them all sequentially”. Is there some way to achieve this? Or perhaps a better methodology?
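To make it concrete, here’s the rough shape of what I’m imagining, as a standalone Python sketch rather than anything I’ve built in n8n yet. I’m assuming n8n’s public REST API (`GET /api/v1/workflows` with the `X-N8N-API-KEY` header) for listing workflows; the “each sub-workflow exposes a webhook named after itself” convention in `run_all` is purely my own invention, not something n8n enforces.

```python
import fnmatch
import json
import urllib.request


def matching_workflows(workflows, pattern):
    """Filter a list of workflow dicts down to those whose name matches
    a glob pattern like 'facebook*'."""
    return [wf for wf in workflows if fnmatch.fnmatch(wf["name"], pattern)]


def list_workflows(base_url, api_key):
    # n8n's public REST API lists workflows at /api/v1/workflows.
    # (The response is paginated; a real version would follow the cursor.)
    req = urllib.request.Request(
        f"{base_url}/api/v1/workflows",
        headers={"X-N8N-API-KEY": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]


def run_all(base_url, api_key, pattern):
    # Hypothetical runner: assumes each sub-workflow has a Webhook trigger
    # at a path derived from its own name -- that naming convention is mine.
    for wf in matching_workflows(list_workflows(base_url, api_key), pattern):
        url = f"{base_url}/webhook/{wf['name']}"
        with urllib.request.urlopen(
            urllib.request.Request(url, method="POST")
        ) as resp:
            print(wf["name"], resp.status)
```

Something like `run_all("http://localhost:5678", api_key, "facebook*")` on a schedule would give me the “master per destination” behavior — but if there’s a way to do the same thing natively inside n8n, that would obviously be cleaner.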