Predefined data structure between workflows

I am planning to build my automations with a “break large workflows into smaller components” philosophy, for a few reasons:

  1. I don't know how to create my own nodes yet.
  2. Some “small components” will just be thin wrappers around an HTTP request; because of API rate limits, I want several other scenarios to funnel through one small workflow so I don't exceed the API's limits.
  3. My experience with make.com suggests that big workflows also create big “technical debt”: future changes become very hard because there are too many dependencies.

Right now I am trying to plan everything ahead and also to understand n8n. My question is:

Is there any way I could define a predefined data structure for these small workflows, and maybe some data-structure validation, to make sure they will run correctly?

Thank you, best regards

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @Kamil_Muras,

We don't really have anything built in for that, but you could in theory add your own checks at the start of the sub-workflow, using If nodes, to make sure the incoming JSON matches the structure you expect.
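As a rough sketch of what such a check could look like, here is plain JavaScript of the kind you could paste into a Code/Function node at the entry of a sub-workflow (the field names `email` and `userId` are hypothetical examples, not anything n8n-specific; adapt the expected shape to your own contract):

```javascript
// Hypothetical expected shape for items entering this sub-workflow:
// each field name maps to the `typeof` result we expect for its value.
const requiredFields = {
  email: "string",
  userId: "number",
};

// Return a list of problems found in one item's JSON payload;
// an empty list means the item passes validation.
function validateItem(json) {
  const errors = [];
  for (const [field, expectedType] of Object.entries(requiredFields)) {
    if (!(field in json)) {
      errors.push(`missing field: ${field}`);
    } else if (typeof json[field] !== expectedType) {
      errors.push(
        `wrong type for ${field}: expected ${expectedType}, got ${typeof json[field]}`
      );
    }
  }
  return errors;
}

// Example usage:
const okErrors = validateItem({ email: "a@b.com", userId: 42 }); // passes
const badErrors = validateItem({ email: 123 }); // wrong type + missing field
```

In a real workflow you might throw an error (or route the item to an error branch) when the list is non-empty, so a malformed call from a parent workflow fails fast instead of propagating bad data downstream.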