In my workflow I need to execute a node independently of the previous node and its input data.
More specifically, I am pushing data into an Airtable table. I then want to turn around and read from the same table (with some filter criteria). However, because the previous node has input data, it's running the search node once per input item.
In this diagram, at Point A you can see where I've done an insert of 47 items into Airtable. Then at Point B I want to read data from the same table, but it's executing the node 47 times because of the input data, returning a "fake" 2209 items (47 × 47).
How can I structure this so that, after the upsert, I can "ignore" the input data and just do a clean read of the table as the new input data for the next node?
I tried splitting it into two different workflows, but the Execute Workflow node passes on the input data and I have the same issue.
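One workaround I'm considering is dropping a Code node (set to "Run Once for All Items") between the upsert and the search, so the 47 items collapse into a single placeholder item and the search only runs once, roughly like this:

```js
// Code node, mode "Run Once for All Items":
// discard the 47 items coming from the Airtable upsert and emit a
// single empty item, so the next Airtable search node executes only once.
return [{ json: {} }];
```

But that feels like a hack, so I'm wondering whether there's a cleaner, built-in way to do this.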
**Information on your n8n setup**
- n8n version: 1.56.2 (hosted on n8n)
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via n8n cloud
- **Operating system:** macOS