My workflow in this example fetches pages to scrape. I'm trying to figure out a way to collect all the data into one list; right now I have to write the data to a database on every page, instead of running the whole scrape process and sending everything once it finishes.
Please share the workflow
Try running it and you will see a loop with 3 entries for 3 pages. How can I collect the data and send it once at the end?
Just an overview of this project: I'm building an npmjs scraping automation and a dashboard using Appsmith.
Information on your n8n setup
n8n version: latest
Database: Postgres
Running n8n with the execution process [own(default), main]:
// Always get the last runIndex ... https://docs.n8n.io/nodes/expressions.html#variable-node
// Needed because $items('HTML Extract', 0, i) throws an error if 'i' is not a valid runIndex
const maxRunIndex = $node["HTML Extract"].runIndex;

let packages = [];
for (let i = 0; i <= maxRunIndex; i++) {
  // Get the items of run 'i' ... https://docs.n8n.io/nodes/expressions.html#method-items-nodename-string-outputindex-number-runindex-number
  const _packages = $items('HTML Extract', 0, i)[0].json.packages;
  packages = packages.concat(_packages);
}

// Map it into n8n's item structure B-)
return packages.map(e => ({ json: { package: e } }));
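Outside of n8n, the same concat-and-map pattern can be sketched in plain JavaScript. The run data below is hypothetical stand-in data (the real values come from `$items('HTML Extract', 0, i)` per run); the point is only to show how the per-run arrays are merged and then reshaped into n8n's `{json: ...}` item format:

```javascript
// Hypothetical stand-in for the per-run results that
// $items('HTML Extract', 0, i) would return inside n8n:
// three runs, one scraped page each.
const runs = [
  { packages: ['express', 'lodash'] },
  { packages: ['axios'] },
  { packages: ['n8n-core'] },
];

// Same pattern as the Function node: concatenate every run's
// packages into one flat list...
let packages = [];
for (const run of runs) {
  packages = packages.concat(run.packages);
}

// ...then map each package name into n8n's item shape.
const items = packages.map(e => ({ json: { package: e } }));

console.log(items.length); // 4
console.log(items[0]);     // { json: { package: 'express' } }
```

Because everything is collected into one returned array, the downstream database node runs once with all items instead of once per page.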