Do all Community Nodes (e.g. Scrapeless) need to be reinstalled after n8n updates?

Hi

Every time I update n8n (locally installed community version), all Scrapeless community nodes become unrecognised, and the only solution is to uninstall/reinstall the Scrapeless node - and then re-add and reconfigure every instance of it in every workflow that uses it.

Is this normal behaviour for community nodes, or is it avoidable - by me doing something, or by Scrapeless if it’s brought to their attention?

Thanks

Stewart

Hey @selbrae hope all is good. Welcome back!

I have never tried it, but take a look at the following option:

N8N_REINSTALL_MISSING_PACKAGES

Documentation - here and here.
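If it helps, here is a minimal sketch of how that option could be set for a Docker-based install. This is an assumption on my part - the container name, port and volume name are illustrative, not your actual setup:

```shell
# Hedged sketch: start n8n with automatic reinstall of missing
# community packages on boot. Container/volume names are illustrative.
docker run -d --name n8n \
  -p 5678:5678 \
  -e N8N_REINSTALL_MISSING_PACKAGES=true \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```

With that variable set, n8n should attempt to reinstall any community packages it finds referenced but missing when it starts up.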

Hi @selbrae, are you installing as standalone or in a docker container? If the latter, just make sure your data is persisted and you can easily spin down and recompose your containers without losing anything. For standalone, just make sure you’re copying your original .env file into the install folder, especially if you just use git to get the latest updates and it inadvertently overwrites.
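For the Docker case, a quick sanity check along these lines can confirm the data is actually persisted before recomposing. A rough sketch, assuming a named volume called n8n_data and a container called n8n (adjust to your setup):

```shell
# Hedged sketch: verify persistence before tearing the container down.
docker volume inspect n8n_data            # the named volume should exist
docker exec n8n ls /home/node/.n8n        # database, config and nodes/ live here
docker compose down && docker compose up -d   # recreate; volume data survives
```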

@jabbson and @PowerBoot thanks for those suggestions.

I should have included that this instance is in a docker container. I do have persistent storage, and it appears to be working, as I'm using SQLite for some workflows and I can see the db there.
However, the nodes subdirectory in that volume contains only package.json, and the contents of that file are only:

{
  "name": "installed-nodes",
  "private": true,
  "dependencies": {}
}

I installed another community node just to see if it behaved differently from the Scrapeless one, but it made no change to that file, so I'm wondering if that is where the problem lies …
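A way to dig into this from outside the container might be something like the following. This is only a sketch - the container name is illustrative, and the exact Scrapeless package id may differ from what I've written:

```shell
# Hedged sketch: inspect what n8n has recorded as installed community
# nodes. An empty "dependencies" object here would mean installs are
# not being written to the persisted nodes directory.
docker exec n8n cat /home/node/.n8n/nodes/package.json
docker exec n8n npm ls --prefix /home/node/.n8n/nodes
```

If `dependencies` stays empty after an install, that would suggest the nodes are being installed somewhere non-persistent inside the container, which would explain why they vanish on every update.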

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.