Massive data integrations

Hi.

I wanted to know if I can use the tool for massive data integration, with multiple workflows (around 40) using FTP, HTTP, and other integration methods.

  • Does it have scalability?
  • How can I monitor possible errors in the process?
  • Is there any way to monitor performance?
  • Are there support options for private deployments?

And lastly, would you recommend using it that way?

Hey @thiagomaraujo03!

Welcome to the community :sparkling_heart:

To make sure that I am addressing your questions correctly, can you please let us know if you’re talking about n8n.cloud, or are you planning to self-host n8n?

Hi,

Thanks for the help!

I am planning to run n8n with Docker on AWS services.

Thank you for sharing that. To answer your questions:

Does it have scalability?

Yes, n8n is scalable. We have documented the process of how to scale your n8n instance. You can find more information here: Scaling n8n | Docs
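
In case it helps with planning, here's a rough sketch of what queue mode could look like with Docker. The Redis host, container names, and ports are placeholders for your own setup, and it's worth double-checking the exact worker start command against the scaling docs for your n8n version:

```bash
# Main n8n instance in queue mode (Redis host/port below are placeholders)
docker run -d --name n8n-main \
  -e EXECUTIONS_MODE=queue \
  -e QUEUE_BULL_REDIS_HOST=redis.internal.example \
  -e QUEUE_BULL_REDIS_PORT=6379 \
  -p 5678:5678 \
  n8nio/n8n

# One or more worker containers that pick executions up from the queue.
# They need the same queue settings as the main instance.
docker run -d --name n8n-worker-1 \
  -e EXECUTIONS_MODE=queue \
  -e QUEUE_BULL_REDIS_HOST=redis.internal.example \
  -e QUEUE_BULL_REDIS_PORT=6379 \
  n8nio/n8n \
  worker   # starts an n8n worker process; verify the exact command in the scaling docs
```

You can then add or remove worker containers as your ~40 workflows grow, without touching the main instance.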

How can I monitor possible errors in the process?

You can set up logging in n8n to monitor the errors: Logging in n8n | Docs
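
For example, with the Docker setup you mentioned, logging is configured through environment variables. The values below are just an illustration; the log level, outputs, and file path are up to you:

```bash
# Example logging configuration via environment variables
# (the file path is only an illustration; pick one that suits your container setup)
docker run -d --name n8n \
  -e N8N_LOG_LEVEL=warn \
  -e N8N_LOG_OUTPUT=console,file \
  -e N8N_LOG_FILE_LOCATION=/home/node/.n8n/logs/n8n.log \
  -p 5678:5678 \
  n8nio/n8n
```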

For each workflow, you can also create error workflows using the Error Trigger node.
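
To give you a rough idea of what such an error workflow receives, the Error Trigger passes along data shaped roughly like this (field names based on the example in the docs; treat it as illustrative rather than exact):

```json
{
  "execution": {
    "id": "231",
    "url": "https://your-n8n-instance/execution/231",
    "error": {
      "message": "Example error message",
      "stack": "Stack trace here"
    },
    "lastNodeExecuted": "Node that failed",
    "mode": "manual"
  },
  "workflow": {
    "id": "1",
    "name": "Workflow that failed"
  }
}
```

From there you can route the error data to email, Slack, or any other node to get notified when one of your workflows fails.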

Is there any way to monitor performance?

I don’t have a definite answer for this. If you can share more information about what you’d like to measure, I might be able to help further.

Are there support options for private deployments?

What kind of support are you asking for? You can always ask questions on the community forum :slight_smile:

If you haven’t already, I would suggest you also take a look at our license. Here’s the FAQ around that: FAQ | Docs