Error 413 with Elasticsearch

Hello there,

I have a workflow that has been running for weeks, and it suddenly stopped working with a 413 error. The error is raised by the Elasticsearch node. I have checked the memory available on the server and we have plenty, so do you have any clue why this may be happening? Please find the workflow attached.

GA and Google Sheets nodes work just fine. Any help is much appreciated.

HTTP error code 413 means “Payload Too Large”, so the request is larger than what the server allows.

If it happens on the Elasticsearch node, there are probably two possible solutions:

  1. Reduce the size of the request (for example, query the data more often in smaller chunks; see the sketch below)
  2. Increase the request size the server allows (sadly, I do not know enough about Elasticsearch to say if and how that is possible)
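A minimal sketch of the first option, assuming the rows are already available as a plain array (the `rows` array, the batch size of 100, and the `indexRows` helper below are all placeholders for whatever currently sends everything in one request):

```typescript
// Split the rows into smaller batches so each request stays well below the server's limit.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Placeholder: whatever actually sends one batch to Elasticsearch
// (e.g. the Elasticsearch node or an HTTP Request node in a sub-workflow).
async function indexRows(batch: object[]): Promise<void> {
  // send `batch` here
}

async function run(): Promise<void> {
  const rows: object[] = []; // your rows would go here
  for (const batch of chunk(rows, 100)) {
    await indexRows(batch);  // one smaller request per batch
  }
}

run().catch(console.error);
```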

Hope that is helpful!

Hi, @jan,

Thanks for your quick answer. Checking your suggestions:

  1. I am sending around 700 rows of data every day. I tried dividing them into chunks, but the error still appears.

  2. Elasticsearch allows you to push and pull 10,000 rows at a time.

It seems the suggested solutions may not be effective here, or perhaps I am interpreting them the wrong way. Is there any other idea I could test?

Best,
Juan

What matters is less how many rows get added and more the combined size of the whole request. So 10k small rows will probably work (because the total stays below the limit), but a single row with a lot of data could fail (because it exceeds it).
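A rough way to check this, assuming the data is accessible as a plain array (the `rows` array below is a placeholder; the actual limit depends on the server configuration):

```typescript
// Estimate how large the request body for a set of rows actually is.
const rows: object[] = [];                      // your rows would go here
const body = JSON.stringify(rows);              // roughly what is sent over the wire
const bytes = Buffer.byteLength(body, "utf8");  // total payload size in bytes

console.log(`${rows.length} rows, ~${(bytes / 1024 / 1024).toFixed(2)} MB`);
// Many small rows can stay under the limit while a single very large row exceeds it.
```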

Anyway, it is generally worth exploring 413 errors in combination with Elasticsearch specifically:
https://www.google.com/search?q=elasticsearch+413&oq=elasticsearch+413&aqs=chrome..69i57j0i22i30l4j69i60l3.3986j1j4&sourceid=chrome&ie=UTF-8

Thanks @jan, it led to the actual solution.

That is great to hear! Could you please share what helped in your case, so that other people who have the same problem know what to try?

Hi @jan,

Elasticsearch has a default limit of 100 MB under “http.max_content_length”. Depending on how you are dealing with ES, there are two ways to change it that I am aware of:

  1. Go to Stack Management > Advanced Settings and change the default limit.
  2. Create an elasticsearch.yml where you increase the default size limit.
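For anyone who would rather not raise the limit, another option is to keep each request small on the client side. A minimal sketch assuming the official @elastic/elasticsearch Node.js client (the node URL, index name, and `rows` array are placeholders; this is not what the n8n node does internally):

```typescript
import { Client } from "@elastic/elasticsearch";

const client = new Client({ node: "http://localhost:9200" }); // placeholder URL

async function run(): Promise<void> {
  const rows: Array<Record<string, unknown>> = []; // your rows would go here

  await client.helpers.bulk({
    datasource: rows,
    onDocument() {
      return { index: { _index: "logs" } }; // placeholder index name
    },
    // Flush a bulk request once its body reaches ~5 MB, so every request
    // stays far below the http.max_content_length limit.
    flushBytes: 5_000_000,
  });
}

run().catch(console.error);
```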

Great, thanks a lot for sharing! I am sure it will be helpful for other people as well!

Have fun!

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.