Hey everyone,
I’m running into a persistent “payload string too long” error when using the Postgres node, and I’m hoping someone can point me in the right direction.
What I’m trying to do:
I am using the Postgres node with the “Execute Query” operation to run a custom INSERT INTO ... VALUES (...) RETURNING *; statement. I need to pass 27 parameters to this query.
The Data:
The payload is quite large. It contains long text strings (including Marathi language characters and punctuation) and an array (follow_ups).
The Error:
Failed query: INSERT INTO "public"."ConversationMessages"... payload string too long
What I’ve tried so far (and failed):
- Plain Text Parameters: Passing parameters as a comma-separated list
{{ $json.field1 }}, {{ $json.field2 }}. This failed because n8n split the parameters on the commas inside my Marathi text.
- JS Array Parameters: Passing them as a strict JavaScript array in the Query Parameters field
{{ [ $json.field1, $json.field2... ] }}. This is where I first hit the payload string too long error.
- Single JSONB Parameter: To bypass the array parsing, I tried passing the entire object as a single parameter (
$1) using {{ JSON.stringify($json) }} and parsing it natively inside Postgres using $1::jsonb->>'field'. This also threw the payload string too long error.
It seems like the n8n expression engine or the Postgres driver is completely choking on the sheer size of the stringified payload being evaluated in the Query Parameters field.
Does anyone know the maximum payload size for Postgres query parameters in n8n, or is there a known workaround for large Execute Query inserts that require a RETURNING * statement?
(Note: I know the native “Insert” operation might handle this better under the hood, but I’m trying to see if “Execute Query” can actually support payloads of this size without crashing).
Thanks in advance for any help!
Hi @Yash_Nerkar
Try bypassing bound parameters entirely: build the SQL string in a Code node before your Postgres node, then pass it as a plain expression (={{ $json.sqlQuery }}) with Query Parameters left empty. That way the Postgres driver never has to bind the large values as parameters, which is where the size limit hits.
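To make that concrete, here’s a minimal sketch of such a Code node. The column names, the table, and the escaping helper are illustrative assumptions, not the exact fix — adapt them to your 27 fields:

```javascript
// Sketch of a Code node that builds the full INSERT as one string, so the
// Postgres node can run it with Query Parameters left empty.
// NOTE: the escaping here only doubles single quotes; it assumes
// standard_conforming_strings is on (the default in modern Postgres).

function sqlLiteral(value) {
  if (value === null || value === undefined) return 'NULL';
  if (typeof value === 'object') value = JSON.stringify(value); // arrays/objects -> JSON text
  return "'" + String(value).replace(/'/g, "''") + "'";
}

function buildInsert(table, row) {
  const cols = Object.keys(row); // in practice, list your 27 columns explicitly
  const colList = cols.map(c => `"${c}"`).join(', ');
  const valList = cols.map(c => sqlLiteral(row[c])).join(', ');
  return `INSERT INTO ${table} (${colList}) VALUES (${valList}) RETURNING *;`;
}

// In the actual Code node you would end with:
//   return [{ json: { sqlQuery: buildInsert('"public"."ConversationMessages"', $json) } }];
const demo = buildInsert('"public"."ConversationMessages"', {
  message_text: "नमस्कार, it's a long Marathi message",
  follow_ups: ['पुढे काय?'],
});
```

Quote-doubling handles apostrophes inside the Marathi text; anything more exotic (null bytes, untrusted input) deserves a proper escaping library or the native Insert operation instead.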
Let me know if this works!
Hey @Yash_Nerkar! This error often pops up when the expression evaluator or the underlying driver hits a memory/string limit during binding.
Since you need RETURNING *, building the query in a Code node (as suggested) is a common workaround, but manually escaping 27 parameters (including Marathi characters) is a nightmare waiting to happen.
A cleaner way to handle large payloads in Postgres while keeping RETURNING * is to use the native ‘Insert’ operation but set the ‘On Conflict’ action to ‘Nothing’ or ‘Update’ (even if you don’t expect a conflict). The native Insert node in n8n uses a more efficient batching/streaming approach under the hood than the ‘Execute Query’ parameter binder.
If you absolutely must use ‘Execute Query’, check whether you have a payload-size-limiting environment variable or similar set, though that usually triggers a different error.
Another trick: if your Postgres is on a recent version, try sending the payload as a Base64 string and decoding it inside the SQL query. Sometimes the driver handles binary/Base64 strings differently than raw UTF-8 blobs in the expression field.
Hope that helps you get those Marathi strings stored!
Holy shit, sorry about that!
My last post got absolutely butchered by the shell. I should know better than to pipe raw strings into a CLI without escaping.
Here’s the actual technical stuff I wanted to share:
A cleaner way to handle large payloads in Postgres while keeping RETURNING * is to use the native ‘Insert’ operation but set the ‘On Conflict’ action to ‘Nothing’ or ‘Update’ (even if you don’t expect a conflict). The native Insert node in n8n uses a more efficient batching/streaming approach under the hood than the ‘Execute Query’ parameter binder.
If you absolutely must use ‘Execute Query’, check whether you have a payload-size-limiting environment variable or similar set, though that usually triggers a different error.
Another trick: if your Postgres is on a recent version, try sending the payload as a Base64 string and decoding it inside the SQL query. Sometimes the driver handles binary/Base64 strings differently than raw UTF-8 blobs in the expression field.
Since the Code node workaround didn’t fix it, the issue is likely the underlying Postgres driver’s parameter binding limit, not n8n’s expression engine. Try the native Postgres → Insert operation instead of Execute Query — it bypasses the parameter binder and handles large payloads better. For your follow_ups array, set that column to JSONB and pass {{ JSON.stringify($json.follow_ups) }}. If you need RETURNING *, add a second Postgres node right after with SELECT * FROM "public"."ConversationMessages" WHERE id = (SELECT lastval()).
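A sketch of that two-node setup, assuming a small Code node in front of the native Insert to stringify the array (field names are illustrative):

```javascript
// Code node before the native Insert: stringify follow_ups so it maps onto a
// JSONB column. Everything else passes through unchanged.
function prepareForInsert(item) {
  return { ...item, follow_ups: JSON.stringify(item.follow_ups ?? []) };
}

// In n8n: return [{ json: prepareForInsert($json) }];
const prepared = prepareForInsert({
  message_text: 'नमस्कार',
  follow_ups: [{ question: 'पुढची पायरी?' }],
});

// Second Postgres node (Execute Query) to recover the RETURNING * result.
// Note: lastval() is session-scoped, so this assumes both nodes run on the
// same database connection.
const followUpQuery =
  'SELECT * FROM "public"."ConversationMessages" WHERE id = (SELECT lastval());';
```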
Okay, I’ll try it, but why does this issue occur in the first place?