Hard time with appending to files (JSON, CSV)

Describe the problem/error/question

I’m seeing unexpected results when using read from file → write to file (with append enabled)…

What is the error message (if any)?

Unexpected non-whitespace character after JSON at position XX

The issue is that when new data is appended to the same file, the inputs are joined, but the structure breaks at the same time. E.g. with CSV, the next time data is appended the header is written again, which corrupts the data. With JSON, it appends something like this:

  1. Creating the file adds:

[
  {
    "appended_task_domain": "test.powerappsportals.com"
  },
  {
    "appended_task_domain": "test2.powerappsportals.com"
  },
  {
    "appended_task_domain": "test3.powerappsportals.com"
  }
]
  2. The next time it appends, it adds:
[
  {
    "appended_task_domain": "test4.powerappsportals.com"
  },
  {
    "appended_task_domain": "test5.powerappsportals.com"
  }
]
  3. The final content of the JSON file is:
[
  {
    "appended_task_domain": "test.powerappsportals.com"
  },
  {
    "appended_task_domain": "test2.powerappsportals.com"
  },
  {
    "appended_task_domain": "test3.powerappsportals.com"
  }
][
  {
    "appended_task_domain": "test4.powerappsportals.com"
  },
  {
    "appended_task_domain": "test5.powerappsportals.com"
  }
]

As you can see, the result is no longer valid JSON, and it breaks in the “extract from file (json)” node.
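
To illustrate why this breaks, here is a rough sketch in plain Node.js/TypeScript (outside of n8n, using values from the example above): a byte-level append puts the second array right after the first, and JSON.parse only accepts a single top-level value.

// Minimal sketch: two serialized arrays concatenated back to back
// are no longer one valid JSON document.
const firstWrite = JSON.stringify([
  { appended_task_domain: "test.powerappsportals.com" },
]);
const secondWrite = JSON.stringify([
  { appended_task_domain: "test4.powerappsportals.com" },
]);

// This is what the file contains after the append: "[...][...]"
const fileContent = firstWrite + secondWrite;

try {
  JSON.parse(fileContent);
} catch (err) {
  // Fails with an "Unexpected non-whitespace character after JSON at position XX"
  // style error, because JSON allows only one top-level value per document.
  console.error((err as Error).message);
}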

Please share your workflow

Share the output returned by the last node

After the first “extract from file (json)”:

[
  {
    "data": [
      {
        "appended_task_domain": "test.powerappsportals.com"
      },
      {
        "appended_task_domain": "test2.powerappsportals.com"
      },
      {
        "appended_task_domain": "test3.powerappsportals.com"
      },
      {
        "appended_task_domain": "test4.powerappsportals.com"
      },
      {
        "appended_task_domain": "test5.powerappsportals.com"
      }
    ]
  }
]

Information on your n8n setup

  • latest Docker image pulled today, no other custom modifications

Disclaimer 🙂

I am quite sure that I made a mistake with data formatting somewhere… any help or hack/tip is welcome, as most of the workflow is done and the only thing that isn’t working is saving to and reading from the file multiple times.

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

So… I was able to get the appending to work by using the Merge node…

The file is read at the start of the workflow, and that becomes the first input of the Merge node.
The second input is the new data generated by the workflow.

They are combined with the Merge node’s append action and saved (overwrite) to the same file.
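
The same read → combine → overwrite idea, expressed in plain Node.js/TypeScript, looks roughly like this (the file path and the new items below are placeholders, not values from my actual workflow):

import { promises as fs } from "fs";

// Placeholder path; in the workflow this is whatever the Read/Write node points at.
const FILE_PATH = "/data/domains.json";

async function appendToJsonArray(newItems: object[]): Promise<void> {
  let existing: object[] = [];
  try {
    // Read and parse the current file; treat a missing or unreadable file as an empty array.
    existing = JSON.parse(await fs.readFile(FILE_PATH, "utf8"));
  } catch {
    existing = [];
  }

  // Combine old and new items, then overwrite the whole file so the result
  // stays a single valid JSON array instead of two arrays glued together.
  const combined = existing.concat(newItems);
  await fs.writeFile(FILE_PATH, JSON.stringify(combined, null, 2), "utf8");
}

appendToJsonArray([{ appended_task_domain: "test6.powerappsportals.com" }]).catch(console.error);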

I would still appreciate it if anyone could suggest a more proper way to solve this, but for now it works.

Thanks

Hey @antun4n6, combining two data sets with the Merge node sounds reasonable to me. Your approach is valid.

Having said that, I don’t see an appending step in your workflow, so I can’t comment on it.

Hi, thank you for the response.

When I referenced append, I meant the “Read/Write files from disk1” node, which has an “append” toggle that was enabled, so I thought it would append data to the file. (Well, it did, but not properly.)

BR

Well, the “Append” option means appending the content to the existing file on disk (as opposed to overwriting it). It is similar to the >> operator in Linux, as in echo 'text here' >> filename.
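
In Node.js terms the difference is roughly the following (the file name and text are just placeholders):

import { appendFileSync, writeFileSync } from "fs";

// "Append" enabled: like `echo 'text here' >> filename`, content is added to the end.
appendFileSync("filename", "text here\n");

// "Append" disabled: like `echo 'text here' > filename`, the file is overwritten.
writeFileSync("filename", "text here\n");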
