[Request for Help] Handling Uploaded Files in Chat Node

Hello everyone, I need to process the files uploaded in the chat node. Additionally, I would like to save the uploaded files to a new directory. These file paths may also be used in the AI agent.

Here are some situations to consider:

  1. Only chat text, no uploaded files.
  2. Both chat text and uploaded files.
  3. There are multiple uploaded files that need to be iterated and saved.

The link below shows a problem I've encountered: the entire workflow isn't fully functional. I haven't found similar functionality in the forum's template library. Could anyone offer some suggestions or assistance? Thank you in advance!
@jimleuk

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @ichat006

Great question and I think I did have this template in mind a while back. Here’s an approach you could take which handles multiple files.

If you don’t mind sharing, what’s your use case?


@Jim_Le
Thank you very much for the excellent use case you provided; it has pointed me in the right direction. Much of the logic handling in the example was new to me, so I'll need to spend some time studying it.

My use case is printing user-uploaded files on a self-hosted n8n instance running on Windows. I haven't completed it yet, but I will use the example you provided to finish my workflow. Thanks again!

@Jim_Le
Hello, I have encountered some issues that I am unable to resolve.

The workflow example you provided is designed for n8n self-hosted on Linux. When it runs on a self-hosted n8n instance on Windows, it throws an error about the file save path (e.g., /home/node/temp/aaa.docx), because the Windows directory structure differs from Linux's.

The issues I am facing:

  1. How can this workflow be made compatible with both Windows and Linux systems? As shown in the image below, I think an If condition would need to be added in the areas circled in red and yellow in the diagram.
  2. How can I write a prompt so the AI agent calls the “print file” tool and passes it the file path, as shown by the green arrow in the image? If there are multiple files, the path needs to be passed once per file. (On Windows, I am using the command start winword.exe /p "C:\Users\Administrator\Desktop\aaa.docx" to print files.)

These two problems are quite challenging and may even be impossible to implement. It’s also possible that I’m wrong. :grinning:

Workflow config code:

  1. Is it necessary to add a conversation ID in the AI agent’s text prompt? I’ve noticed that many workflows include this field, as shown in the image below.
    pic3

@Jim_Le
4. I tried to add a webhook node to the workflow in order to expose it for external calls. I made the modifications shown in the demo, but I keep getting an error.
DemoVideo:

Workflow config code:

Hey @ichat006

Unfortunately I don’t have a Windows machine to test with, and I run n8n using Docker.

1. Linux or Windows?

If you're only using Windows (and it looks like you need to use winword.exe anyway), there's really no need to complicate things by supporting Linux as well. Just set the path to your desired Windows location.

{{ `C:\\Users\\Administrator\\Desktop\\${$json.fileName}` }}

2. Print tool

So the idea is that the agent passes in the file path, and you use this to construct your command. Here’s an example.
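Sketching that idea outside n8n: the tool receives only the file path from the agent and wraps it in the fixed command template. The quote check is a minimal guard against shell injection when the path is interpolated into a command line; the function name here is illustrative, not an n8n API.

```javascript
// Wrap an agent-supplied file path in the Windows print command.
function toPrintCommand(filePath) {
  // Reject paths containing quotes so they cannot break out of the quoting.
  if (filePath.includes('"')) {
    throw new Error("unsafe file path");
  }
  return `start winword.exe /p "${filePath}"`;
}

console.log(toPrintCommand("C:\\Users\\Administrator\\Desktop\\aaa.docx"));
// → start winword.exe /p "C:\Users\Administrator\Desktop\aaa.docx"
```

For multiple files, the agent would simply call the tool once per path, which matches the "passed multiple times" behaviour described above.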


Just thinking you might be better off considering a simpler approach using n8n forms.


@Jim_Le
Thanks so much for your help.

Using the upload button in the chat interface to upload files is very convenient.

My primary task is to enable this workflow to support webhooks as the backend, making it easier for me to invoke the workflow from the frontend page.

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.