Collect webhook replies and send a report every 24h by writing a file

I have created a workflow that listens to a webhook.
If the data from the webhook contains a special value in a string, it is forwarded through the workflow and enriched with other information.
My goal is to write the collected data to a CSV file every day at midnight (German time).
But I am not quite sure about the logic.
1.) Spreadsheet file
Do I have to set a “Wait” node before the Spreadsheet node to fill everything at once? How?
Do I have to delete the content of the Spreadsheet node after the workflow proceeds?
Or how can I tell this node to drop the content that has already been forwarded to the next step (FTP)?

2.) Wait until
This part is also quite tricky for me. If I use the “Wait” node, I can’t enter a specific date and time. If I work with an expression instead, I am not sure it will be recognized as a date if I write
{{${days: 1}).toFormat('dd.MM.yyyy HH:mm:ss')}}
Also, this parameter does not include the information that it should wait until midnight.
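To make the “wait until midnight” idea concrete, here is a minimal plain-Node.js sketch of the underlying arithmetic (this is not n8n's expression syntax — just the calculation). Note it uses the server's local time; getting German time would depend on the n8n instance's timezone being set to Europe/Berlin, which is an assumption about your setup, not something confirmed in this thread.

```javascript
// Sketch only: how many milliseconds remain until the next local midnight.
// Assumes the process's local timezone is the one you care about.
function msUntilNextMidnight(now = new Date()) {
  const next = new Date(now);
  next.setHours(24, 0, 0, 0); // hour 24 rolls over to 00:00 of the next day
  return next.getTime() - now.getTime();
}
```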

Hey @elbrucko,

There is a problem with your workflow: because of that Wait node, the workflow is going to run once for each webhook request, so you could end the day with 50 executions waiting to upload the same file.

What I would do is create 2 workflows… The first would be your webhook > spreadsheet flow, but delete everything after the write and add a Write Binary File node to save the file to disk. Then create another workflow that runs on a schedule at midnight, reads the file, and uploads it.

Hi Jon,
thanks for your reply. But “Write Binary File” is not supported (“no access”): do you have a document online that explains the server settings for this purpose?

Hey @elbrucko,

I think we mention in most of our setup guides that you would need to mount an additional data volume for storing files.

What you can do instead is save the files to /tmp/, so for the path you would use /tmp/my-file.csv, and that should do the job.

Hi @Jon,
thanks, this works. I have copied the workflow and, as suggested, created two. The first ends with “Write Binary File” (currently, for demonstration, I have simply removed the connection path to the last elements); the second starts with a daily trigger, followed by Read Binary File, FTP upload, and then FTP delete file (of the previous upload).
Please find my workflow below.
Does it make sense now? Do I have to tick the “Append” checkbox on the binary file node to ensure that ALL of a day’s elements are stored in one file and the next day’s elements go into the next file?
