Obtaining binary data size to send in upload session with Sharepoint Graph API

We are attempting to create workflows where we either download files (xlsx, csv, pdf) from S3 or create them ourselves from data obtained via SQL queries, and then upload them to SharePoint sites.

The issue we are running into at the moment is getting the correct file size to pass in the headers (Content-Length & Content-Range) for the PUT request to the SharePoint Graph API.

We need to know the file size because of the API's specification: anything under 4 MB goes via a normal PUT, whereas anything over needs an upload session created and the file(s) sent that route.
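As a sketch of that routing rule (the endpoint names below are from the Graph documentation; the helper function itself is just illustrative):

```javascript
// Files up to 4 MB can go through a single PUT to the item's /content
// endpoint; anything larger needs a createUploadSession first.
const SIMPLE_UPLOAD_LIMIT = 4 * 1024 * 1024; // 4 MiB

function chooseUploadRoute(sizeInBytes) {
  return sizeInBytes <= SIMPLE_UPLOAD_LIMIT
    ? 'simple PUT /content'
    : 'createUploadSession';
}

console.log(chooseUploadRoute(6010206)); // the 6,010,206-byte PDF needs a session
```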

In the example I have attached, the PDF is 6,010,206 bytes on disk, but when I run the workflow I get the following error returned:

"message": "400 - {"error":{"code":"invalidRequest","message":"The Content-Range header length does not match the provided number of bytes."}}",

I was wondering if reading the binary file back in will change the size, or if there is a function I could use to get the size?

Any help in this area or other suggestions would be greatly appreciated.

Thanks again for everyone's support.

Our setup is n8n (1 node) running in Docker on a t4g.small instance with the default container resource limits.

Database is Postgres, also running in Docker.

Hi @messi198310, it sounds like you might need to upload the files in chunks rather than in a single request, which unfortunately wouldn't work through the HTTP Request node at the moment.

However, if all you need is the size of a binary file in bytes, you could read it for example like so (simply replace data with whatever your binary property is called):

Example Workflow
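In case the workflow doesn't import for you, the core of it is a Function node along these lines (the sample `items` array here is a stand-in for what the node receives at runtime; n8n stores binary payloads as base64 strings under `item.binary.<property>.data`):

```javascript
// Stand-in for the items an n8n Function node receives; in a real
// workflow `items` is provided by n8n and the base64 payload comes
// from the previous node (e.g. an S3 download).
const items = [
  { json: {}, binary: { data: { data: Buffer.from('hello world').toString('base64') } } },
];

// Decoding the base64 back to a Buffer gives the true byte count —
// the same number `dir` reports on disk — which you can then use for
// the Content-Length header.
for (const item of items) {
  item.json.fileSize = Buffer.from(item.binary.data.data, 'base64').length;
}

// In a Function node you would end with `return items;`.
console.log(items[0].json.fileSize); // 11 bytes for "hello world"
```

The important bit is decoding before measuring: the base64 string itself is roughly a third longer than the actual file, which would explain a Content-Range mismatch.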

File size returned by the workflow vs dir:

Should chunked uploads be needed, you might be able to write custom code to manually split up the binary data and then perform as many requests as needed, but I'm afraid I don't have any experience with this and no great pointer as to where to start. A basic example of how to manually upload a (whole) file in the Function node can be found here:
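A rough, untested sketch of how the splitting could look (the 320 KiB multiple is from the Graph upload-session documentation; everything else here is illustrative):

```javascript
// Graph requires each chunk in an upload session to be a multiple of
// 320 KiB (327,680 bytes), except the final chunk.
const CHUNK_SIZE = 327680 * 10; // 3,276,800 bytes per chunk

function buildChunks(buffer) {
  const total = buffer.length;
  const chunks = [];
  for (let start = 0; start < total; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE, total); // exclusive
    chunks.push({
      body: buffer.subarray(start, end),
      headers: {
        'Content-Length': String(end - start),
        // Content-Range uses inclusive byte positions: start-(end-1)/total
        'Content-Range': `bytes ${start}-${end - 1}/${total}`,
      },
    });
  }
  return chunks;
}

// Each chunk would then be PUT, in order, to the uploadUrl returned
// when creating the upload session.
const chunks = buildChunks(Buffer.alloc(6010206)); // the 6,010,206-byte PDF
console.log(chunks.length); // 2 chunks: 3,276,800 + 2,733,406 bytes
```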

You might also want to raise support for chunked uploads as a feature request in this case as I imagine this might be useful for quite a few other cases.
