What is the best way to collect multiple files from GitHub and send them to an Agent?

I am a little confused about, first, how to pull files (.yaml and .md) from GitHub (binary vs. text, format, etc.), and then how to send them all to an Agent to be used in an analysis. Does anyone have any thoughts or some examples?

A couple of specific issues I found:

  1. I read various things on here about a “Move Binary Data” node, but that doesn’t seem to exist anymore.
  2. I am guessing I would need to load all the files separately and then merge them into a single flow, but how do I avoid merging them completely, so that the LLM would still know they are different files/sources?

Hey @ForceConstant hope all is well.

You can download files from GitHub either with the GitHub node or via an API call from the HTTP Request node.
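If you go the HTTP Request route, the GitHub contents API returns the file base64-encoded in the `content` field. A minimal sketch of that call in JavaScript (the owner/repo/path/token values are placeholders, not anything from this thread):

```javascript
// Decode the `content` field of a GitHub contents-API response.
function decodeContentsResponse(body) {
  // The contents API returns file data base64-encoded in body.content.
  return Buffer.from(body.content, "base64").toString("utf8");
}

// Fetch one file via the contents API; owner/repo/path/token are placeholders.
async function fetchGithubFile(owner, repo, path, token) {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/contents/${path}`,
    {
      headers: {
        Authorization: `Bearer ${token}`,
        Accept: "application/vnd.github+json",
      },
    }
  );
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`);
  return decodeContentsResponse(await res.json());
}
```

The HTTP Request node can do the same call directly; the snippet just shows the endpoint shape and the base64 decode step you would otherwise need.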

As for uploading to the LLM for analysis, and whether to do it in one go or one by one: this depends on your needs, whether the files can be analyzed at once, how big they are, and what sort of analysis you need. Do you have a specific LLM in mind?

They are not very big, and I will probably use gpt-5 or gemini-2.5. Overall I want the LLM to have access to all the files when doing the single analysis.

So currently I have 3 flows, each with a GitHub → Get File (binary) step, and then a binary-to-JSON conversion. But at that point I’m not sure what to use. Merge seems to just combine the files into a single file.
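Since the old “Move Binary Data” node is gone, one option is a Code node that decodes each binary item itself, keeping the file name attached so the files stay distinguishable after a Merge. A sketch, assuming the upstream node put the file under the default binary property `data` (adjust names to your workflow):

```javascript
// Decode one n8n-style binary payload (base64 text in bin.data) into
// a plain object that keeps the original file name.
function decodeBinary(bin) {
  return {
    fileName: bin.fileName ?? "unknown",
    content: Buffer.from(bin.data, "base64").toString("utf8"),
  };
}

// Inside a Code node this might be used as (property name "data" is the
// default binary property; adjust if yours differs):
// return $input.all().map(i => ({ json: decodeBinary(i.binary.data) }));
```

Because each output item carries its own `fileName`, a later step can label the contents per file instead of losing that information in the merge.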

When you say “not big”, could you approximate that in characters, bytes, or tokens?

Do you have an example

  • set of files
  • analysis prompt

for testing?

The files / analysis can’t be shared, but they are less than 200 lines and under 8 KB in size.

OK, here is what I have so far.

Sending the prompt as:

[Your instructions go here. Be very clear about what you want the model to do with the files.]

Here are the files:

--- FILE: deployment.yaml ---
```yaml
[Paste the full content of deployment.yaml here]
```

--- FILE: service.yaml ---
```yaml
[Paste the full content of service.yaml here]
```
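A template like the one above can also be assembled in a Code node instead of pasted by hand. A sketch, where the `instructions` text and the shape of the `files` array are my own assumptions for illustration:

```javascript
// Assemble one prompt string from already-decoded files, labeling each
// file so the model can tell them apart.
function buildPrompt(instructions, files) {
  const blocks = files
    .map(f => `--- FILE: ${f.name} ---\n\`\`\`yaml\n${f.content}\n\`\`\``)
    .join("\n\n");
  return `${instructions}\n\nHere are the files:\n\n${blocks}`;
}

// Example shape of the inputs (names and contents are placeholders):
// buildPrompt("Review these manifests for errors.", [
//   { name: "deployment.yaml", content: "kind: Deployment" },
//   { name: "service.yaml", content: "kind: Service" },
// ]);
```

The resulting string can then be fed to a single LLM/Agent node as one prompt field.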
