Read and Write local JSON (and csv) files

The ability to read and write JSON files from disk, like the FTP or Read/Write Binary Files nodes, but for JSON (and CSV) files.

My use case:

Some vendors, in their infinite wisdom, don't offer filtering capabilities on API calls, so I often have to pull in entire datasets and filter the data myself. As these datasets grow, they start having negative effects when hitting the API quickly or frequently, often making the application services very slow during API calls.

This request is essentially the ability to cache the output of an API call to disk, and then read that in a workflow as if it were the original API call.
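The caching pattern described above can be sketched in plain Python, outside any workflow tool: write the API response to disk once, then read it back on later runs as if it were the live call. The URL and cache path here are hypothetical placeholders, not part of the original request.

```python
import json
import os
import urllib.request


CACHE_PATH = "dataset_cache.json"  # hypothetical local cache file


def fetch_dataset(url: str):
    """Return the dataset, preferring the on-disk cache when it exists."""
    if os.path.exists(CACHE_PATH):
        # Read the cached file back; the structure is identical to
        # what the original API call returned.
        with open(CACHE_PATH, "r", encoding="utf-8") as f:
            return json.load(f)
    # No cache yet: hit the API once and save the full response.
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    with open(CACHE_PATH, "w", encoding="utf-8") as f:
        json.dump(data, f)
    return data
```

Later workflow runs then read from disk instead of re-pulling the whole dataset, which is exactly the behavior a native read/write JSON node would provide.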

I think it would be beneficial to add this because:

It would simply allow for caching of large datasets.

I would like this node to output the file's contents exactly as if it were the output of an HTTP API call.

Note: I've tried many angles to get the Write Binary File and Move Binary Data nodes to work for this. Move Binary Data (JSON > binary) breaks large JSON data down into individual records and writes each one over the top of the file, while the append option appends to previous executions and produces an unusable file when converting back from binary to JSON. It would be nice to simply write the file in its native format and read it back.
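The append problem described above can be demonstrated in plain Python (this is an illustration of the failure mode, not the actual node internals): appending one serialized record per execution yields a file that is no longer valid JSON, whereas rewriting the whole array in one pass round-trips cleanly.

```python
import json

records = [{"id": 1}, {"id": 2}]

# Append-style output: each record is written as its own JSON object,
# so the file contains '{"id": 1}{"id": 2}' — concatenated objects
# that cannot be parsed back as a single document.
with open("appended.json", "w", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec))

# Whole-file write: serialize the full array once; reading it back
# returns the original structure intact.
with open("whole.json", "w", encoding="utf-8") as f:
    json.dump(records, f)

with open("whole.json", "r", encoding="utf-8") as f:
    assert json.load(f) == records
```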

Thanks.