I’m trying to split the input data for OpenAI into multiple chunks because of token limits.
So basically I would like to have an introduction prompt that says “Hey, you will do this and that, and you will receive the needed data in the next n messages. Once you have received the last message, please execute the task and return the answer”.
So: prepare the task with multiple messages, then return the answer.
Is that possible with the existing nodes yet?
I tried the OpenAI node and the AI agent but could not identify a way to do this.
Thanks in advance!
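Outside of a dedicated node, one way to emulate this pattern is to build the whole "conversation" up front: with the Chat Completions API, a multi-message exchange is just a single `messages` array, so the intro prompt, the n data parts, and a final "now execute" trigger can all go into one request. Below is a minimal sketch of that idea (the chunk size, message wording, and `build_messages` helper are illustrative assumptions, not an n8n or OpenAI API feature). Note that this does not bypass the token limit itself — every message in the array still counts toward the same context window.

```python
# Sketch: emulate "send data in n messages, then execute" by building
# one messages array for a single Chat Completions request.
# chunk_size and all message wording are placeholder assumptions.

def chunk_text(text: str, chunk_size: int) -> list[str]:
    """Split text into pieces of at most chunk_size characters."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def build_messages(task: str, data: str, chunk_size: int = 4000) -> list[dict]:
    chunks = chunk_text(data, chunk_size)
    n = len(chunks)
    # Introduction prompt: announce the task and how many parts will follow.
    messages = [{
        "role": "system",
        "content": (f"{task} You will receive the needed data in the next "
                    f"{n} messages. Wait for the last one, then execute "
                    "the task and return the answer."),
    }]
    # One user message per data chunk, labelled so the model can track them.
    for i, chunk in enumerate(chunks, start=1):
        messages.append({"role": "user", "content": f"Part {i}/{n}:\n{chunk}"})
    # Final trigger message.
    messages.append({
        "role": "user",
        "content": "That was the last part. Please execute the task now.",
    })
    return messages

# The array would then be sent in one request, e.g. (not executed here):
# client.chat.completions.create(model="gpt-4o-mini",
#                                messages=build_messages(task, data))
```

In an n8n workflow, the same array could be assembled in a Code node and passed to an HTTP Request node pointed at the OpenAI API; whether the built-in OpenAI node accepts a prebuilt message list like this depends on the n8n version.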
It looks like your topic is missing some important information. Could you provide the following if applicable?
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system: