Requesting the integration of a Memory slot for the Basic LLM chain in n8n, akin to the one already available in the Agent chain. This enhancement would automate conversational context management, removing the need to track message history manually and streamlining workflows that require context awareness, which increases efficiency and reduces the potential for errors.

Benefits:
- Simplifies workflow creation by automating context retention.
- Enhances the reliability and accuracy of conversational applications.
- Makes the Basic LLM chain more versatile and user-friendly.
Rationale: Other chains, such as the Agent chain, already benefit from this feature, which significantly improves user experience and workflow capabilities. Adding a comparable Memory slot to the Basic LLM chain would extend these advantages and promote consistency across n8n's nodes.
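For context, the manual workaround today amounts to bookkeeping like the following. This is a plain JavaScript sketch (the helper names and the turn limit are illustrative, not n8n APIs) of what a Code node must currently do by hand, and what a Memory slot would handle automatically:

```javascript
// Hypothetical sketch of manual conversation-history tracking for a
// Basic LLM chain workflow. None of these helpers are n8n APIs.
const MAX_TURNS = 5; // keep only the most recent user/assistant pairs

function appendTurn(history, role, content) {
  history.push({ role, content });
  // Trim oldest entries so the prompt stays within a bounded size.
  const excess = history.length - MAX_TURNS * 2;
  if (excess > 0) history.splice(0, excess);
  return history;
}

function buildPrompt(history, newUserMessage) {
  // Flatten the tracked turns plus the new message into a single prompt
  // string to pass to the Basic LLM chain.
  const lines = history.map((t) => `${t.role}: ${t.content}`);
  lines.push(`user: ${newUserMessage}`);
  return lines.join('\n');
}

// State that would otherwise need to be persisted between executions.
const history = [];
appendTurn(history, 'user', 'Hello');
appendTurn(history, 'assistant', 'Hi! How can I help?');
console.log(buildPrompt(history, 'What did I just say?'));
```

With a built-in Memory slot, this append/trim/rebuild cycle (and persisting `history` between executions) would no longer need to live in user code.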