All Local Chat with PDF docs using AI, quoting sources

Hi all,
I copied this template and modified it for local use, but I'm running into an issue where I get no response in the chat. The logs show output, but nothing appears in the chat window.
I also see errors in the 'Answer the query based on chunks' node.

I have zero experience with JavaScript.

Any help or suggestions are greatly appreciated!

Information on your n8n setup

  • **n8n version:** latest
  • **Database (default: SQLite):** Postgres
  • **n8n EXECUTIONS_PROCESS setting (default: own, main):**
  • **Running n8n via (Docker, npm, n8n cloud, desktop app):** Docker
  • **Operating system:** Windows 11 \ WSL2

Hi @initiate

Could be an issue with the model. Have you tried this with other models?

Can you try changing the prompt to something like "if you don't know, return a JSON object that says 'I don't know'" and see if that helps?
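If you go that route, the chat output can still end up empty when the model returns nothing or returns malformed JSON. As a rough sketch (not taken from the template — the `answer`/`sources` field names are assumptions, adjust them to your workflow), a small guard in an n8n Code node could normalize the model's reply before it reaches the chat:

```javascript
// Hypothetical guard for an n8n Code node: normalize the model's raw
// reply so the chat always receives a well-formed object.
// Field names `answer` and `sources` are assumptions for illustration.
function parseModelReply(raw) {
  // Empty or whitespace-only output becomes an explicit "I don't know".
  if (!raw || !raw.trim()) {
    return { answer: "I don't know", sources: [] };
  }
  try {
    // If the model followed the "return a JSON object" instruction,
    // pass the parsed object through.
    return JSON.parse(raw);
  } catch (e) {
    // Otherwise treat the output as plain text.
    return { answer: raw.trim(), sources: [] };
  }
}
```

That way a silent failure in the model step at least shows up as "I don't know" in the chat instead of no message at all.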

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.