In my AI Agent node, the prompt I gave is: "You are a travel assistant. You plan 4-day trip itineraries. I will provide a city name, and you will respond with a detailed 4-day itinerary for that city."
I am building a simple agentic workflow where the user input is a city name, like Bali, and the output is a travel itinerary.
The LLM response I am getting is: {"output": "Got it! Please tell me the city name you want the 4-day itinerary for."}
The workflow has the following nodes:
Webhook - to receive the request from Google Colab via the POST method
AI Agent - LLM, Tool, Memory
Let me know how to overcome this issue and get a proper response from LLM.
Can you please share your workflow JSON with us so that we can spot the issue? If you are getting a response from the AI Agent, that means your webhooks are working as they should, so the problem remains with the AI Agent. I am assuming the prompt you are showcasing here is set as a SYSTEM PROMPT, not as the prompt to the AI Agent, since the agent's prompt should be the query the user posted through the webhook. Let me know if this is correct and if you can share your workflow. Hope this helps!
Your code is inside a text field. You must add all of your code into a code field. You can create one as depicted in this screenshot. If you hover with your cursor over such a position, the '+' will appear:
Hey @Kaushik_AI, just as I said earlier, you have set "You are a travel assistant. You plan 4-day trip itineraries. I will provide a city name, and you will respond with a detailed 4-day itinerary for that city." as the AI PROMPT, when it should be a system prompt. As the AI prompt, the AI always gets that text as its input, not what the user is requesting through the webhook. I have made some fixes in your flow; now it should be ready:
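For illustration, here is a minimal sketch of what the fixed AI Agent node parameters could look like in the workflow JSON. The field names (`promptType`, `text`, `systemMessage`) and the `$json.body.prompt` expression are assumptions based on a typical n8n AI Agent setup, not taken from the original workflow:

```json
{
  "name": "AI Agent",
  "parameters": {
    "promptType": "define",
    "text": "={{ $json.body.prompt }}",
    "options": {
      "systemMessage": "You are a travel assistant. You plan 4-day trip itineraries. I will provide a city name, and you will respond with a detailed 4-day itinerary for that city."
    }
  }
}
```

The key point is that the instruction text lives in the system message, while the prompt field maps to the city name coming in from the webhook body.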
Expected to find the prompt in an input field called 'guardrailsInput' (this is what the guardrails node outputs). To use something else, change the 'Prompt' parameter
import requests

# 1. Your n8n production URL
n8n_production_url = ""

# 2. Data to send as the JSON body
query = {
    "prompt": "Bali",
    "sessionID": "h46767"
}

# 3. Execute the POST request
response = requests.post(n8n_production_url, json=query)

# 4. Print results
print(response)
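To read the itinerary itself on the Colab side, the response body can be parsed as JSON. A minimal sketch, assuming the Respond to Webhook node returns the agent output under an "output" key as in the reply shown earlier in the thread (the sample body below is invented for illustration):

```python
import json

# Simulated response body; with a live call you would use response.json()
# instead of parsing a hard-coded string. The "output" key is an assumption
# based on the agent response shown earlier in the thread.
sample_body = '{"output": "Day 1: Arrive and check in..."}'

itinerary = json.loads(sample_body)["output"]
print(itinerary)
```

With the actual request, the equivalent line would be `itinerary = response.json()["output"]`.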
So with this, your Webhook will get triggered and it will also return the data via the Respond to Webhook node. I have checked this in the scenario below, which is fairly simple, but that's just one case and the AI Agent will just sit in between:
Let me know if you need more help, hope this helps!