hi all,
I want to self-host n8n locally on a VM at home, do I just follow this guide, i.e. install npm on my Rocky VM and then
npm install n8n -g
n8n start
do I need to install next?
and then what port do I open it on in the web browser?
thanks,
rob
Hi there, you can definitely follow that guide, but first make sure you have npm installed on your VM, then the rest of the guide applies as written.
You do not need to install the next version unless you really want to, i.e. you want to try the pre-release or beta version of n8n.
As for the port, it should be 5678 by default.
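For reference, the whole install on a Rocky VM can be sketched like this; the `nodejs:20` module stream name is an assumption, so check which streams your Rocky release ships before enabling one:

```
# sketch: install Node.js + n8n on Rocky Linux (module stream name is an assumption)
sudo dnf module enable -y nodejs:20   # enable a current Node.js LTS stream
sudo dnf install -y nodejs            # installs node and npm together
sudo npm install -g n8n               # global install usually needs root
n8n start                             # UI is then at http://localhost:5678 by default
```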
thanks @fahmiiireza, also I've seen some guides where I need to NAT it on my firewall, do I need to do that?
no @robertkwild, you don't need to do that to run n8n successfully on your computer, but if you want to access n8n from other devices then yes, you need to set up NAT (or port forwarding) to make that happen
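On a Rocky VM, the usual blocker for access from other devices on the same LAN is the host firewall rather than NAT (NAT/port forwarding only matters when crossing networks, e.g. coming in from the internet). A sketch of opening n8n's default port, assuming firewalld is the active firewall:

```
# sketch: allow LAN access to n8n's default port on Rocky (assumes firewalld is running)
sudo firewall-cmd --permanent --add-port=5678/tcp
sudo firewall-cmd --reload
# then browse to http://<vm-ip>:5678 from another device on the LAN
```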
thanks @fahmiiireza, as I'm self-hosting Ollama as well (with DeepSeek and Open WebUI), can I use n8n with my self-hosted AI?
yes, you can definitely do that, you just need to configure Ollama properly so it can be used from n8n, and for the Ollama part I suggest you also follow this tutorial
if this answers your question, please give these replies a like and mark one as the solution, as it would greatly help me and the community
how are you accessing it? you're using localhost, right? and not Safari?
what you are seeing here is because of the secure cookie policy, which you can turn off by setting this environment variable to false:
N8N_SECURE_COOKIE=false
thanks @fahmiiireza, I googled it lol, so I'll make a cron job to start n8n automatically when Linux boots, and as the export isn't persistent I'll do it in a script:
#!/bin/bash
export N8N_SECURE_COOKIE=false
n8n start
is there a way I can do this without a script, i.e. systemctl enable n8n etc.?
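One way to get `systemctl enable` behaviour without the cron job and wrapper script is a hand-written systemd unit. This is a sketch, not an official n8n unit file; the service user (`rob`) and the `ExecStart` path are assumptions to adjust for your VM:

```ini
# /etc/systemd/system/n8n.service  (hypothetical unit; adjust User and ExecStart)
[Unit]
Description=n8n workflow automation
After=network.target

[Service]
User=rob
Environment=N8N_SECURE_COOKIE=false
ExecStart=/usr/bin/env n8n start
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After saving it, `sudo systemctl daemon-reload` then `sudo systemctl enable --now n8n` would replace both the cron job and the script, and the environment variable persists with the unit.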
I've read the starter kit, and that installs n8n/Ollama all on one server, but in my situation I've installed Ollama on my PC, which is Windows, and my n8n instance is on a Linux VM
let me know
thanks,
rob
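For a split setup like this, n8n only needs to reach Ollama's HTTP API over the LAN; the IP address below is a placeholder for the Windows PC. On the Windows side, Ollama must listen on all interfaces (set the `OLLAMA_HOST=0.0.0.0` environment variable before starting it) and Windows Firewall must allow port 11434. A quick reachability check from the Linux VM:

```
# sketch: verify the VM can reach Ollama on the Windows PC (IP is a placeholder)
curl http://192.168.1.50:11434/api/tags   # lists installed models if reachable
# in n8n's Ollama credentials, set the base URL to http://192.168.1.50:11434
# instead of the default http://localhost:11434
```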
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.