I would like to extract the session code I get back (HEREISMYSESSIONCODE in the example) as new JSON data in n8n, so I can reuse this session code in another node.
Here is what I received from my HTTP request to the API:
Hey @frankemann - you can achieve this using n8n expressions (so it can be done inside any node parameter that expects a string/text value). At a high level, we need a function that cuts off everything before and including <LoginResult>, and cuts off everything after and including </LoginResult>. My solution below works under the assumption that the data contains only one <LoginResult> element.
Here’s the snippet I used - it should work on your data (I did, however, have to stringify your example data first, as it had some " and similar chars that needed to be escaped)
$json.data - just referencing the data variable (which is JSON data, so we use $json to access it)
The first split function splits the data string at any point where it finds the search string “<LoginResult>”. The split function outputs an array of strings. We expect to get 2 strings in that array, since “<LoginResult>” occurs only once in our original string. With that in mind, we pick the 2nd element of that array with the [1] in .split("<LoginResult>")[1] (1 because arrays start counting from 0)
The second split function does something similar, but cuts off everything after and including </LoginResult>
Hope the explanation helps you understand what’s happening so you can apply it to other scenarios. If you need to do something similar in future but a bit different, check out this MDN JavaScript reference of string methods (functions) - we’ll soon have most of these functions in our expressions autocomplete feature too.
The only thing is that this Set node doesn’t automatically run when linked to the previous HTTP Request. I have to go in and run it manually, even though they are linked. Strange?
Got a screenshot? If your workflow has a trigger node and the HTTP Request + Set nodes are connected to it, it should all run automatically when you click the “Execute Workflow” button.
If you click the “Execute node” button of any node, it will always execute up to that node. That means, it’ll execute the current node (where you clicked “Execute node”) and will execute earlier nodes if they have not been executed (otherwise uses existing output data of that node).
Interesting!
I have 3 different flows starting when I push Execute. I thought they would run alongside each other, but technically they run one by one. And one node I am working on, in one of the flows, stopped the other ones from executing. If I disable the one I am working on (which has some errors in it), it works…
It’s the session code at the bottom that stops if I enable the “get latest invoice” node in the second line.
@frankemann thanks for the context. It does indeed seem like what you have in your screenshot should not happen and points to a bug.
If I’m understanding your example correctly, when the “Get latest invoices” node is enabled, it should fail, and that would mean “247 session code” should not run. It is, however, running, but then not executing SessionCode.
Any chance you could share the workflow JSON with me? A DM is totally fine if you’re not comfortable sharing it publicly. Before sharing, please just check that you don’t have API keys hardcoded in the nodes (if you’re using n8n credentials, then you’re good as those would not export with the workflow JSON). This way we can take a look, identify any logic bugs and get those deployed soon
Any chance you can DM me that workflow so we can troubleshoot further and identify a potential bug?
As for your other question, would you mind creating a separate post for it? Someone else on the forums might be better suited to answer that question (answer probably requires using JS in the Code node), and helps with discoverability for other users in future.
Hi Max.
I was reading your post again and I think you misunderstood. The problem was that when “Get latest invoices” stopped, all 3 parallel workflows stopped, e.g. the node (SessionCode) at the bottom.
My expectation was that, since I have 3 separate lines from the Execute button, they would go 3 separate ways regarding errors. But it seems that as soon as one node fails, it also stops the 2 other lines. I guess this is by design?
So what’s happening in your screenshot does seem like a bug (247 session code should not have executed). We did have a similar issue reported this week, so let me follow up on that and see if it’s resolved. If you are able to share the flow, that could help diagnose. CC’ing @Jon in case I’m missing something obvious
Also here are some details on the expected logic, if you or anyone else is curious:
How it’s designed to work under the hood is that multiple connections execute in a series, not in parallel.
The order is decided by which connector was connected first (and second, or third etc). It’s not a very discoverable fact admittedly, and something I hope we can improve in future.
Separately, if an individual node has an error, it will halt the workflow execution by default. This can be overridden by the “Continue on fail” toggle in the node’s settings tab. The error is then treated as node output data and can be handled (for example, by an IF node afterwards that checks for an error).
The .+ matches one or more of any character, including spaces (though by default . does not match newlines; you’d need the s flag for that). Surrounding it in parentheses makes it a “capture group”. The match function returns an array where the first element is the entire match, and the remaining elements are the capture groups you specify. So the [1] at the end grabs element 1 of the array.
A great tool for testing regular expressions is regex101.com. Just be sure to select ECMAScript as the “flavor” of regular expressions that you’re using so your expression will be javascript compatible.
Finally, it looks like you’re trying to scrape some data from a website that requires you to grab a session key. Check out scraperbee.com, which has a whole system for scraping sites and extracting data. There’s a learning curve, but if you’re doing anything remotely complicated it’s worth it, and their support team is great.