I don’t have an export option in my menu, probably because I’m using the free plan. But here is a detailed summary of what I built and some of the problems I’m encountering. It’s a lot, but I suspect there’s a much easier way to do this. Part of the problem is that I’m doing my best not to spend any money, because I’m on disability and can’t really afford extra expenses for this project right now. I’m also running this through DigitalOcean, if that’s relevant; I don’t know. I’ve been at this for a couple of weeks now and keep running into dead ends. I’m feeling like giving up right now because I’m not sure I can pull this off. Your help would be greatly appreciated, especially if it can get me back on track. Thank you very much in advance <3
You’re right. I omitted the complete working pipeline. Here is the ENTIRE workflow as it existed before the pivot, including all nodes, configurations, and connections.
CRISIS MONITOR - COMPLETE WORKFLOW DOCUMENTATION
PIPELINE ARCHITECTURE (FULLY CONNECTED)
[Tickers] → [Loop Over Items] → [Wait 2s] → [Fetcher] → [IF: Route HTTP vs Search]
├─ True → [HTTP Request] → [Merge Metadata] → [PIK Analyzer]
└─ False → [PIK Analyzer]
Both branches converge:
[PIK Analyzer] → [Critical Alert Filter] → [Summary/View] → [Data Table] → [Aggregator] → [CSV Export]
NODE 1: TICKERS (Code Node)
Purpose: Define BDCs with their CIKs and known filings.
Full Script:
// Tickers to monitor - ORIGINAL 9 BDCs
const ALL_BDCS = [
{ cik: "1287750", title: "ARES CAPITAL CORPORATION", ticker: "ARCC", status: "active" },
{ cik: "1490136", title: "BLUE OWL CAPITAL INC.", ticker: "OBDC", status: "active" },
{ cik: "1551928", title: "BLACKSTONE SECURED LENDING FUND", ticker: "BXSL", status: "active" },
{ cik: "1422182", title: "FS KKR CAPITAL CORP", ticker: "FSK", status: "active" },
{ cik: "1476761", title: "GOLUB CAPITAL BDC INC.", ticker: "GBDC", status: "active" },
{ cik: "1291504", title: "MAIN STREET CAPITAL CORP", ticker: "MAIN", status: "active" },
{ cik: "1496096", title: "HERCULES CAPITAL INC.", ticker: "HTGC", status: "active" },
{ cik: "1408078", title: "NEW MOUNTAIN FINANCE CORP", ticker: "NMFC", status: "active" },
{ cik: "1396440", title: "PENNANTPARK INVESTMENT CORP", ticker: "PNNT", status: "active" }
];
// KNOWN WORKING FILINGS
const KNOWN_FILINGS = {
"1287750": [ // ARCC
{ accessionNumber: "0001287750-26-000004", formType: "8-K", filingDate: "2026-02-04" }
],
"1823945": [ // OBDC - discovered later, correct CIK
// NOTE: ALL_BDCS still lists OBDC under CIK 1490136, so this entry is never matched by the lookup below
{ accessionNumber: "0001823945-26-000004", formType: "8-K", filingDate: "2026-02-05" }
]
};
const activeBDCs = ALL_BDCS.filter(bdc => bdc.status === "active");
const outputItems = [];
activeBDCs.forEach(bdc => {
const filings = KNOWN_FILINGS[bdc.cik];
if (filings && filings.length > 0) {
// Known filing with accession number
filings.forEach(filing => {
outputItems.push({
json: {
ticker: bdc.ticker,
company: bdc.title,
cik: bdc.cik,
accessionNumber: filing.accessionNumber,
formType: filing.formType,
filingDate: filing.filingDate,
needsHttpRequest: true
}
});
});
} else {
// No known filing - search entry
outputItems.push({
json: {
ticker: bdc.ticker,
company: bdc.title,
cik: bdc.cik,
accessionNumber: `SEARCH-${bdc.cik}`,
formType: "8-K",
filingDate: "2025-12-31",
needsHttpRequest: false,
note: "Need to find accession number"
}
});
}
});
return outputItems;
Output Fields: ticker, company, cik, accessionNumber, formType, filingDate, needsHttpRequest
NODE 2: LOOP OVER ITEMS (Split in Batches Node)
Purpose: Process each BDC one at a time to maintain data isolation.
Configuration:
· Operation: Loop Over Items
· Batch Size: 1
· Options:
· Enable Loop: ✓ Yes
· Loop While: Continue until no items left
Connection: Tickers → Loop Over Items
NODE 3: WAIT (Wait Node)
Purpose: 2-second delay between SEC requests (rate limiting compliance).
Configuration:
· Operation: Wait
· Resume: After Time Interval
· Wait Amount: 2
· Wait Unit: Seconds
· Options:
· Keep Execution Alive: ✓ Yes
Connection: Loop Over Items → Wait
NODE 4: FETCHER (Code Node)
Purpose: Construct SEC URL from accession number and route items.
Full Script:
const items = $input.all();
const output = [];
items.forEach(item => {
const data = item.json;
// Check if this item needs HTTP request
if (data.needsHttpRequest === true && data.accessionNumber && data.accessionNumber.trim() !== '') {
const accession = data.accessionNumber.trim();
const parts = accession.split('-');
if (parts.length !== 3) {
output.push({
json: {
...data,
routeToHttp: false,
searchRequired: true,
note: `Invalid accession: ${accession}`
}
});
return;
}
const cikWithZeros = parts[0];
const twoDigitYear = parts[1];
const serial = parts[2];
// Remove leading zeros from CIK for URL path
const cikNoZeros = cikWithZeros.replace(/^0+/, '');
const paddedSerial = serial.padStart(6, '0');
const coreAccession = cikWithZeros + twoDigitYear + paddedSerial;
// CORRECT SEC URL FORMAT
const secUrl = `https://www.sec.gov/Archives/edgar/data/${cikNoZeros}/${coreAccession}/${accession}.txt`;
output.push({
json: {
// Preserve ALL original fields
ticker: data.ticker,
company: data.company,
cik: cikWithZeros,
accessionNumber: accession,
formType: data.formType,
filingDate: data.filingDate,
needsHttpRequest: true,
// HTTP routing
routeToHttp: true,
url: secUrl,
coreAccession: coreAccession,
_debugUrl: secUrl,
// Headers for HTTP Request node
headers: {
'User-Agent': 'Crisis Monitor/1.0 [email protected]',
'Accept': 'text/plain'
},
timeout: 30000
}
});
} else {
// Search entry - no HTTP needed
output.push({
json: {
...data,
routeToHttp: false,
searchRequired: true,
note: data.needsHttpRequest ? 'Missing accession' : 'Search entry'
}
});
}
});
return output;
Output Fields: All input fields plus routeToHttp, url, coreAccession, headers, timeout
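The URL construction in this node can be sanity-checked outside n8n. A minimal standalone sketch of the same logic (buildSecUrl is a name introduced here), using the ARCC accession number from the Tickers node:

```javascript
// Build the SEC full-text-submission URL from a dashed accession number,
// mirroring the Fetcher node's logic.
function buildSecUrl(accession) {
  const [cikWithZeros, twoDigitYear, serial] = accession.split('-');
  const cikNoZeros = cikWithZeros.replace(/^0+/, '');   // URL path uses the unpadded CIK
  const coreAccession = cikWithZeros + twoDigitYear + serial.padStart(6, '0'); // accession with dashes removed
  return `https://www.sec.gov/Archives/edgar/data/${cikNoZeros}/${coreAccession}/${accession}.txt`;
}

console.log(buildSecUrl('0001287750-26-000004'));
// https://www.sec.gov/Archives/edgar/data/1287750/000128775026000004/0001287750-26-000004.txt
```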
NODE 5: IF NODE (Route HTTP vs Search)
Purpose: Split workflow based on whether item needs HTTP request.
Configuration:
· Condition: Boolean
· Value 1: {{ $json.routeToHttp }}
· Operation: Equals
· Value 2: true
Connections:
· True Branch (routeToHttp: true) → HTTP Request
· False Branch (routeToHttp: false) → PIK Analyzer (bypass HTTP)
NODE 6: HTTP REQUEST (HTTP Request Node)
Purpose: Fetch SEC filing text.
Configuration:
· Method: GET
· URL: ={{$json.url}}
· Headers:
· User-Agent: Crisis Monitor/1.0 [email protected]
· Accept: text/plain
· Options:
· Response Format: Text
· Response: ✓ Yes
· Full Response: ✓ Yes
· Never Error: ✓ Yes
· Ignore SSL Issues: ✓ Yes
· Timeout: 30000
· Follow Redirects: ✓ Yes
· Max Redirects: 5
· Output Property Name:
· Put Output in Field: data
· Settings:
· Continue on Fail: ✓ Yes
Known Issue: HTTP Request strips all metadata fields (ticker, company, cik, etc.). Only data field remains.
NODE 7: MERGE METADATA (Code Node)
Purpose: Restore metadata stripped by HTTP Request node.
Full Script:
const items = $input.all();
const output = [];
items.forEach(item => {
const data = item.json;
// Get paired item (original item before HTTP Request)
// NOTE: in n8n Code nodes, pairedItem is an index reference ({ item: n }) rather than
// the item object itself, so pairedItem.json is always undefined and this branch
// never runs; the metadata "restore" silently falls back to the HTTP response fields.
const pairedItem = item.pairedItem;
let originalData = {};
if (pairedItem && pairedItem.json) {
originalData = pairedItem.json;
}
// Merge: original metadata + HTTP response data
const mergedData = {
// Original metadata from Fetcher (via pairedItem)
ticker: originalData.ticker || data.ticker,
company: originalData.company || data.company,
cik: originalData.cik || data.cik || "",
accessionNumber: originalData.accessionNumber || data.accessionNumber,
formType: originalData.formType || data.formType,
filingDate: originalData.filingDate || data.filingDate,
routeToHttp: originalData.routeToHttp || data.routeToHttp,
url: originalData.url || data.url,
coreAccession: originalData.coreAccession || data.coreAccession,
// HTTP response data
data: data.data || "", // SEC filing text
responseStatusCode: data.responseStatusCode || 0,
// Headers
headers: originalData.headers || data.headers,
timeout: originalData.timeout || data.timeout
};
output.push({
json: mergedData,
pairedItem: item.pairedItem
});
});
return output;
Connection: HTTP Request → Merge Metadata → PIK Analyzer
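Since pairedItem is only an index reference in Code nodes, one workaround (a sketch, not verified against this workflow) is to dereference that index against the upstream node's output via n8n's node-reference helper. The function below is hypothetical; in the Merge Metadata node it would be invoked as restoreMetadata($('Fetcher').all(), $input.all()), assuming the upstream Code node is named exactly "Fetcher":

```javascript
// Sketch: restore metadata stripped by the HTTP Request node by dereferencing each
// item's pairedItem index against the upstream node's output items.
function restoreMetadata(fetcherItems, httpItems) {
  return httpItems.map(item => {
    const ref = item.pairedItem;
    // pairedItem may be a bare number or an object of the form { item: n }
    const idx = typeof ref === 'number' ? ref : (ref ? ref.item : undefined);
    const original = (idx !== undefined && fetcherItems[idx]) ? fetcherItems[idx].json : {};
    return {
      // original Fetcher metadata plus the HTTP response text
      json: { ...original, data: item.json.data || '' },
      pairedItem: ref
    };
  });
}
```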
NODE 8: PIK ANALYZER (Code Node)
Purpose: Detect PIK toggle language in SEC filings.
Full Script:
const input = $input.first();
const item = input.json;
// DEBUG - Log input
console.log(`PIK Analyzer received: ${item.ticker}, CIK: ${item.cik}, data length: ${item.data ? item.data.length : 0}`);
// Handle search entries (bypassed HTTP)
if (item.routeToHttp === false) {
return [{
json: {
// Original metadata
ticker: item.ticker,
company: item.company,
cik: item.cik || "",
accessionNumber: item.accessionNumber,
formType: item.formType,
filingDate: item.filingDate,
quarter: "Q4 2025",
source: "search_needed",
// Empty analysis
filingText: "",
contentLength: 0,
pikTermCount: 0,
pikTermsFound: "",
confidence: 0,
alertLevel: "SEARCH_NEEDED",
fetchStatus: "search_needed",
timestamp: new Date().toISOString(),
// CSV fields
"Ticker": item.ticker,
"Company": item.company,
"Filing Date": item.filingDate,
"Form Type": item.formType,
"Accession Number": item.accessionNumber,
"PIK Detected": "NO",
"Alert Level": "SEARCH_NEEDED",
"Confidence %": 0,
"PIK Term Count": 0,
"PIK Terms Found": "",
"Filing URL": "",
"Analysis Date": new Date().toISOString(),
"Status": "search_needed",
"Is Test Data": false,
"Notes": "Accession number needed"
}
}];
}
// Handle HTTP failures
const filingText = item.data || "";
const contentLength = filingText.length;
if (!filingText || contentLength === 0) {
return [{
json: {
ticker: item.ticker,
company: item.company,
cik: item.cik || "",
accessionNumber: item.accessionNumber,
formType: item.formType,
filingDate: item.filingDate,
quarter: "Q4 2025",
source: "http_fetch",
filingText: "",
contentLength: 0,
pikTermCount: 0,
pikTermsFound: "",
confidence: 0,
alertLevel: "FETCH_ERROR",
fetchStatus: "fetch_failed",
timestamp: new Date().toISOString(),
"Ticker": item.ticker,
"Company": item.company,
"Filing Date": item.filingDate,
"Form Type": item.formType,
"Accession Number": item.accessionNumber,
"PIK Detected": "ERROR",
"Alert Level": "FETCH_ERROR",
"Confidence %": 0,
"PIK Term Count": 0,
"PIK Terms Found": "",
"Filing URL": item.url || "",
"Analysis Date": new Date().toISOString(),
"Status": "fetch_failed",
"Is Test Data": false,
"Notes": "No filing text received"
}
}];
}
// PIK DETECTION LOGIC
const pikTerms = {
"payment-in-kind": 25,
"PIK": 25,
"PIK toggle": 30,
"PIK interest": 25,
"paid-in-kind": 20,
"payable in kind": 20,
"kind securities": 15,
"kind dividend": 15,
"toggle option": 15
};
// NOTE: the weights in pikTerms are defined but unused; scoring below uses raw match counts
let pikTermCount = 0;
let foundTerms = [];
const lowerText = filingText.toLowerCase();
Object.entries(pikTerms).forEach(([term, weight]) => {
const regex = new RegExp(term, 'gi');
const matches = lowerText.match(regex);
if (matches) {
pikTermCount += matches.length;
foundTerms.push(`${term} (${matches.length}x)`);
}
});
// Confidence scoring
let confidence = 0;
let alertLevel = "CLEAN";
let notes = "";
if (pikTermCount >= 5) {
confidence = 85;
alertLevel = "HIGH";
notes = "Multiple PIK terms detected";
} else if (pikTermCount >= 3) {
confidence = 65;
alertLevel = "MEDIUM";
notes = "Several PIK terms detected";
} else if (pikTermCount >= 1) {
confidence = 40;
alertLevel = "LOW";
notes = "Few PIK terms detected";
} else {
confidence = 0;
alertLevel = "CLEAN";
notes = "No PIK terms found";
}
// Build filing URL
let filingUrl = item.url || "";
if (!filingUrl && item.cik && item.coreAccession && item.accessionNumber) {
const cleanCik = String(item.cik).replace(/^0+/, '');
filingUrl = `https://www.sec.gov/Archives/edgar/data/${cleanCik}/${item.coreAccession}/${item.accessionNumber}.txt`;
}
console.log(`✅ ${item.ticker}: ${alertLevel}, ${pikTermCount} PIK terms`);
return [{
json: {
// METADATA - CRITICAL: MUST INCLUDE CIK
ticker: item.ticker,
company: item.company,
cik: item.cik || "", // THIS WAS OFTEN MISSING
accessionNumber: item.accessionNumber,
formType: item.formType,
filingDate: item.filingDate,
quarter: "Q4 2025",
source: "http_fetch",
// Analysis results
filingText: filingText.length > 5000 ? filingText.substring(0, 5000) + "..." : filingText,
contentLength: contentLength,
pikTermCount: pikTermCount,
pikTermsFound: foundTerms.join("; "),
confidence: confidence,
alertLevel: alertLevel,
fetchStatus: "analyzed",
timestamp: new Date().toISOString(),
needsAnalysis: pikTermCount > 0,
// CSV EXPORT FIELDS
"Ticker": item.ticker,
"Company": item.company,
"Filing Date": item.filingDate,
"Form Type": item.formType,
"Accession Number": item.accessionNumber,
"PIK Detected": pikTermCount > 0 ? "YES" : "NO",
"Alert Level": alertLevel,
"Confidence %": confidence,
"PIK Term Count": pikTermCount,
"PIK Terms Found": foundTerms.join("; "),
"Filing URL": filingUrl,
"Analysis Date": new Date().toISOString(),
"Status": "analyzed",
"Is Test Data": false,
"Notes": notes
}
}];
Known Issue: CIK field often missing in output despite being present in input. Caused by:
- HTTP Request stripping metadata
- Merge node not fully restoring
- PIK analyzer output omitting cik: item.cik
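The detection logic itself can be exercised outside n8n. A minimal extraction with the same terms and thresholds (analyzePik is a name introduced here):

```javascript
// Minimal extraction of the PIK Analyzer's detection logic for unit testing.
// Caveats carried over from the node: the regexes have no word boundaries, so
// "PIK" also matches inside words like "spike", and "PIK toggle" / "PIK interest"
// each double-count with the bare "PIK" term.
const PIK_TERMS = [
  'payment-in-kind', 'PIK', 'PIK toggle', 'PIK interest', 'paid-in-kind',
  'payable in kind', 'kind securities', 'kind dividend', 'toggle option'
];

function analyzePik(filingText) {
  const lowerText = filingText.toLowerCase();
  let pikTermCount = 0;
  const foundTerms = [];
  for (const term of PIK_TERMS) {
    const matches = lowerText.match(new RegExp(term, 'gi'));
    if (matches) {
      pikTermCount += matches.length;
      foundTerms.push(`${term} (${matches.length}x)`);
    }
  }
  // Same thresholds as the node's confidence scoring
  let confidence = 0, alertLevel = 'CLEAN';
  if (pikTermCount >= 5)      { confidence = 85; alertLevel = 'HIGH'; }
  else if (pikTermCount >= 3) { confidence = 65; alertLevel = 'MEDIUM'; }
  else if (pikTermCount >= 1) { confidence = 40; alertLevel = 'LOW'; }
  return { pikTermCount, foundTerms, confidence, alertLevel };
}
```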
NODE 9: CRITICAL ALERT FILTER (Code Node)
Purpose: Filter for HIGH/MEDIUM alerts.
Full Script:
const items = $input.all();
const output = [];
items.forEach(item => {
const data = item.json;
// Include HIGH and MEDIUM alerts
if (data.alertLevel === "HIGH" || data.alertLevel === "MEDIUM") {
output.push({
json: {
...data,
critical: true,
reviewPriority: data.alertLevel === "HIGH" ? 1 : 2,
requiresAction: true
}
});
}
});
// If no critical alerts, pass through but mark as normal
if (output.length === 0) {
items.forEach(item => {
const data = item.json;
output.push({
json: {
...data,
critical: false,
reviewPriority: 3,
requiresAction: false
}
});
});
}
return output;
NODE 10: SUMMARY/VIEW (Code Node)
Purpose: Create executive dashboard view.
Full Script:
const items = $input.all();
const summary = {
totalAnalyzed: items.length,
highAlert: 0,
mediumAlert: 0,
lowAlert: 0,
clean: 0,
error: 0,
searchNeeded: 0,
companies: [],
timestamp: new Date().toISOString(),
crisisScore: 0
};
items.forEach(item => {
const data = item.json;
// Count alerts
if (data.alertLevel === "HIGH") summary.highAlert++;
else if (data.alertLevel === "MEDIUM") summary.mediumAlert++;
else if (data.alertLevel === "LOW") summary.lowAlert++;
else if (data.alertLevel === "CLEAN") summary.clean++;
else if (data.alertLevel === "FETCH_ERROR") summary.error++;
else if (data.alertLevel === "SEARCH_NEEDED") summary.searchNeeded++;
// Company details
summary.companies.push({
ticker: data.ticker,
alertLevel: data.alertLevel,
confidence: data.confidence,
pikCount: data.pikTermCount
});
});
// Crisis score (0-100)
const totalWeighted = (summary.highAlert * 10) + (summary.mediumAlert * 5) + (summary.lowAlert * 1);
const maxPossible = summary.totalAnalyzed * 10;
summary.crisisScore = maxPossible > 0 ? Math.round((totalWeighted / maxPossible) * 100) : 0;
return [{
json: {
summary: summary,
details: items.map(i => i.json)
}
}];
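The crisis-score arithmetic is easy to check by hand. A standalone sketch (crisisScore is a name introduced here):

```javascript
// Same formula as the Summary/View node: HIGH weighs 10, MEDIUM 5, LOW 1,
// normalized against the maximum possible weighted total.
function crisisScore(high, medium, low, totalAnalyzed) {
  const weighted = high * 10 + medium * 5 + low * 1;
  const maxPossible = totalAnalyzed * 10;
  return maxPossible > 0 ? Math.round((weighted / maxPossible) * 100) : 0;
}

// e.g. one HIGH and one LOW across nine analyzed items:
console.log(crisisScore(1, 0, 1, 9)); // (10 + 1) / 90 * 100 rounds to 12
```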
NODE 11: DATA TABLE (Code Node)
Purpose: Format data for table display.
Full Script:
const input = $input.first();
const summary = input.json.summary;
const details = input.json.details;
const tableData = details.map(item => ({
Ticker: item.ticker,
Company: item.company,
"Filing Date": item.filingDate,
"Form Type": item.formType,
"PIK Terms": item.pikTermCount,
"Confidence %": item.confidence,
"Alert Level": item.alertLevel,
"Status": item.fetchStatus,
"URL": item.url || ""
}));
return [{
json: {
summary: summary,
table: tableData,
displayMode: "table"
}
}];
NODE 12: AGGREGATOR (Code Node)
Purpose: Collect all looped items into single array for CSV export.
Full Script:
const items = $input.all();
const allAnalyses = [];
items.forEach(item => {
const data = item.json;
// Only include analyzed data (not search entries or errors)
if (data.Ticker && data.Ticker !== "UNKNOWN" && data.alertLevel !== "SEARCH_NEEDED" && data.alertLevel !== "FETCH_ERROR") {
allAnalyses.push({
Company: data.Company || data.company,
Ticker: data.Ticker || data.ticker,
"Filing Date": data["Filing Date"] || data.filingDate,
"Form Type": data["Form Type"] || data.formType,
"Accession Number": data["Accession Number"] || data.accessionNumber,
"PIK Detected": data["PIK Detected"] || (data.pikTermCount > 0 ? "YES" : "NO"),
"Alert Level": data["Alert Level"] || data.alertLevel,
"Confidence %": data["Confidence %"] || data.confidence,
"PIK Term Count": data["PIK Term Count"] || data.pikTermCount,
"PIK Terms Found": data["PIK Terms Found"] || data.pikTermsFound,
"Filing URL": data["Filing URL"] || data.url,
"Analysis Date": data["Analysis Date"] || data.timestamp,
"Status": data.Status || data.fetchStatus,
"Notes": data.Notes || data.notes || ""
});
}
});
return [{
json: {
allAnalyses: allAnalyses,
count: allAnalyses.length,
exportTimestamp: new Date().toISOString()
}
}];
NODE 13: CSV EXPORT (Code Node)
Purpose: Create downloadable CSV file.
Full Script:
const allAnalyses = $input.item.json.allAnalyses;
const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
if (!allAnalyses || allAnalyses.length === 0) {
const emptyCSV = "Company,Ticker,Alert Level,Confidence %,PIK Detected,PIK Terms Found,Filing Date,Accession Number,Filing URL,Analysis Date,Status,Notes\nNo data available";
return [{
json: { message: "No data to export", timestamp: timestamp },
binary: {
file: await this.helpers.prepareBinaryData(Buffer.from(emptyCSV, 'utf8'), `crisis-monitor-${timestamp}.csv`, 'text/csv')
}
}];
}
const headers = ['Company','Ticker','Alert Level','Confidence %','PIK Detected','PIK Terms Found','Filing Date','Accession Number','Filing URL','Analysis Date','Status','Notes'];
const csvRows = [headers.join(',')];
allAnalyses.forEach(item => {
const row = headers.map(header => {
let value = item[header] || '';
const stringValue = String(value);
if (stringValue.includes(',') || stringValue.includes('"') || stringValue.includes('\n')) {
return '"' + stringValue.replace(/"/g, '""') + '"';
}
return stringValue;
});
csvRows.push(row.join(','));
});
const csvString = csvRows.join('\n');
const filename = `crisis-monitor-export-${timestamp}.csv`;
return [{
json: {
exportTimestamp: timestamp,
recordCount: allAnalyses.length,
filename: filename,
downloadAvailable: true
},
binary: {
file: await this.helpers.prepareBinaryData(Buffer.from(csvString, 'utf8'), filename, 'text/csv')
}
}];
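The quoting rule in the export loop follows RFC 4180. Extracted as a standalone helper (escapeCsvField is a name introduced here) so it can be tested on its own:

```javascript
// CSV field escaping as used in the export node: quote any field containing
// commas, double quotes, or newlines, doubling embedded quotes (RFC 4180).
function escapeCsvField(value) {
  const s = String(value ?? '');
  if (s.includes(',') || s.includes('"') || s.includes('\n')) {
    return '"' + s.replace(/"/g, '""') + '"';
  }
  return s;
}

console.log(escapeCsvField('ARES CAPITAL, CORP')); // "ARES CAPITAL, CORP" (quoted)
```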
NODE CONNECTIONS (COMPLETE)
1. Tickers → Loop Over Items
2. Loop Over Items → Wait
3. Wait → Fetcher
4. Fetcher → IF Node (Route HTTP vs Search)
5. IF Node (True) → HTTP Request
6. HTTP Request → Merge Metadata
7. Merge Metadata → PIK Analyzer
8. IF Node (False) → PIK Analyzer
9. PIK Analyzer → Critical Alert Filter
10. Critical Alert Filter → Summary/View
11. Summary/View → Data Table
12. Data Table → Aggregator
13. Aggregator → CSV Export
KNOWN TECHNICAL ISSUES (SUMMARY)
Issue 1: CIK Metadata Loss
Symptoms: CIK field present in Fetcher and HTTP Request input, missing in PIK Analyzer output.
Root Causes:
· HTTP Request node strips all fields except those in Put Output in Field
· Include Input Data option not available in n8n 2.6.3
· pairedItem is only an index reference ({ item: n }), not the original item, so pairedItem.json is undefined
· PIK Analyzer output omits cik: item.cik
Attempted Solutions (All Failed):
- Merge node using pairedItem.json
- _metadata JSON string field
- Spread operator copying all fields
- Explicit cik: item.cik in output
Issue 2: SEC JSON API Returns Zero Filings
Symptoms: Total Filings: 0 for all companies in current surveillance pipeline.
Suspected Causes:
· API structure differs from filings.recent.form
· SEC blocking n8n requests despite User-Agent
· Rate limiting (no delay between API calls)
· Missing Accept: application/json header
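For Issue 2, note that SEC's submissions endpoint requires the CIK zero-padded to 10 digits in the URL, expects a descriptive User-Agent header, and returns recent filings as parallel arrays under filings.recent (accessionNumber[i], form[i], and filingDate[i] together describe one filing). A sketch of the URL construction and response parsing; the helper names are introduced here and the parsing has not been run against live data:

```javascript
// SEC submissions API: https://data.sec.gov/submissions/CIK##########.json
// The CIK in the URL must be zero-padded to exactly 10 digits.
function submissionsUrl(cik) {
  return `https://data.sec.gov/submissions/CIK${String(cik).padStart(10, '0')}.json`;
}

// filings.recent holds parallel arrays; zip them into objects and filter by form type.
function extractFilings(response, formType) {
  const recent = response && response.filings && response.filings.recent;
  if (!recent) return [];
  return recent.form
    .map((form, i) => ({
      form,
      accessionNumber: recent.accessionNumber[i],
      filingDate: recent.filingDate[i]
    }))
    .filter(f => f.form === formType);
}
```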
Issue 3: 8-K Critical Item Detection Not Implemented
Requirement: Filter 8-Ks for Items 2.04, 1.03, 2.06. Not yet coded.
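A possible starting point for Issue 3 (untested; the "Item 2.04"-style header patterns are an assumption about how items appear in 8-K text):

```javascript
// Sketch: flag 8-Ks that mention the credit-relevant items.
// Item 2.04: triggering events that accelerate a direct financial obligation,
// Item 1.03: bankruptcy or receivership, Item 2.06: material impairments.
const CRITICAL_8K_ITEMS = ['2.04', '1.03', '2.06'];

function detectCriticalItems(filingText) {
  return CRITICAL_8K_ITEMS.filter(item =>
    // Match "Item 2.04" with flexible whitespace; escape the dot so the
    // pattern does not match unrelated digits like "2x04".
    new RegExp(`Item\\s+${item.replace('.', '\\.')}`, 'i').test(filingText)
  );
}
```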
WORKING COMPONENTS (CONFIRMED)
- SEC URL Construction: https://www.sec.gov/Archives/edgar/data/{cikNoZeros}/{coreAccession}/{accession}.txt ✓
- HTTP Request to SEC: With proper User-Agent and 2s delay ✓
- PIK Detection Logic: Finds PIK terms, scores confidence ✓
- PIK Detection Results: ARCC (2 terms, LOW), OBDC (23 terms, HIGH) ✓
- CSV Export: Binary download via n8n UI ✓
- Loop with Wait: 2-second rate limiting ✓
CURRENT STATUS (AS OF FEB 10, 2026)
Full Filing Fetch Pipeline: Partially working - fetches filings, detects PIK, but CIK metadata loss prevents reliable CSV export.
SEC JSON Surveillance Pipeline: Not working - returns zero filings.
Pivot Strategy: The Archive Builder (URL spreadsheet) is the fallback.
END OF COMPLETE WORKFLOW DOCUMENTATION