Hey everyone, I’m building an automated content pipeline and I need to scrape Instagram posts/reels for inspiration and trend research. I’ve looked at a few options but haven’t found one that checks all my boxes.
What I need:
- Scrape posts or reels from hashtags, profiles, or the explore feed
- Be able to pass a min_views parameter (or equivalent) directly in the API request, so I only get back content above a certain view count, without having to filter it myself after the fact
- A proper REST API or SDK (Python preferred) — not just a GUI tool
- Reliable enough for production use in an n8n or automation workflow
Tools I’ve already tried or looked at:
- Apify and Scraper Creator (Instagram actors) — good but can’t filter by views natively in the request params
- Phantombuster — limited API control, more GUI-focused
- Instaloader — great for personal use but not built for production pipelines
Ideally something like:
GET /scrape/hashtag?tag=your_hashtag&min_views=100000&limit=20
Or even better, a Python SDK where I can do:
results = scraper.get_reels(hashtag="your_hashtag", min_views=100000, limit=20)
The min_views filter at the request level is the key requirement — I want the service to handle that, not me pulling 500 results and filtering down to 5.
If you’ve used something that handles this well in a real pipeline, I’d love to hear what worked for you — paid tools are fine as long as they have a solid API.
Thanks!
Hey @JoseAI-Automatizacio! I’ve run into this exact ‘pre-filtering’ frustration before.
The reality with Instagram’s private API (which most scrapers use) is that ‘min_views’ isn’t a native server-side filter parameter that Instagram themselves expose. That’s why even the big players like Apify usually require you to scrape a batch and then filter locally.
However, if you’re looking for a production-ready REST API that handles this better, check out Social Analyzer or RocketAPI. They don’t always have a min_views param in the GET request, but their response times are fast enough that you can easily script the filter in an n8n ‘Code’ node or a Python script without hitting massive latency.
If you’re using n8n, the move is:
- HTTP Request node to the scraper (get 50-100 results).
- Filter node (or Code node) immediately after. In a Code node that's one line: return items.filter(item => item.json.video_view_count > 100000);
- You only pay for the successful request, and the filtering happens in milliseconds inside n8n, so it effectively gives you what you want.
If you absolutely MUST have it at the request level to save on ‘credits’ with a provider, some specialized ‘Growth’ APIs (like Influxy) claim to do this, but they’re often less stable than the standard scraping infrastructure. I’d stick with a solid scraper + a quick filter node in your workflow for reliability.
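If it helps anyone prototyping this outside n8n, here's a minimal Python sketch of the same "fetch a batch, filter locally" pattern. The endpoint path and the response keys (items, video_view_count) are assumptions standing in for whatever provider you pick — adapt them to the real schema:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def fetch_reels(base_url, tag, limit=100):
    # Hypothetical endpoint shape mirroring the GET example above;
    # swap in your provider's real path, params, and auth.
    qs = urlencode({"tag": tag, "limit": limit})
    with urlopen(f"{base_url}/scrape/hashtag?{qs}", timeout=30) as resp:
        return json.load(resp).get("items", [])

def filter_by_views(posts, min_views=100_000):
    # Client-side stand-in for the min_views request param:
    # keep only posts whose reported view count meets the threshold.
    return [p for p in posts if p.get("video_view_count", 0) >= min_views]
```

Over-fetch (limit=100), filter down with filter_by_views, and you get the same effective result as a server-side min_views, just with a slightly bigger response payload.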
Hey! I had the exact same requirement a few weeks ago — needed to monitor Instagram content for trend research without hitting API rate limits constantly.
What worked well for me: a hybrid n8n workflow that uses the Instagram Graph API + a caching layer in Google Sheets so you don’t re-scrape the same posts.
The key nodes I use:
- HTTP Request → Instagram Graph API (for accounts you follow/own)
- RapidAPI Instagram scraper as fallback for public hashtag data
- GPT-4o-mini to score each post by engagement potential ((likes + comments) / followers ratio)
- Filter node to only pass posts above your minimum views threshold
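For anyone who wants to prototype the scoring + threshold step outside n8n, here's a rough Python version of that logic. The field names (like_count, comment_count, owner_followers, video_view_count) are assumptions about the scraper's response, not a real schema:

```python
def engagement_score(post):
    # (likes + comments) / followers, per the ratio described above.
    # Guard against a missing/zero follower count to avoid ZeroDivisionError.
    followers = post.get("owner_followers", 0) or 1
    likes = post.get("like_count", 0)
    comments = post.get("comment_count", 0)
    return (likes + comments) / followers

def passes_filters(post, min_views=100_000, min_score=0.02):
    # A post must clear both the raw view threshold and the
    # engagement ratio to make it through the pipeline.
    return (post.get("video_view_count", 0) >= min_views
            and engagement_score(post) >= min_score)
```

The 0.02 default for min_score is just a starting point — tune it against your own niche, since baseline engagement ratios vary a lot by account size.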
I packaged 5 of these social media automation workflows (including an Instagram DM Autoresponder) into a ready-to-import n8n pack — grab it here (PWYW, just set $0): Social Media AI Automation Pack - 5 n8n Workflows (GPT-4o-mini)
The JSON imports directly into n8n, no setup beyond adding your API keys. Happy to answer questions about the engagement scoring logic if you want to customize the threshold.