Hey, I have a question about the design of my workflow.
My assumption is that the end result of this flow will be a post published on one of my WordPress sites. No featured image, just HTML-formatted content.
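For reference, here is roughly how I plan to publish the finished HTML (a minimal sketch, assuming the standard WordPress REST API with an application password; the site URL and credentials are placeholders):

```python
import requests

# Assumptions: a WordPress site with the REST API enabled and an
# application password created for the user (Users -> Profile).
WP_URL = "https://example.com/wp-json/wp/v2/posts"  # placeholder site
WP_USER = "editor"                                  # placeholder username
WP_APP_PASSWORD = "xxxx xxxx xxxx xxxx"             # placeholder app password

def publish_post(title: str, html_content: str) -> int:
    """Create a published post with HTML content and no featured image."""
    resp = requests.post(
        WP_URL,
        auth=(WP_USER, WP_APP_PASSWORD),
        json={"title": title, "content": html_content, "status": "publish"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]  # WordPress returns the new post's ID
```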
The starting point is a SERP scrape on a topic, which gives me a list of pages about that topic. I then want to send this data to a language model, DeepSeek. What I'm wondering is:
- Do I send the data separately for each scraped page?
- Or should I send the data from all the scraped pages at once (sketched below)?
The idea is that DeepSeek then writes an article on the topic based on the data provided plus its own training data.
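To make the all-at-once variant concrete, here is a minimal sketch of what I'm imagining (assuming DeepSeek's OpenAI-compatible chat completions endpoint, the `openai` Python package, and an API key in a `DEEPSEEK_API_KEY` environment variable; the prompts are my own placeholders):

```python
import os
from openai import OpenAI

# DeepSeek exposes an OpenAI-compatible API, so the openai client works
# with a custom base_url (assumption: the key sits in DEEPSEEK_API_KEY).
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",
)

def write_article(topic: str, scraped_pages: list[str]) -> str:
    """Send all scraped page texts in one prompt, get a full article back."""
    sources = "\n\n---\n\n".join(scraped_pages)  # one big context block
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=[
            {"role": "system", "content": "You write HTML-formatted articles."},
            {
                "role": "user",
                "content": f"Topic: {topic}\n\nSource material from the SERPs:\n"
                           f"{sources}\n\nWrite an article on the topic using "
                           "this material.",
            },
        ],
    )
    return response.choices[0].message.content
```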
If I send the data separately, will DeepSeek remember what I sent in earlier calls? Is each API call a separate conversation, just like starting a new chat in DeepSeek Chat?
How do you solve this in a workflow like this?
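From what I've read, each call to the chat completions API is stateless and only sees the `messages` list sent with it, so in the one-page-per-call variant I would have to carry the history myself. Roughly like this (a sketch, taking the client from the setup above as a parameter):

```python
from openai import OpenAI

def feed_pages_one_by_one(client: OpenAI, scraped_pages: list[str]) -> list[dict]:
    """Send pages in separate calls, manually carrying the conversation."""
    messages = [
        {"role": "system", "content": "You are helping research an article."},
    ]
    for page_text in scraped_pages:
        messages.append({"role": "user", "content": f"Source page:\n{page_text}"})
        response = client.chat.completions.create(
            model="deepseek-chat",
            messages=messages,  # the full history must go with every call
        )
        reply = response.choices[0].message.content
        messages.append({"role": "assistant", "content": reply})  # keep replies
    return messages
```

Please correct me if that mental model is wrong.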
Second question, for those using DeepSeek to write articles based on SERP analysis: does your workflow write the article in steps? Meaning:
- Step one: research based on the data received from the SERPs.
- Step two: based on that data, determine the number and content of the headings.
- Step three: write the content for the first heading, keeping the input data in mind.
- Step four: write the content for the second heading, keeping in mind the input data and what has already been written (see the sketch after this list).
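Here is a rough sketch of the stepwise loop I have in mind (same assumptions as above: DeepSeek's OpenAI-compatible endpoint via the `openai` package; the prompts and the one-heading-per-line convention are my own placeholders):

```python
from openai import OpenAI

def ask(client: OpenAI, prompt: str) -> str:
    """One stateless completion call (hypothetical helper)."""
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

def write_article_stepwise(client: OpenAI, topic: str, serp_data: str) -> str:
    # Steps 1 and 2: research the SERP data and propose the headings.
    outline = ask(
        client,
        f"Topic: {topic}\nSource material:\n{serp_data}\n\n"
        "Propose the article's headings, one per line, no extra text.",
    )
    headings = [h.strip() for h in outline.splitlines() if h.strip()]

    # Steps 3 onward: write each section, passing in the SERP data AND
    # everything already written, so later sections stay consistent.
    article = ""
    for heading in headings:
        section = ask(
            client,
            f"Topic: {topic}\nSource material:\n{serp_data}\n\n"
            f"Article so far:\n{article}\n\n"
            f"Write the HTML section for the heading '{heading}'. "
            "Do not repeat what was already written.",
        )
        article += f"<h2>{heading}</h2>\n{section}\n"
    return article
```

One thing I'm unsure about: resending the full SERP data plus everything already written makes each prompt grow quickly, so maybe later calls should get summaries of earlier sections instead.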
How would you design this to work? I have a very large number of posts to write and need to start building something. I'm also open to collaborating with someone and building the flow together.