I was wondering what the best way is to run spiders from another Python script. My Scrapy project consists of 4 different spiders; all of them create files that help the other spiders work, and some of them have to read those files to work. That part is already done, but only individually (running each spider separately from the console).
How can I, for example, do something like this
if productToSearchIsBlue:
    # Make one spider crawl
else:
    # Make another spider crawl
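To make the intent concrete, here is a minimal sketch of what I have in mind, shelling out to the `scrapy crawl` command the same way I currently run the spiders from the console. The spider names (`blue_spider`, `other_spider`) are made up for illustration; I have also seen `scrapy.crawler.CrawlerProcess` suggested for running spiders in-process, but I am not sure which approach is preferred now.

```python
import subprocess


def spider_for(product_is_blue):
    # Hypothetical spider names standing in for the project's real ones.
    return "blue_spider" if product_is_blue else "other_spider"


def crawl(product_is_blue):
    # Shell out to the Scrapy CLI, equivalent to running
    # `scrapy crawl <name>` from the console inside the project directory.
    subprocess.run(["scrapy", "crawl", spider_for(product_is_blue)], check=True)
```

The script would then just call `crawl(productToSearchIsBlue)` once the condition is known.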
My final plan is to upload the full program to the cloud and have it run automatically. Can this be done?
I found some answers to this question, but they were pretty old, probably written for an earlier version of Scrapy.