python - Scrapy for several websites
I'm using Scrapy to crawl websites. In my project, every spider has the same code except for `start_urls`, `allowed_domains`, and `name` (that is, each spider is really one general-purpose spider, reused to crawl every website). My aims:
- Use just one spider (since every spider has the same code) and set `start_urls`, `allowed_domains`, and `name` dynamically (this info could be loaded from a database).
- Run that spider so it crawls several websites at the same time.
- Record a separate log per website; for example, the site 'www.hhhh.com' should get a log file named 'hhhh_log'.
Can anyone give me some ideas?