python - Scrapy for several websites -


I'm using Scrapy to crawl websites. In my project, every spider has the same code except for its start_urls, domain, and name. (In other words, the spider is a general spider that can be used to crawl any website.) My aims are:

  1. Use just one spider (since every spider has the same code), and set start_urls, domain, and name dynamically (maybe this info can come from a database) — see the first sketch after this list.
  2. Run that spider so it crawls several websites at the same time — see the second sketch after this list.
  3. Record a separate log for every website; for example, the website 'www.hhhh.com' should have a log file named 'hhhh_log' — the second sketch also touches on this.
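For aim 1, here is a rough sketch of what I have in mind: a single spider whose name, start_urls, and allowed_domains are passed in at crawl time (the attribute values below are just placeholders, and the values could equally be loaded from a database row):

    import scrapy


    class GenericSpider(scrapy.Spider):
        # placeholder name; the real name is passed in when the crawl starts
        name = "generic"

        def __init__(self, *args, start_urls=None, allowed_domains=None, **kwargs):
            super().__init__(*args, **kwargs)  # a "name" kwarg overrides the class attribute
            # these values could just as well come from a database query
            self.start_urls = start_urls or []
            self.allowed_domains = allowed_domains or []

        def parse(self, response):
            # the parsing logic shared by every site; as a placeholder,
            # just log the URL and yield the page title
            self.logger.info("crawled %s", response.url)
            yield {"url": response.url, "title": response.css("title::text").get()}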
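For aims 2 and 3, I'm imagining something like the sketch below, reusing the GenericSpider above. The site list is made up for illustration. My understanding is that Scrapy's LOG_FILE setting is global to the whole process, so instead I attach a plain logging FileHandler to each spider's own logger (which is named after the spider) — though I think that only captures messages the spider logs itself, not every internal Scrapy message for that crawl, so I'm not sure this is the right approach:

    import logging

    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    # one entry per site; in practice this list could come from a database
    SITES = [
        {"name": "hhhh", "start_urls": ["http://www.hhhh.com"], "allowed_domains": ["hhhh.com"]},
        {"name": "gggg", "start_urls": ["http://www.gggg.com"], "allowed_domains": ["gggg.com"]},
    ]

    process = CrawlerProcess(get_project_settings())

    for site in SITES:
        # per-site log file, e.g. 'hhhh_log', fed by that spider's own logger
        handler = logging.FileHandler(f"{site['name']}_log")
        logging.getLogger(site["name"]).addHandler(handler)

        process.crawl(GenericSpider, **site)

    process.start()  # one Twisted reactor runs all the crawls concurrently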

Can anyone give me some ideas?

