How do I know Scrapy custom settings are applied? Log shows general settings -
I use Scrapy 1.3. According to the documentation, you can declare settings for each specific spider simply enough. Here is the code:
class BhSpider(scrapy.Spider):
    name = "code"
    custom_settings = {
        'CONCURRENT_REQUESTS': '20',
        'FEED_EXPORT_FIELDS': ['price', 'stock', 'partnumber', 'sku', 'name',
                               'manufacture', 'attribute', 'distributor', 'upc',
                               'descr', 'p_url', 'main_image', 'images'],
    }
    allowed_domains = ***
But according to the logs, the project settings are used. How do I know whether the project settings overrode the spider-specific settings?
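For background on the precedence involved: Scrapy resolves each setting by priority, and custom_settings is applied at "spider" priority, which outranks the "project" priority of settings.py. Below is a minimal pure-Python sketch of that precedence rule; the priority numbers match Scrapy's documented SETTINGS_PRIORITIES, but the class itself is illustrative, not Scrapy's actual implementation:

```python
# Sketch of priority-based settings resolution, modeled on Scrapy's
# SETTINGS_PRIORITIES; illustrative only, not Scrapy's implementation.
PRIORITIES = {'default': 0, 'command': 10, 'project': 20,
              'spider': 30, 'cmdline': 40}

class SettingsSketch:
    def __init__(self):
        self._store = {}  # name -> (value, numeric priority)

    def set(self, name, value, priority):
        pri = PRIORITIES[priority]
        # A new value only sticks if its priority is at least as high
        # as the one already stored.
        if name not in self._store or pri >= self._store[name][1]:
            self._store[name] = (value, pri)

    def get(self, name):
        return self._store[name][0]

s = SettingsSketch()
s.set('CONCURRENT_REQUESTS', 200, 'project')   # from settings.py
s.set('CONCURRENT_REQUESTS', '20', 'spider')   # from custom_settings
print(s.get('CONCURRENT_REQUESTS'))  # prints 20: the spider-level value wins
```

So a spider's custom_settings should win over settings.py; only command-line options (-s) would outrank them.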
Here is the log:
2017-08-20 13:47:51 [scrapy.utils.log] INFO: Scrapy 1.3.2 started (bot: pc)
2017-08-20 13:47:51 [scrapy.utils.log] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'pc.spiders', 'FEED_URI': 'test_output1.csv', 'CONCURRENT_REQUESTS': 200, 'SPIDER_MODULES': ['pc.spiders'], 'BOT_NAME': 'pc', 'USER_AGENT': 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36', 'FEED_FORMAT': 'csv', 'FEED_EXPORT_FIELDS': ['price', 'stock', 'partnumber', 'sku', 'name', 'manufacture', 'attribute', 'distributor', 'upc', 'descr']}
2017-08-20 13:47:51 [scrapy.middleware] INFO: Enabled extensions:
In the output file, the export fields seem correct (as set in custom_settings, though the log does not show that). How can I tell whether the spider-specific settings are enabled?